Re: [squid-users] "Quadruple" memory usage with squid

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Thu, 26 Nov 2009 12:21:29 +1300

Robert Collins wrote:
> On Wed, 2009-11-25 at 10:43 -0200, Marcus Kool wrote:
>
>> There are alternative solutions to the problem:
>> 1. redesign the URL rewriter into a multithreaded application that
>
>> 2. redesign the URL rewriter where the URL rewriter rereads
>
>> 3. modify the URL rewriter to accept multiple request
>
>> 4. use less URL rewriters. You might get an occasional
>
> 5. Experiment with vfork.
>
> -Rob
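
For option 3, Squid's helper protocol already supports multiple in-flight
requests: with concurrency enabled, each request line carries a channel ID
that the helper echoes back, so replies may arrive out of order. A minimal
sketch of such a rewriter follows; the rewrite rule, hostnames, and config
values are illustrative assumptions, not anything from the thread:

```python
#!/usr/bin/env python
# Sketch of a concurrent Squid URL-rewrite helper (option 3).
# Assumed squid.conf (Squid 2.7-era directives):
#     url_rewrite_program /usr/local/bin/rewriter.py
#     url_rewrite_children 2
#     url_rewrite_concurrency 100
import sys

def rewrite(url):
    # Illustrative rule only: redirect one hypothetical hostname,
    # leave everything else untouched (empty reply = no change).
    if url.startswith("http://old.example.com/"):
        return "http://new.example.com/" + url.split("/", 3)[3]
    return ""

def handle_line(line):
    # With url_rewrite_concurrency > 0, Squid prefixes each request
    # with a channel ID that must be echoed back on the reply line.
    channel, _, rest = line.partition(" ")
    url = rest.split(" ", 1)[0]
    return ("%s %s" % (channel, rewrite(url))).rstrip()

def main():
    for line in sys.stdin:
        sys.stdout.write(handle_line(line.rstrip("\n")) + "\n")
        sys.stdout.flush()  # replies must not sit in stdio buffers

if __name__ == "__main__":
    main()
```

Because one process multiplexes many channels, Squid needs far fewer
helper children, which is exactly what avoids the fork-time memory blowup
discussed above.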

Any of the above plus:

The log daemon is overdue for a patch to cap log file size and
auto-rotate when the cap is hit. If you are able to assist with that
development, you could then use the logging daemon instead of expensive
log rotates done by squid itself.
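
The cap-and-rotate behaviour described above might look roughly like the
sketch below. This is not the actual logfile daemon code; the class name,
numeric-suffix naming scheme, and cap size are all assumptions for
illustration:

```python
import os

class CappedLog:
    """Sketch of a size-capped, self-rotating log writer: once a write
    would push the current file past max_bytes, the file is renamed
    with a numeric suffix and a fresh file is opened in its place."""

    def __init__(self, path, max_bytes):
        self.path = path
        self.max_bytes = max_bytes
        self.seq = 0
        self.fh = open(path, "a")

    def write(self, line):
        # Rotate *before* the write that would exceed the cap.
        if self.fh.tell() + len(line) > self.max_bytes:
            self._rotate()
        self.fh.write(line)
        self.fh.flush()

    def _rotate(self):
        self.fh.close()
        self.seq += 1
        os.rename(self.path, "%s.%d" % (self.path, self.seq))
        self.fh = open(self.path, "a")
```

Rotating inside the daemon this way means squid never has to signal a
rotate itself, which is the expensive step being avoided.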

Amos

-- 
Please be using
   Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
   Current Beta Squid 3.1.0.15
Received on Wed Nov 25 2009 - 23:21:39 MST

This archive was generated by hypermail 2.2.0 : Thu Nov 26 2009 - 12:00:03 MST