Re: [squid-users] Squid getting too big?

From: Adrian Chadd <adrian@dont-contact.us>
Date: Mon, 6 Aug 2001 13:27:09 -0600

On Mon, Aug 06, 2001, Florin Andrei wrote:
> On 03 Aug 2001 09:18:57 -0400, Mike Diggins wrote:
> >
> > Are you saying you have to restart squid daily? That scares me.
>
> I know. :-/
>
> > What happens if you don't?
>
> On my machine (IRIX 6.5) it grows up to 1.8 GB and then it dies. Lucky
> me, I run it from RunCache so it's automatically restarted.
>
> But, with such a large process, all kinds of weird things happen. For
> example, I'm forced to use the old external DNS resolvers (dnsserver),
> because I have to follow the resolv.conf search path, and Squid cannot
> do that with the internal resolver. But when the process is that huge,
> if a dnsserver dies, it cannot be restarted (the machine doesn't have
> the resources to fork/exec a 2 GB process!).
> So, after a while (3...6 days), instead of 32 dnsservers I have only 1
> or 2. The DNS queue gets overloaded, the users start to experience all
> kinds of DNS-related errors, and all hell breaks loose.
>
> I usually end up restarting Squid. :-(

A little more research needs to happen here.

Personally, I want to know whether requests are getting "stuck",
i.e. squid accepts new requests but never actually services them.
A lot of open but stuck connections could be eating RAM.

I can't really tell right now: it doesn't show up when I'm running
squid at home, and I don't admin any large caches.

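As an aside, the dnsserver respawn failure described above smells like
plain fork() resource accounting: to restart a helper, squid has to
fork() its whole (by then ~2 GB) address space before the exec, and on
a box without enough free RAM + swap (or with strict overcommit
accounting) that fork() fails, so the helper never comes back.
Something like this untested sketch (sizes made up):

/*
 * Untested sketch: why re-spawning a helper can fail once the parent
 * has grown huge.  fork() has to be able to account for a full copy
 * of the parent's address space, so without enough free RAM + swap it
 * fails with EAGAIN/ENOMEM and no dnsserver is ever exec'd.
 */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    size_t huge = (size_t)1800 * 1024 * 1024;   /* ~1.8 GB */
    char *mem = malloc(huge);
    if (mem != NULL)
        memset(mem, 1, huge);   /* touch it so it's really resident */

    pid_t pid = fork();         /* squid forks before exec'ing a helper */
    if (pid < 0) {
        /* this is the failure mode: typically EAGAIN or ENOMEM */
        fprintf(stderr, "fork: %s\n", strerror(errno));
        free(mem);
        return 1;
    }
    if (pid == 0) {
        execlp("true", "true", (char *)NULL);   /* child would exec dnsserver */
        _exit(127);
    }
    waitpid(pid, NULL, 0);
    free(mem);
    return 0;
}

(I haven't checked how strictly Irix accounts for this, so take it as
a guess.)
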
In general, squid's RAM usage needs to be fixed. :-)

Adrian
Received on Mon Aug 06 2001 - 13:27:10 MDT
