Re: [squid-users] Memory trouble with big files

From: <lst_hoe01@dont-contact.us>
Date: Wed, 29 Jun 2005 21:44:50 +0200

Quoting Henrik Nordstrom <hno@squid-cache.org>:

> On Mon, 27 Jun 2005 lst_hoe01@kwsoft.de wrote:
>
>> we have long been using a Squid cache (version 2.2.STABLE5) with no problems
>> on a 512MB RAM machine for our network. After switching to another HTTP gateway
>> virus scanner, we ran into problems with Squid trying to read even really big
>> files fully into memory before transferring them to the clients. This leads to
>> the Squid process using all available memory until it is killed by the OS
>> whenever we download files exceeding the RAM size (~500MB).
>
> I would suggest trying to upgrade to a current Squid version
> before investigating this any deeper. Squid-2.2.STABLE5 is very old
> and has long since been forgotten by all developers.
>
> Over the years a number of "sudden increase of memory usage"
> issues have been corrected, and there is a very high likelihood
> that the problem you are seeing has already been fixed.
>
> Current Squid version is 2.5.STABLE10.

After some fiddling around with the really dated system (kernel 2.2 :-) we
managed to get 2.5.STABLE10 working, and the memory problem is in fact gone.
Time to overhaul the whole system and see how well Squid works on a 64-bit
Opteron with kernel 2.6 ;-)
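
[For anyone hitting similar behavior: the squid.conf directives that bound how
much of an object Squid keeps in memory are worth checking after an upgrade.
The fragment below is a sketch with illustrative values only, not the settings
used by the poster; the directive names are from the Squid 2.5 documentation.]

```
# squid.conf sketch -- illustrative values, Squid 2.5 era
cache_mem 32 MB                       # memory for in-transit and hot objects
maximum_object_size_in_memory 64 KB   # larger objects are never held fully in RAM
maximum_object_size 256 MB            # upper bound for objects cached on disk
```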

Thanks a lot

Andreas
Received on Wed Jun 29 2005 - 13:44:52 MDT

This archive was generated by hypermail pre-2.1.9 : Fri Jul 01 2005 - 12:00:03 MDT