Re: [SQU] Squid Failure

From: Mehrdad Fatemi <fatemi@dont-contact.us>
Date: Tue, 5 Sep 2000 01:20:13 +0430

Decrease the memory allocated to Squid to about 1/4 or 1/5 of the total
RAM you have in your system.
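
For example, assuming this refers to the cache_mem directive in
squid.conf (an assumption; the original message does not name the
setting), a minimal sketch for the 384 MB box described below might be:

    # squid.conf sketch: roughly 1/4 to 1/5 of 384 MB total RAM.
    # Note that cache_mem caps memory for in-transit and hot objects
    # only; Squid's total process size will grow well past this value,
    # especially as the on-disk cache index grows.
    cache_mem 80 MB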

Best Regards

Mehrdad Fatemi
R&D Director

< AFRANET Co. ---------------------------- R&D Dept. >

-----Original Message-----
From: Andrew Blanche <andrew@foxall.com.au>
To: squid-users@ircache.net <squid-users@ircache.net>
Date: Monday, September 04, 2000 3:32 PM
Subject: [SQU] Squid Failure

Hello to all

I have recently developed a rather huge problem with our Squid cache.

It keeps failing at least once a day during peak load times!
On failure, the Squid child process has stopped.
The only message I have been able to get on the failure is
"Duplicate page error" followed by some kind of memory address.
This message is repeated a couple of times with different addresses,
then the message "vm: stopping process squid" and a message about
deleting duplicate pages.

We had been running Squid for about two years without any problems. I
recently added 30 additional modems to our dial-in pool, making a total
of 140, and the problems started.

We were running Squid 2 on Red Hat 6.0.
The machine was a Pentium 233 with 256 MB RAM and 2x 9.1 GB SCSI drives
(40 MB/sec).

I then built up a new box:
Pentium III 733 MHz
384 MB RAM (133 MHz)
1x 9.1 GB LVD SCSI (160 MB/sec)
Red Hat 6.2
Squid 2

I am still getting the same failure during peak load (about 5 accesses/sec).

If anybody has any ideas, I would love to hear them A.S.A.P.
I am getting really frazzled and don't know what to do!

Regards Andrew Blanche

--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html