Re: [squid-users] Concurrent Connection Limit

From: Jeffrey Ng <jeffreyn@dont-contact.us>
Date: Sun, 10 Jul 2005 22:36:48 +0800

I tried your suggestion and double-checked my server just now. The
server hits roughly 1030 total network sockets and then squid stops
responding on port 80 [as if its network socket limit were 1024].

File descriptors would only be the bottleneck if we had cache objects
being written as fast as network sockets are opened (we don't), but I
did raise the file descriptor limit to 2048 to test whether that became
the new magic number that halts port 80 traffic, and it still locked up
at 1030, I'm afraid. What else could be the problem?
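To rule out a kernel-imposed per-process limit, here is a tiny C check
(my own sketch, not squid code) that prints the soft and hard
RLIMIT_NOFILE values the kernel reports for the calling process:

    /* print the soft and hard file descriptor limits for
     * the calling process via getrlimit(2) */
    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
            perror("getrlimit");
            return 1;
        }
        /* rlim_cur is the soft (effective) limit; rlim_max is
         * the hard ceiling the soft limit can be raised to */
        printf("soft: %ld  hard: %ld\n",
               (long)rl.rlim_cur, (long)rl.rlim_max);
        return 0;
    }

If that prints 2048 in the shell squid starts from and squid still
stalls near 1024, then (if I understand the FAQ correctly) the
compiled-in limit is the suspect: squid fixes its descriptor table
size at ./configure time, so it would need rebuilding under the raised
ulimit for the new limit to take effect.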

Thank you!

Jeffrey Ng

On 7/10/05, Joshua Goodall <joshua@roughtrade.net> wrote:
> On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > Hi, I have a problem with the squid web accelerator on my site. My
> > site is a photo sharing site like webshots. It carries a pretty busy
> > load, so I decided squid might ease the load on my image server by
> > caching some of the images. We set everything up and it uses 1GB of
> > RAM. It was fine at first, but all the images suddenly stopped
> > loading after 6 hours. I checked netstat and found about 1000
> > connections from outside, and squid stops responding whenever the
> > connections hit that number. I am pretty sure that squid has a
> > concurrent connection limit of 1000. How could I increase that
> > limit? Any help is appreciated. Thank you!
>
> Sounds like you're running out of file descriptors.
> See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
>
> - Joshua.
>
> --
> Joshua Goodall "as modern as tomorrow afternoon"
> joshua@roughtrade.net - FW109
>
Received on Sun Jul 10 2005 - 08:36:52 MDT
