Re: [SQU] number of concurrent requests

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Fri, 05 Jan 2001 07:34:48 +0100

Li Xiang wrote:

> By the way, are there any constraints on squid's performance? Like the
> peak request rate. I have heard of someone using squid to handle 60-70
> requests/second. Is there any upper bound on this number? Or some way
> to improve it?

Peak request rate varies between 30 and 200+ requests/second, depending
on the hardware and OS setup selected.

There are four main external performance bottlenecks:

a) Disks:

Good performance requires at least 2-3 physical drives dedicated to the
cache, and the use of async-io or diskd to make full use of them.
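As a hypothetical illustration (paths and sizes are placeholders, not
recommendations), spreading the cache over three drives with diskd
would look something like this in squid.conf:

```
# One cache_dir per physical drive, using the diskd storage type
# (use "aufs" instead if Squid was built with async-io).
# Format: cache_dir <type> <path> <size-MB> <L1-dirs> <L2-dirs>
cache_dir diskd /cache1 8192 16 256
cache_dir diskd /cache2 8192 16 256
cache_dir diskd /cache3 8192 16 256
```

The point is that each cache_dir sits on its own spindle, so disk
requests can proceed in parallel instead of queueing on one drive.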

b) OS Tuning

Many OSes require some tuning. This includes, for example:

- number of filedescriptors available
- number of sockets available
- number of unbound ports available for outgoing connections
- TIME_WAIT handling on some OSes
- maximum data segment size
- memory paging priorities
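On Linux, some of the items above map onto sysctl settings roughly as
sketched below. The knob names are Linux-specific (other OSes use
different mechanisms) and the values are illustrative starting points,
not tuned recommendations:

```
# /etc/sysctl.conf fragment (Linux; values are examples only)
fs.file-max = 65536                        # system-wide filedescriptor limit
net.ipv4.ip_local_port_range = 1024 65000  # ports available for outgoing connections
net.ipv4.tcp_tw_reuse = 1                  # ease TIME_WAIT pressure on busy proxies
```

The per-process filedescriptor limit for the Squid process itself is
set separately (e.g. with ulimit -n in the startup script).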

c) CPU

Most modern CPUs can keep up quite fine, but if you are running on an
older system or trying to reach very high request rates then the CPU
might become a bottleneck.

d) Memory

The amount of memory can have a quite significant performance impact.
There is a minimum amount, in relation to your cache size, that you
absolutely must have to get any decent performance at all (otherwise
the system starts paging), and above that there are further benefits
from having free memory available for the OS to cache with.
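A common rule of thumb (from the Squid FAQ) is roughly 10 MB of RAM
per GB of disk cache for the in-core metadata index, plus cache_mem
and some OS overhead. A rough back-of-the-envelope sketch, where all
the default figures are assumptions for illustration:

```python
def estimated_ram_mb(cache_gb, cache_mem_mb=256, os_overhead_mb=128,
                     index_mb_per_gb=10):
    """Rough RAM estimate for a Squid box.

    Uses the ~10 MB-of-RAM-per-GB-of-cache rule of thumb for the
    in-memory object index. All figures are approximations, not
    guarantees; real usage depends on mean object size and build.
    """
    return cache_gb * index_mb_per_gb + cache_mem_mb + os_overhead_mb

# e.g. a 24 GB disk cache:
print(estimated_ram_mb(24))  # 24*10 + 256 + 128 = 624 MB
```

If the result exceeds physical RAM, the index gets paged out and
performance collapses, which is the "minimum limit" mentioned above.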

A good tool for measuring how well your system keeps up is Web
Polygraph. See http://polygraph.ircache.net/

--
Henrik Nordstrom
Squid hacker