capacity planning question (fwd)

From: Terence Kelly <tpkelly@dont-contact.us>
Date: Fri, 17 Dec 1999 19:16:34 -0500 (EST)

I tried posting this to squid-users but the message bounced. I'm not
sure whether my attempt to subscribe & re-post succeeded, so I'm
passing this question along to you manually. It isn't addressed
in the Squid FAQ or, in any systematic way, in the Web caching
literature. Thanks in advance for any wisdom you can offer.

---------- Forwarded message ----------
Date: Fri, 17 Dec 1999 09:05:43 -0800 (PST)
From: squid-users-request@ircache.net
To: tpkelly@eecs.umich.edu
Subject: Re: capacity planning question

Is there a formula, conventional wisdom, or folklore regarding how to
determine an appropriate size for a Web cache like Squid based on
expected workload, i.e., how much RAM & disk should be devoted to
storing cached documents? Anecdotal evidence related to
this question is available, e.g., at

   http://wwwcache.ja.net/servers/squids.html

However, I haven't been able to infer from scattered data like this
the principles that admins apply when deciding how much RAM & disk is
appropriate for a cache serving a given workload. I'm especially
interested in academic literature on the subject, but rules of thumb
that sysadmins use would be appreciated too.
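
To make concrete the kind of answer I'm after, here is a
back-of-the-envelope sketch (in Python) of how such a rule might
look. The figures are purely illustrative assumptions on my part --
the mean object size, per-object index overhead, and retention
period are guesses, not measured values, and the function name is my
own invention:

# Hypothetical sizing sketch; all constants below are illustrative
# assumptions, not measurements or recommendations.
def estimate_cache_size(daily_traffic_gb, retention_days=7,
                        mean_object_kb=13.0, index_bytes_per_object=100):
    """Rough disk and index-RAM estimate for a Squid-style cache.

    daily_traffic_gb       -- client traffic volume per day (GB), assumed
    retention_days         -- days of traffic the disk cache should hold
    mean_object_kb         -- assumed mean size of a cached object (KB)
    index_bytes_per_object -- assumed in-core metadata per on-disk object
    """
    disk_gb = daily_traffic_gb * retention_days
    # Convert disk size GB -> KB, divide by mean object size to count objects.
    n_objects = disk_gb * 1024 * 1024 / mean_object_kb
    # RAM needed just to index the on-disk objects (hot-object memory,
    # e.g. Squid's cache_mem, would be on top of this).
    index_ram_mb = n_objects * index_bytes_per_object / (1024 * 1024)
    return disk_gb, index_ram_mb

disk_gb, index_ram_mb = estimate_cache_size(daily_traffic_gb=20)
print("disk: %.0f GB  index RAM: %.0f MB" % (disk_gb, index_ram_mb))

With these made-up numbers a cache seeing 20 GB/day and keeping a
week of traffic would need about 140 GB of disk and on the order of
1 GB of RAM just for the index. What I'd like to know is whether
anything like this is actually used in practice, and where the
constants come from.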

Please reply directly to tpkelly@eecs.umich.edu. Thanks in advance
for any pointers you can provide.