Re: [squid-users] Caching Web sites

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Tue, 11 Dec 2001 09:55:01 +0100

Andrew Reid wrote:

> What's probably required is a module or add-on to Squid which
> maintains its own internal database of frequently used sites. Part of
> this functionality should include an automated daemon that
> periodically updates that list so that it always has a fresh
> copy (e.g., when an object expires in the cache that is marked as
> "Frequently Used", the daemon must retrieve a fresh version of the
> object).

Be warned that not all sites are happy with such use. You are now
entering the border zone between "human" and "robot" use of a site,
where opinions differ.
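
Purely as an illustration, the kind of refresher described above could
be little more than a loop like the sketch below. The proxy address,
URL list and interval are assumptions, and note that it makes no
attempt to respect the wishes of the site operators:

import time
import urllib.request

PROXY = {"http": "http://localhost:3128"}    # assumed Squid address
FREQUENT_URLS = ["http://www.example.com/"]  # hypothetical "frequently used" list
INTERVAL = 3600                              # refresh once an hour (illustrative)

# Fetching through the cache makes Squid revalidate or refetch any
# object that has gone stale since the last request.
opener = urllib.request.build_opener(urllib.request.ProxyHandler(PROXY))

while True:
    for url in FREQUENT_URLS:
        try:
            opener.open(url, timeout=30).read()
        except OSError as exc:
            print("refresh of %s failed: %s" % (url, exc))
    time.sleep(INTERVAL)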

> Even that has issues to think about, though they could probably be
> worked around through configuration file options and such. What
> happens if an object that is marked as "Frequently Used" expires
> every 5 minutes? You'll be getting a fresh copy so regularly it may
> have an impact on your bandwidth bill.

Depends on your "refresh policy".
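
In squid.conf that is controlled by the refresh_pattern rules. A rough
sketch (the site and the numbers are purely illustrative):

# objects from this site that carry no explicit expiry information are
# considered fresh for at least an hour and at most a day
refresh_pattern -i ^http://www\.example\.com/  60  50%  1440
# the usual default for everything else
refresh_pattern .                               0  20%  4320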

> I'd be interested in having a play with the concept, possibly coming
> up with a prototype and going from there. There are some fairly large
> issues which need to be explored first, though.

You are welcome.

Regards
Henrik