Re: Forced URL rechecking...

From: Christian Balzer <cb@dont-contact.us>
Date: Wed, 9 Oct 1996 11:01:05 +0200 (MET DST)

Ed Knowles wrote:
>
>G'day All!
>
[A rather high concentration of Aussies and Canucks here. ;-) Not a big
surprise, though. Folks who dangle off meager links have the highest
interest in a working cache. ^_^]

[...]
>
>The 'best/only' solution is to encourage sites to send out Expires: headers,
>and/or the soon to be approved HTTP 1.1 cache/proxy features. Squid could guess
>as much as it likes, but that is all it would be, guessing.
>
True, but guessing is the only thing it can do in such a case...
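
Just to illustrate what those headers buy us, a reply that spares the cache
any guessing could look roughly like this (the dates are made up):

  HTTP/1.0 200 OK
  Date: Wed, 09 Oct 1996 09:00:00 GMT
  Last-Modified: Tue, 08 Oct 1996 17:30:00 GMT
  Expires: Thu, 10 Oct 1996 09:00:00 GMT
  Content-Type: text/html

With an explicit Expires: the proxy knows exactly how long the object stays
fresh; with only Last-Modified: it can at least estimate a TTL as some
percentage of the object's age.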

>Many sites like having lots of hits, and seem to do their best not to help
>with even the guessing of a TTL; i.e. Last-Modified: headers are not sent out.
>
Well, if a site (like cnn.com or such) wants its pages to be current,
it will supply all the headers a cache/proxy needs to do the
"right thing" (tm). Basically I would configure squid to cache pages from
a site that doesn't supply these headers for a _long_ time, since that
would spoil their hit addiction. ^_- :P
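
With a squid that understands the refresh_pattern directive, such a policy
might look something like this in squid.conf (the site name is made up; the
first and last numbers are the minimum and maximum TTL in minutes, the middle
one a percent-of-age factor, and all of them are a matter of taste):

  # keep objects from a header-less, hit-hungry site around for up to a week
  refresh_pattern -i ^http://www\.hit-hungry\.example\.com/ 10080 50% 10080
  # a more conservative default for everything else
  refresh_pattern . 0 20% 4320

The long minimum TTL means squid will happily serve those pages from the
cache for days without bothering the origin server again.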

Mata ne,

<CB>

-- 
  // <CB> aka Christian Balzer, Tannenstr. 23c, D-64342 Seeheim, Germany
\X/  CB@brewhq.swb.de | Voice: +49 6257 83036, Fax/Data: +49 6257 83037
SWB  - The Software Brewery - | Team H, Germany HQ | Anime no Otaku
Received on Wed Oct 09 1996 - 02:05:03 MDT
