Re: Forced URL rechecking...

From: Ed Knowles <ed@dont-contact.us>
Date: Wed, 9 Oct 1996 09:37:58 -0500

G'day All!

> The Netscape Proxy server has a setting that will force a check of
> material before delivery to a client. We had it set to about half an
> hour. But I can find no similar setting in Squid.
>
> Why is it important? For instance, cnn.com is constantly updating
> its information. With the Netscape Proxy server, if the page is
> updated we see that update within minutes, because of the 30-minute
> age setting before a recheck. With Squid, no matter what I tried,
> we kept getting the old page and users were forced to request a
> refresh to bring in the new material.
>
> Is there a setting I've missed?

The 'best/only' solution is to encourage sites to send out Expires: headers,
and/or to use the soon-to-be-approved HTTP 1.1 cache/proxy features. Squid
could guess as much as it likes, but that is all it would be: guessing.
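To give a concrete (made-up) example, a news page that wanted caches to
recheck it every 30 minutes could send headers along these lines; the
Cache-Control: line is the HTTP 1.1 way of saying the same thing as Expires::

    HTTP/1.0 200 OK
    Date: Wed, 09 Oct 1996 14:30:00 GMT
    Last-Modified: Wed, 09 Oct 1996 14:25:00 GMT
    Expires: Wed, 09 Oct 1996 15:00:00 GMT
    Cache-Control: max-age=1800

With an Expires: header like that, any cache knows exactly when to go back to
the origin server; no guessing required.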

Many sites like having lots of hits, and seem to do their best not to help
with even guessing a TTL; i.e. they do not send out Last-Modified: headers.
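That guessing can at least be tuned. If your Squid supports the
refresh_pattern directive in squid.conf (an assumption about your version),
something like the following sketch would cap the guessed freshness of
cnn.com pages at 30 minutes:

    # refresh_pattern [-i] regex   min  lm-factor  max   (min/max in minutes)
    refresh_pattern -i cnn\.com/   0    20%        30
    refresh_pattern .              0    20%        4320

min and max bound the guessed freshness, and lm-factor is the percentage of
the object's age since its Last-Modified: time that is used when the server
sends no Expires: header. It is still only a guess, which is the point above.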

Later
Ed

-- 
Ed Knowles aka Jasper				         Phone : +61 2 9385 4962
E-mail: ed@fatboy.geog.unsw.edu.au	                 Fax   : +61 2 9313 7878
                 What I lack in morals I make up in principles.