Re: Forced URL rechecking...

From: Dancer <program@dont-contact.us>
Date: Wed, 09 Oct 1996 10:15:11 +1000

Ed Knowles wrote:
>
> G'day All!
>
> > The Netscape Proxy server has a setting that will force a check of
> > material before delivery to a client. We had it set to about a half
> > hour, but I can find no similar setting in Squid.
> >
> > Why is it important? For instance, cnn.com is constantly updating
> > its information. With the Netscape Proxy server, if the page is
> > updated we see that update within minutes, thanks to the 30-minute
> > age setting before a recheck. With Squid, no matter what I tried,
> > we kept getting the old page and users were forced to request a
> > refresh to bring in the new material.
> >
> > Is there a setting I've missed?
>
> The 'best/only' solution is to encourage sites to send out Expires: headers
> and/or to adopt the soon-to-be-approved HTTP/1.1 cache/proxy features. Squid
> could guess as much as it likes, but that is all it would be: guessing.
>
> Many sites like having lots of hits, and seem to do their best not to help
> with even guessing a TTL; i.e., Last-Modified: headers are not sent out.
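
For illustration, here's roughly what helpful headers would look like. The
dates and values below are invented, and the Cache-Control: line is from my
reading of the HTTP/1.1 draft, so treat this as a sketch:

    HTTP/1.0 200 OK
    Date: Wed, 09 Oct 1996 00:10:00 GMT
    Last-Modified: Tue, 08 Oct 1996 23:55:00 GMT
    Expires: Wed, 09 Oct 1996 00:40:00 GMT
    Content-Type: text/html

    (under the HTTP/1.1 draft, the same hint could be given as)
    Cache-Control: max-age=1800

With an Expires: or max-age like that, a cache knows the page is good for
thirty minutes and needn't guess at all.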

Despite the lack of expiry meta-info in document headers, I've noticed
that Squid seems to deal better with frequently changing documents than
other proxies do. I use Netscape 3.0 (release), and I find myself doing
significantly fewer reloads than with (say) Roxen (Spinner), CERN, or
Apache as proxies.
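
On the original question: if your Squid is new enough to have the
refresh_pattern directive (I believe it arrived around 1.1; older versions
had ttl_pattern instead), you can approximate Netscape's 30-minute
behaviour. A sketch, untested here, with the regexes and numbers chosen
purely as an example:

    # squid.conf: refresh_pattern regex min percent max  (min/max in minutes)
    # Consider anything from cnn.com stale after at most 30 minutes, so
    # Squid rechecks it with the origin server before serving it again.
    refresh_pattern ^http://cnn.com/        0  20%    30
    refresh_pattern ^http://www.cnn.com/    0  20%    30
    # Catch-all for everything else: fresh for up to three days.
    refresh_pattern .                       0  20%  4320

The 20% is the guessing rule: with no Expires:, an object is treated as
fresh for 20% of the time since its Last-Modified date, clamped between
the min and max columns.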

Honestly, our users have gotten used to that over the last year or so,
and the somewhat reduced need to reload is very welcome. Mind you, _we_
don't put any expiry meta-info on our pages, either. Something I guess
we should correct. :)

D
Received on Tue Oct 08 1996 - 17:09:58 MDT
