Re: Forced URL rechecking...

From: Oskar Pearson <oskar@dont-contact.us>
Date: Wed, 9 Oct 1996 09:43:37 +0200 (GMT)

Dancer wrote:
>
> Ed Knowles wrote:
> >
> > G'day All!
> >
> > > The Netscape Proxy server has a setting that will force a check of
> > > material before delivery to a client. We had it set to about a
> > > half hour. But I can find no similar setting in Squid.
> > >
> > > Why is it important? For instance, cnn.com is constantly updating
> > > its information. I find with the Netscape Proxy server that if
> > > the page is updated, we'll see that update within minutes because of
> > > the 30-minute age setting before a recheck. With Squid, no matter

> > The 'best/only' solution is to encourage sites to send out Expires: headers,
> > and/or the soon-to-be-approved HTTP 1.1 cache/proxy features. Squid could
> > guess as much as it likes, but that is all it would be: guessing.
> > Many sites like having lots of hits, and seem to do their best not to help
> > with even guessing a TTL; i.e. Last-Modified: headers are not sent out.
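
For what it's worth, the sort of thing Ed is describing would look roughly
like this in a server's response headers (an illustrative sketch only; the
dates and the max-age value below are made up, and Cache-Control: max-age is
the relative-lifetime mechanism from the HTTP/1.1 drafts):

    HTTP/1.0 200 OK
    Date: Wed, 09 Oct 1996 07:43:37 GMT
    Last-Modified: Tue, 08 Oct 1996 22:00:00 GMT
    Expires: Wed, 09 Oct 1996 08:13:37 GMT
    Cache-Control: max-age=1800

Expires: gives an absolute time after which a cache should revalidate the
object, and max-age=1800 is the same half hour expressed in seconds. With
either of these present, a proxy doesn't have to guess at all.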

> Honestly, our users have gotten used to that over the last year or so,
> and the somewhat reduced need to reload is much welcomed. Mind you,
> _we_ don't have any expiry meta-info on our pages, either. Something I
> guess we should correct. :)

Does Squid actually scan through the text of each document, searching for
META tags?

A "grep -i meta *" in the source directory only seems to return a structure
called meta.* This leads me to suspect that it only handles the tags
returned by the server in the HTTP header... Not people putting text
in the html source...
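
To make the distinction concrete (the values below are made up for
illustration): a page author can embed something like

    <META HTTP-EQUIV="Expires" CONTENT="Wed, 09 Oct 1996 08:13:37 GMT">

in the HTML <HEAD>, whereas the server itself sends a real header such as

    Expires: Wed, 09 Oct 1996 08:13:37 GMT

before the document body. A proxy sees the latter for free; to honour the
former it would have to parse the HTML, which is what I'm asking whether
Squid actually does.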

Am I wrong?

Oskar
Received on Wed Oct 09 1996 - 00:43:49 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:33:15 MST