Re: Querystring vs. Squid Cacheserver

From: David J Woolley <djw@dont-contact.us>
Date: Tue, 2 Mar 1999 12:33:55 +0000

> I'm serving city maps. While the number of output pages is circa
> infinite for all practical purposes, they accumulate around the same
> places.

The company that does this in the UK appears to want its pages to be
forced dynamic, generating a new serially numbered page for each
access. This seems to be intended to stop people bookmarking the page
or linking to it and thus bypassing the adverts. In fact, we
originally had a problem with a CERN cache in that it was making the
maps cacheable, and we ended up accumulating large numbers of
identical GIFs; they subsequently fixed this.

> handler simulates what you are suggesting. I eliminate the
> questionmark and suddenly all ignorant cacheservers cache me.

The reason is that there are a lot more ignorant CGI script writers
who have misused GET-mode forms for applications that do not behave
as pure functions (i.e. the returned page depends only on the
parameters, not on past history, and there are no side effects). As
a result, I believe that the HTTP and/or HTML specifications now
consider it unsafe to cache any forms URL, although the HTTP
specification does allow this to be overridden by including an
explicit expiration time (section 13.9).
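To illustrate the override described above, here is a minimal sketch
(in Python, purely hypothetical -- the original maps would have been
served by some other CGI) of building a response for a query-string
URL that includes an explicit Expires header, which is what lets a
cache store it despite the "?" in the URL:

```python
#!/usr/bin/env python3
# Hypothetical sketch: a CGI-style response for a query-string URL
# such as /map?x=1&y=2.  Under HTTP/1.1 section 13.9, caches should
# not treat a "?" URL as fresh unless the origin supplies explicit
# freshness information, so we attach an Expires header set one day
# in the future.
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def map_response(body: str, ttl_hours: int = 24) -> str:
    """Build the header block plus body for a cacheable map page."""
    expires = datetime.now(timezone.utc) + timedelta(hours=ttl_hours)
    headers = [
        "Content-Type: image/gif",
        # The explicit expiration overrides the default
        # "don't cache query-string URLs" heuristic.
        "Expires: " + format_datetime(expires, usegmt=True),
    ]
    # CGI output: headers, blank line, then the entity body.
    return "\r\n".join(headers) + "\r\n\r\n" + body

if __name__ == "__main__":
    print(map_response("<gif bytes would go here>"))
```

A well-behaved cache seeing this response may then serve repeated
requests for the same coordinates without re-contacting the origin
until the Expires time passes.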

A lot of things that are done on the web are to work around
entrenched abuses.

Note this may fail to post to the Perl/Apache list.

-- 
David Woolley - Office: David Woolley <djw@bts.co.uk>
BTS             Home: <david@djwhome.demon.co.uk>
Wallington      TQ 2887 6421
England         51  21' 44" N,  00  09' 01" W (WGS 84)
Received on Tue Mar 02 1999 - 06:55:53 MST

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:45:06 MST