RE: CGI, Cookies and caching...

From: Redfern Ian <RedfernI@dont-contact.us>
Date: Thu, 4 Sep 1997 10:39:00 +0100

It's interesting to look at sites' robots.txt files to see what should
be excluded from caching. For example,
http://www.atvantage.com/robots.txt has
# shoo them bots

User-agent: *
Disallow: /

so they probably don't want anything to be cacheable either. By contrast,
http://www.microsoft.com/robots.txt has
# robots.txt for http://www.microsoft.com/
# do not delete this file, contact Mark Ingalls for edits!!!!
#

User-agent: *
Disallow: /isapi/ # keep robots out of the executable tree
Disallow: /scripts/

which gives a good (if now obsolete) hint as to what not to cache there.

Wouldn't it be great if some of the effort expended on CDF and
robots.txt could be spent on defining what is cacheable for a site? In
fact, couldn't some of the information in a CDF file be used to
implement a site-specific caching strategy? A vain hope, I guess.

Ian Redfern (redferni@logica.com).
Received on Thu Sep 04 1997 - 02:53:42 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:36:55 MST