cache-proof servers?

From: Ira Abramov <>
Date: Wed, 24 Jul 1996 21:33:25 +0300 (IDT)

On Wed, 24 Jul 1996 wrote:

> > Accessing home pages from Netscape, AltaVista and some
> > other big brothers, Squid-1.0.3 is checking my parent for these
> > pages every time!
> Even "big" sites with high load fail to provide at least
> "Last-Modified" headers, which are crucial for caching. This is
> e.g. true for many of Netscape's own pages. So it seems they're not
> afraid of burning bandwidth...

I have a setting for my Apache (sorry, no experience with other servers)
that controls whether the server's served docs are cachable or not. I
suppose it's a header thingy you can enable to tell the cache server NOT
to cache certain pages, like when the same URL produces random pages, or
random ads (on commercial sites like Geocities and others, an
advertisement changes on each reload).
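For what it's worth, on the wire that probably looks something like the
response below. This is just a sketch, not copied from my actual config:
with an HTTP/1.0 cache, "Pragma: no-cache" and an Expires date that is
already past are the usual ways to mark a response as uncachable, while
a Last-Modified header is what lets the cache revalidate cheaply.

```
HTTP/1.0 200 OK
Date: Wed, 24 Jul 1996 21:33:25 GMT
Content-Type: text/html
Pragma: no-cache
Expires: Wed, 24 Jul 1996 21:33:25 GMT
```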

I have an even weirder problem, though. I first started using Apache
1.1.1's cache+proxy, but it slowed me down on some sites and I couldn't
see others. Now I see this happening again, on many of the same sites,
with Squid! (can't remember if cached 1.4pl3 did it too, but it would
take 3 minutes to check).

simplest test:
from my node ( to netscape's site, and load at similar speeds. When I set Netscape (v3b5,
Win95) to go through my Squid (Linux 2.0), doesn't
answer HTTP requests (but Netscape doesn't stop on timeout...). is fine...

from access.log:

 - - [24/Jul/1996:12:29:45 +0300] "GET" TCP_MISS 116
 - - [24/Jul/1996:12:30:00 +0300] "GET" TCP_MISS 116
 - - [24/Jul/1996:12:30:08 +0300] "GET" TCP_EXPIRED 116
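If anyone wants to check how widespread this is in their own logs, a
quick tally of the TCP_* result codes shows how often the cache is
missing or revalidating. A minimal sketch (the function name and the
assumption that the result code appears as a bare TCP_* token in each
line are mine, not from any Squid tool):

```python
# Tally Squid result codes (TCP_MISS, TCP_HIT, TCP_EXPIRED, ...)
# from access.log lines. Assumes the code appears as a TCP_* token
# somewhere in each line, as in the native Squid log format.
import re
from collections import Counter

def tally_results(lines):
    """Count the TCP_* result code found in each access.log line."""
    counts = Counter()
    for line in lines:
        m = re.search(r'\b(TCP_[A-Z_]+)\b', line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

Feeding it the three log lines above would report two TCP_MISS entries
and one TCP_EXPIRED.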

Anyone have any idea?! Who knows how many more sites have this problem.

(yo, Mulder! come here, I have something for ya!)

   Ira Abramov <> Scalable Solutions
   Beeper 48484 at 03-610-6666 FAX (972)2-430-471
   POBox 3600 Tel (972)2-6426822
   Jerusalem 91035, Israel
Received on Wed Jul 24 1996 - 11:28:22 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:32:41 MST