[squid-users] cache-control header causes site to fail

From: Graeme Bisset <gbisset@dont-contact.us>
Date: Thu, 17 Nov 2005 13:46:55 -0000

Hi,

I'm trying to access the site http://www.kinderopvang-plein.nl

It works without going through Squid, but when I do I get a 403 Forbidden
error.

I've tracked this down to the Cache-Control header that Squid adds (more
precisely, to the value it specifies) by running the following wget
commands...

This command fails...
wget --spider --header='Cache-Control: max-age=259200' www.kinderopvang-plein.nl

This command works...
wget --spider --header='Cache-Control: max-age=1000' www.kinderopvang-plein.nl
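
For reference, the same check can be done with curl so the status code is
printed directly (assuming curl is installed; this is just an equivalent way
of running the test above)...

# returns 403 here with the large max-age
curl -s -o /dev/null -w '%{http_code}\n' -H 'Cache-Control: max-age=259200' http://www.kinderopvang-plein.nl
# succeeds (presumably 200) with the smaller max-age
curl -s -o /dev/null -w '%{http_code}\n' -H 'Cache-Control: max-age=1000' http://www.kinderopvang-plein.nl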

I take it this is a problem with the web server not coping with the
higher max-age values, but is there a way to configure Squid so that it
sends smaller values by default?
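
My only guess so far is that 259200 seconds is exactly the 4320 minutes in
the default refresh_pattern rule, so the value may come from there. If that
guess is right (it's only an assumption on my part, and untested), a smaller
catch-all rule in squid.conf might look like this...

# default catch-all rule: max = 4320 minutes = 259200 seconds
# refresh_pattern . 0 20% 4320
# hypothetical smaller max: 16 minutes, roughly 1000 seconds
refresh_pattern . 0 20% 16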

Thanks in advance,

Graeme
