Re: [squid-users] remote 403 error through squid

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Sun, 11 Sep 2005 23:04:07 +0200 (CEST)

On Sun, 11 Sep 2005, Merton Campbell Crockett wrote:

> Is there an inverse max-age dependency? The behaviour of the VATLogic and
> Murfreesboro web sites occurs regardless of max-age. Both sites return a
> 403 (Forbidden) status when the URL references DocumentRoot.

For me, VATLogic shows the exact same symptoms. With a high max-age a 403
is returned; a low max-age, no max-age at all, or a max-age combined with
Pragma: no-cache works fine.

> There may be an inverse max-age dependency but in these two instances I
> suspect that it is a "red herring". There is a simpler answer. Access is
> being denied because the request appears to be attempting to retrieve a
> directory listing.

I have no problem reaching this site directly, only via Squid. It also
works fine via Squid if the refresh_pattern workaround mentioned earlier
is used.
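
The exact directive from that earlier message is not repeated here; purely
as an illustration of the general form a refresh_pattern rule takes (the
regex and numbers below are made up for the example, not the actual
workaround):

# refresh_pattern [-i] regex min-minutes percent max-minutes [options]
refresh_pattern -i ^http://www\.vatlogic\.com/ 0 20% 4320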

Works:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\n" /

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=0\n" /

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=20000\n" /

Fails:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" /

Works:

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=20000\nPragma: no-cache\n" /

If you try different max-age values you will find that the limit increases
by one each second.
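
A rough, untested shell loop along these lines makes the drift easy to
see; it prints only the status line for each max-age value, and the point
where the response flips from 200 to 403 creeps upward as time passes:

for age in 190000 195000 200000 205000 210000; do
  printf "max-age=%s: " $age
  squidclient -h www.vatlogic.com -p 80 \
    -H "Host: www.vatlogic.com\nCache-Control: max-age=$age\n" / | head -1
done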

On all three sites the pattern is the same, and it only applies to
"directory" URLs. If the URL is modified in any manner (adding query
parameters, or even ending in a double //), the problem is not triggered.
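
For example, with the large max-age that fails above, both of the
following variants should go through (assuming the same pattern holds):

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" //

squidclient -h www.vatlogic.com -p 80 -H "Host: www.vatlogic.com\nCache-Control: max-age=200000\n" /?x=1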

Regards
Henrik