Forwarding loops with squid 2.0 RELEASE

From: John Sloan <johns@dont-contact.us>
Date: Mon, 19 Oct 1998 15:50:51 +0100 (BST)

We have 4 caches. One is internal, and parents through the other three.
I have it set to use all 3 external caches as a parent with cache digests,
and this appears to be working fine.

The other three caches used to be ICP neighbours (till I decided that the
latency overhead was too high). I wish to try to set these up as
cache_digest neighbours.
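For reference, what I mean by that is something along these lines on each
cache (a sketch only; the no-query option suppresses ICP queries so the
digests do the peer selection, and it assumes the binaries were built with
digest support):

```
# On mist, for example -- hypothetical fragment, ports as in our setup:
cache_peer hail.pipex.net sibling 3128 3130 no-query
```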

I tried to do this all at once, and ran into a storm of loop errors in
cache_log on all three external caches. I backed off quickly.

Testing more gently this time, I set just two of the caches, mist and
hail, to peer with each other. I still got loop errors:

1998/10/19 15:22:22| WARNING: Forwarding loop detected for
'http://www.webcrawler.com/'
1998/10/19 15:22:22| --> 1.0 hail.pipex.net:3128 (Squid/2.0.RELEASE), 1.0
mist.pipex.net:3128 (Squid/2.0.RELEASE)
1998/10/19 15:22:23| WARNING: Forwarding loop detected for
'http://pussylips.fsn.net/tru02.gif'
1998/10/19 15:22:23| --> 1.0 hail.pipex.net:3128 (Squid/2.0.RELEASE), 1.0
mist.pipex.net:3128 (Squid/2.0.RELEASE)

Having checked the FAQ and release notes, I can confirm there are no
never_direct directives on any of the external caches, and I have commented
out all visible_hostname and unique_hostname entries I had. Still the
warnings persist.
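(My understanding is that the loop check compares hostnames in the Via
header, so each cache needs a name the others won't claim. With the entries
commented out Squid should fall back to the machine's own hostname; the
alternative I considered was setting them explicitly, roughly like this
hypothetical fragment, one distinct value per box:)

```
# On hail only -- each cache gets its own value:
unique_hostname hail.pipex.net
```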

Is this a bug or an ignorable warning? If the latter, then I wish it
would be quieter about it, if only for the sake of my poor log disk. :)

John Sloan
UK Unix Systems Administrator
UUNET Worldcom
Received on Mon Oct 19 1998 - 08:54:58 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:42:34 MST