Re: Squid 2.2.STABLE2 and ACLs

From: as web server manager <webadm@dont-contact.us>
Date: Tue, 4 May 1999 23:46:47 +0100 (BST)

Tim Burgess wrote:
>
> We have a pornography etc blocking set of ACLs and we get a similar list of
> errors when we start. But it still works - so who cares?
>
> Sure, the errors may be misleading - but squid still runs and the ACLs work.

Did you check that it really did work (rather than simply not crash), in all
cases? The warning messages, and comments I received about a related issue,
made me rather wary; the impression given was that, depending on "random
factors", the lookup might not match in the way you'd expect, according to how
building the splay tree structure works out for the particular entries.
That could mean it works one day, but misses some matches the next, after
the list has been updated.
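For concreteness, the sort of overlap in question looks something like this in a
file referenced by a dstdomain acl (hypothetical entries; as I understand it, a
leading dot makes an entry match the domain and all its subdomains, so the later
entries are redundant and provoke the warnings at startup):

```
# e.g. referenced from squid.conf as:
#   acl blocked dstdomain "/etc/squid/blocked-sites.txt"
.example.com
# redundant: already covered by .example.com above
www.example.com
# exact duplicate of the first entry
.example.com
```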

Actually, since Duane Wessels mailed me a reply indicating it *was* just an
incorrect diagnostic (with a patch to try), you would have been right - but
I'd have been wary of taking the chance without checking that it was going
to work in general, not just with specific entries in the tree. [I've
confirmed that the patch worked, except for a misleadingly-worded warning
for an exact duplicate entry, which was reported as a subdomain, so the
patch or some variant of it will presumably appear in future versions and
maybe as a freestanding patch.]

In response to a suggestion by private mail from someone else, noting that I
could avoid the problem by using multiple acl definitions: those would have
to be in the main config file, whereas I was using an acl referencing a file
precisely to avoid frequent edits to the main config (possibly by a variety
of people). It would also be messy splitting the entries into the minimal
number of definitions that avoided conflicts. Not an issue anyway, to the
extent that the specific problem was due to a bug.

However, as noted in my reply to Duane, I'd be happier if Squid
automatically ignored exact duplicates and hosts/subdomains covered by
higher-level domain entries in an ACL definition, since that would make it
easier and less error-prone to maintain long lists of unrelated special
cases which could by chance be identical or overlap. Squid should be
well-placed to handle that automatically (though the current implementation
may, of course, make it less easy than it would appear), whereas doing it by
hand (reinstating commented-out entries when conflicting entries are deleted,
or whatever) would be error-prone, and writing checking scripts to spot
problems or derive a "sanitised" version from a master file seems like the
wrong way to do it (complicating things unnecessarily)...
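(For what it's worth, the kind of "checking script" I mean would only need a
few lines. A minimal sketch, in Python, assuming the file holds one entry per
line in the usual dstdomain form, where a leading dot means "this domain and
all subdomains"; the function names here are my own invention, not anything
shipped with Squid:)

```python
def covered_by(entry, dotted):
    """Return the ".parent" entry that already covers `entry`, if any.

    e.g. "www.example.com" (or ".www.example.com") is covered by
    ".example.com" when the latter is present in `dotted`.
    """
    labels = entry.lstrip(".").split(".")
    # Check every proper suffix of the name against the ".domain" entries.
    for i in range(1, len(labels)):
        parent = "." + ".".join(labels[i:])
        if parent in dotted:
            return parent
    return None

def check_domains(entries):
    """Report exact duplicates and entries shadowed by higher-level ones."""
    dotted = {e for e in entries if e.startswith(".")}
    seen = set()
    problems = []
    for e in entries:
        if e in seen:
            problems.append((e, "duplicate"))
            continue
        seen.add(e)
        parent = covered_by(e, dotted)
        if parent:
            problems.append((e, "covered by " + parent))
    return problems
```

Deriving a "sanitised" list would then just be a matter of dropping the
flagged entries - but as I say, having Squid quietly do the equivalent itself
would be the cleaner solution.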

                                John Line

-- 
University of Cambridge WWW manager account (usually John Line)
Send general WWW-related enquiries to webmaster@ucs.cam.ac.uk
Received on Tue May 04 1999 - 16:29:49 MDT