RE: [squid-users] Banning all other destinations

From: Chris Robertson <crobertson@dont-contact.us>
Date: Thu, 10 Feb 2005 11:53:30 -0900

> -----Original Message-----
> From: johnsuth@acenet.com.au [mailto:johnsuth@acenet.com.au]
> Sent: Thursday, February 10, 2005 8:40 AM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Banning all other destinations
>
>> ACLs don't seem to be checked when squid serves cached content (likely
>> in the interest of speed).
>
> Many thanks Chris for your generous offer and your suggestions. Also to
> Henrik for clarifying the structure of URLs. My frustration, which I
> tried to conceal in my posts, but which you detected, was due not to the
> unwanted object being in cache, but to its sailing past what I thought
> were three consecutive deny-by-default rules. I even inverted the
> permission in the last line, although it goes against my grain to write
> allow when I mean deny. Remember also, these rules were added to shore
> up the s8 build of Squid. They were not needed for the s5 build, where
> ACLs worked perfectly.
>
> I tested my rules again by dreaming up a URL which was definitely not in
> cache, www.elephants.com. Alas, there is such a site, and it is on
> screen and in cache now. It sailed past my deny-by-default rules.
>
> Is my syntax so bad that not one of these rules will deny all
> destinations not previously allowed?
>
> Is it OK to interleave definitions and rules in squid.conf to make it
> easier for me and others to follow my logic? e.g.:
>
> acl government dstdom_regex -i .gov
> http_access allow government
> acl education dstdom_regex -i .edu
> http_access allow education
> acl google dstdomain .google.com.au
> http_access allow google
> acl ip dst 0.0.0.0/0.0.0.0
> http_access deny ip       Does this not deny all LAN and WAN destinations?
> acl http proto HTTP
> http_access deny http     Does this not deny all HTTP requests?
> acl www url_regex -i www.
> http_access allow www     Does this not deny all www URLs?
>
> In my discussions with Squid users in the OS/2 Usenet, I learned that
> they all stick with old versions of Squid because they ain't broke. It
> seems to me that both binaries I have tried are working without error,
> but not as intended. Have I just had bad luck in selecting recent
> binaries to work on my particular system?
>
> No one's honour is at stake here. I am just hoping that people with far
> more Squid experience than I can point me to the cause of what seems to
> be an astoundingly elementary problem, when compared with the other
> challenges discussed in this mailing list.

Interleaving the acls and http_access lines should work just fine. I'd
change the dstdom_regex to dstdomain because, as it stands now, anything
with ".gov" anywhere in the domain (where the dot can represent any
character, e.g. thegovernator.com) will be allowed through. The same goes
for the .edu acl. Neither of which explains how you managed to surf to
www.elephants.com. The "http_access allow www" line would let you, but it
should never be reached (all traffic should already be blocked by the
"deny ip" line). Replacing the three catch-all lines at the end (and
their associated acls) with a single "http_access deny all" would also
make things clearer.
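
Something like this is what I have in mind. Untested sketch only; the
"all" acl is just the stock definition from the default squid.conf, and
your own allow rules carry over as they are:

    acl all src 0.0.0.0/0.0.0.0
    acl government dstdomain .gov
    acl education dstdomain .edu
    acl google dstdomain .google.com.au

    http_access allow government
    http_access allow education
    http_access allow google
    http_access deny all

With dstdomain, ".gov" matches the gov TLD and everything under it and
nothing else, and the final "deny all" catches every request the allow
lines above didn't.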

Two choices here. Post your whole squid.conf file (preferably minus
comments and blank lines), or use Squid's native debugging capabilities.
Setting "debug_options ALL,1 33,2" will give a pretty good step-by-step
account of how Squid acts on ACLs for each web request (the output goes
to cache.log). Be aware that the output is quite verbose, so it's not
something you'll want to leave on a production server. At least not for
long.
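
If it helps, the line goes straight into squid.conf; section 33 covers
the client-side request handling, which at level 2 reports whether each
request was allowed or denied and which acl it matched. Remove the line
again once you have the trace you need:

    # Temporary ACL tracing: everything at level 1,
    # client-side request checks (section 33) at level 2.
    debug_options ALL,1 33,2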

One last question... Are you telling Squid to reconfigure (or restart)
after each change to the config file? It may be obvious, but it never hurts
to ask.
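
For reference, the usual way to make a running Squid re-read its
configuration is the following (assuming the squid binary is on your
PATH; the OS/2 port may spell the invocation differently):

    squid -k reconfigure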

Chris
Received on Thu Feb 10 2005 - 13:53:32 MST
