RE: [squid-users] Squid dstdomain ACL

From: Mike McCall <mccall@dont-contact.us>
Date: Fri, 12 Dec 2003 10:29:45 -0700

> On Fri, 12 Dec 2003, Mike McCall wrote:
>
> > All,
> >
> > I have a fairly busy cache using native squid ACLs to block access
> > to certain sites using the dstdomain ACL type. This is fine for
> > denying access to sites like www.playboy.com, but doesn't work when
> > people use google's cache of pages and google images, since the
> > domain becomes www.google.com.
> >
> > My question: is there an ACL that will deny both
> > http://www.playboy.com and
> > http://www.google.com/search?q=cache:www.playboy.com/?
> >
> > I know regexes might be able to do this, but will there be a
> > performance hit?
>
> You have (at least) two options:
>
> 1) use the 'url_regex' type to block hostnames that appear anywhere
> in the URL, like:
>
> acl foo url_regex www.playboy.com
>
> The "performance hit" depends on the size of your regex list and the
> load on Squid. If Squid is not currently running at, say, more than
> 50% of CPU usage, you'll probably be fine.
>
> 2) Use a similar ACL to block all google cache queries:
>
> acl foo url_regex google.com.*cache:
>
> Duane W.
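
For concreteness, here is how the two options above might be wired into
squid.conf; the ACL names, file path, and rule placement below are
illustrative guesses, not from Duane's message. Note also that an
unescaped dot in a regex matches any character, so escaping the dots
(www\.playboy\.com) keeps a pattern from matching lookalike hosts:

  # Hypothetical sketch only; names and the file path are made up.
  # blocked_hosts.regex would hold one pattern per line.
  acl blocked_hosts url_regex "/etc/squid/blocked_hosts.regex"
  acl google_cache url_regex google\.com.*cache:
  http_access deny blocked_hosts
  http_access deny google_cache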

Thanks, Duane. Unfortunately, my domains list is HUGE (~600,000 domains)
and the cache already runs at 50-95% CPU during the day, most of which I
assume is due to that list. If I were to drop the dstdomain ACL and use
only url_regex, would performance stay where it is? Sadly, I can't use
the second option you mention, because google's cache is useful for
other non-offensive websites.
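
A middle ground may be worth sketching here: Squid matches dstdomain
ACLs against an indexed tree rather than scanning every pattern the way
url_regex does, so one option is to keep the huge list on dstdomain and
confine url_regex to a short, hand-picked file of cache-bypass patterns.
A minimal sketch, with all names and paths hypothetical:

  # Keep the ~600,000-entry list on the cheap, indexed dstdomain lookup.
  acl big_blocklist dstdomain "/etc/squid/blocked_domains.txt"
  # Small regex file with lines like: google\.com.*cache:.*playboy\.com
  acl cache_bypass url_regex "/etc/squid/cache_bypass.regex"
  http_access deny big_blocklist
  http_access deny cache_bypass

This would keep google's cache usable for everything except the handful
of domains explicitly listed in the small regex file.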

Mike