RE: [squid-users] squid hogging all cpu and lots of memory

From: Holton, Euan <EHolton@dont-contact.us>
Date: Mon, 5 Dec 2005 10:20:04 -0000

> These are the ACLs that give me the problems (or so I
> think, because if I comment those out it's working just
> fine):
>
> #acl hacking1 url_regex "/usr/local/squid/etc/deny/hacking1"
> #acl noporn url_regex "/usr/local/squid/etc/deny/noporn"
> #acl hacking2 url_regex "/usr/local/squid/etc/deny/hacking2"
> #acl porn_deny1 url_regex "/usr/local/squid/etc/deny/porn_deny1"
> #acl porn_deny2 url_regex "/usr/local/squid/etc/deny/porn_deny2"
> #acl proxy_deny url_regex "/usr/local/squid/etc/deny/proxy_deny"
> #acl warez_deny url_regex "/usr/local/squid/etc/deny/warez_deny"
> #acl warez_deny2 url_regex "/usr/local/squid/etc/deny/warez_deny2"
> acl download2 urlpath_regex -i "/usr/local/squid/etc/denydownload.txt"
>
> What am I doing wrong? Please help if possible. Thank you.

Your ACLs are based on regular expressions, which are documented as
being resource intensive. If those files are lengthy, then every
pattern in them has to be tested against each and every request (i.e.
every HTML page, every image and so on).
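
For reference, each of those regex ACLs is applied roughly like this
(the ACL name and file path below just mirror your config; the
http_access line is an assumption about how you are denying them):

# every line of the file is compiled into a regular expression, and
# each pattern is tried in turn against the URL of every request
acl noporn url_regex "/usr/local/squid/etc/deny/noporn"
http_access deny noporn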

If some of those files are lists of domains, rather than "bad
expressions" matched against whole URLs, one thing you could try is the
dstdomain ACL type, which is far cheaper to evaluate than a regex. One
thing to note when using it is that an ACL of

acl example dstdomain microsoft.com

will only match requests sent to microsoft.com itself, while

acl example dstdomain .microsoft.com

will match microsoft.com, www.microsoft.com, download.microsoft.com and
even linux.is.wonderful.microsoft.com. If you use this where possible,
you should find your content filtering becomes a lot easier on your
server's resources. See squid.conf.default for further documentation.
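
If your deny files mostly contain hostnames, the switch is mechanical;
dstdomain accepts an external file just like url_regex does. A minimal
sketch, assuming a hypothetical file with one domain per line:

# /usr/local/squid/etc/deny/domains might contain lines such as
#   .badsite.example
#   .warez.example
acl deny_domains dstdomain "/usr/local/squid/etc/deny/domains"
http_access deny deny_domains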

An alternative would be to use a dedicated filter such as SquidGuard or
DansGuardian for the content filtering.
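
If you go the SquidGuard route, the squid side of it is roughly a
two-line change in squid.conf (the binary and config paths here are
assumptions for a typical source install; squidGuard keeps its
blacklists in its own configuration):

# squid passes each URL to squidGuard, which rewrites or approves it
redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
redirect_children 5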