RE: [squid-users] Sporadic high CPU usage, no traffic

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Fri, 5 Nov 2004 00:35:58 +0100 (CET)

> Here's the proposed solution for site blocking (using url_regex):
> (www\.)?gamesondemand\.yahoo\.com/
> (www\.)?bpssoft\.com/powertools/library\.htm
> (www\.)?www2\.photeus\.com(:[0-9]+)?/~ewot/

The real solution is to avoid url_regex altogether, and when you do need
it, apply it to as few requests as possible.
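One way to limit how often a regex is evaluated is to pair it with a
dstdomain acl on the same http_access line; Squid ANDs all acls on one
line, so the regex only runs for requests to the matching domain. A
minimal squid.conf sketch (acl names here are made up for illustration):

```
# Only evaluate the path regex for requests to www2.photeus.com.
# All acls on one http_access line must match (logical AND).
acl photeus dstdomain www2.photeus.com
acl ewot_path url_regex -i /~ewot/
http_access deny photeus ewot_path
```

This keeps the expensive pattern match off the fast path for the vast
majority of requests.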

url_regex should be viewed as a "last resort" acl when none of the other
ACLs can be used.

In most cases dstdomain acls are what should be used.

url_regex scales very badly with the number of patterns and is a
performance killer.

dstdomain scales very well with the number of blocked sites and easily
copes with very large lists. Basically only the startup time depends on
the number of entries in a dstdomain list.
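For the domain-only parts of the list above, a dstdomain sketch could
look like this (acl name and file path are hypothetical):

```
# A leading dot matches the domain and all its subdomains,
# so .gamesondemand.yahoo.com also covers www.gamesondemand.yahoo.com.
acl blocked_sites dstdomain .gamesondemand.yahoo.com .bpssoft.com
# Large lists are better kept in an external file, one domain per line.
acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
http_access deny blocked_sites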

In terms of filtering Squid really lacks the "urllist" type of acl
available in SquidGuard. Squid only has equivalents of "domainlist"
(dstdomain) and "expressionlist" (url_regex) and a few minor variants
thereof.

Regards
Henrik
Received on Thu Nov 04 2004 - 16:36:01 MST
