RE: [SQU] Performance Problem

From: Chemolli Francesco (USI) <ChemolliF@dont-contact.us>
Date: Thu, 30 Nov 2000 10:47:32 +0100

> > My next step would be to crank up debugging (possibly start
> with 33 -
> > client_side or 28 - acl checking) and see where most of the time is
> > being spent.
>
> It turned out that the ACLs are the limiting factor. With our
> full list
> of url_regex's (80 regular expressions), squid uses 100% of CPU time.
> Reducing the regular expressions to 10 reduces CPU load to 30--40%. I
> already compiled Squid with --enable-gnuregex. Is there another way to
> speed up ACL checking with url_regex's? Why is it so processor
> intensive?

Regex engines are basically finite state machines, and quite complex.
Are all those regexes REALLY needed? You could, for instance, build
check trees: guard the expensive regex ACLs with cheaper ACL types, so
that most requests never reach the regex engine at all. That should
improve performance noticeably.
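To see why a long url_regex list is costly, here is a minimal Python sketch (an illustration only, not Squid code; the pattern list is made up): every URL that reaches the ACL is tried against each pattern in turn, so a miss pays for all 80 match attempts.

```python
import re

# Hypothetical stand-in for a long url_regex ACL: 80 compiled patterns.
# A URL that matches none of them (the common case) still costs 80 full
# regex match attempts -- the work is linear in the pattern count.
PATTERNS = [re.compile(r"^http://site%02d\.example/.*\.mpeg$" % i)
            for i in range(80)]

def acl_matches(url):
    """Return True if any pattern matches, trying each one in turn."""
    return any(p.search(url) for p in PATTERNS)
```

This is why trimming the list from 80 to 10 regexes cut the CPU load so sharply: the per-request cost scales with the number of patterns tried.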

Suppose you want to prevent users from retrieving MPEGs from the site
foo.com. Using url_regex alone, that would mean:

acl forbidden url_regex ^http://foo.com/.*\.mpeg$
http_access deny forbidden

If instead you do:
acl suspect_domain dstdomain .foo.com
acl forbidden url_regex ^http://foo.com/.*\.mpeg$
http_access deny suspect_domain forbidden

This way misses are handled very quickly: they fail to match
suspect_domain, so the regex engine is never invoked at all. Hits pay a
slight penalty for the extra ACL check, but if they are rare the overall
gain should be noticeable.
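The two-stage scheme above can be sketched in Python (an illustration of the idea, not Squid's actual ACL code; the function names are made up): a cheap domain-suffix test runs first, and the regex only fires for URLs on the suspect domain.

```python
import re
from urllib.parse import urlsplit

# Mirrors 'acl suspect_domain dstdomain .foo.com' -- a cheap string test.
def in_suspect_domain(url):
    host = urlsplit(url).hostname or ""
    return host == "foo.com" or host.endswith(".foo.com")

# Mirrors 'acl forbidden url_regex ...' -- the expensive check.
FORBIDDEN = re.compile(r"^http://foo\.com/.*\.mpeg$")

def denied(url):
    # Misses fail the domain test and never reach the regex engine;
    # only requests for foo.com pay for the regex match.
    return in_suspect_domain(url) and FORBIDDEN.search(url) is not None
```

Squid evaluates the ACLs on an http_access line left to right and stops at the first non-match, which is exactly the short-circuit behaviour the `and` models here.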

-- 
	/kinkie
--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html
Received on Thu Nov 30 2000 - 02:51:54 MST