RE: [squid-users] Squid slows down when a file with more than 25000 URLs to block is loaded

From: Chris Robertson <crobertson@dont-contact.us>
Date: Fri, 28 Jan 2005 14:34:49 -0900

> -----Original Message-----
> From: Pablo Romero [mailto:psromerozu@hotmail.com]
> Sent: Friday, January 28, 2005 2:20 PM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Squid slows down when a file with more than 25000
> URLs to block is loaded
>
>
> Hello
>
> I am running Squid 2.5 Stable 6 on a Pentium 4 1.8 GHz with 256 MB of RAM.
> I am trying to put this proxy into a production environment soon. Although
> Squid is performing all its tasks just fine, when I began the ACL tests it
> seemed to slow down (very much) when a blacklist file is loaded. The
> configuration is the following:
>
> acl denegar url_regex "/opt/squid/blacklist"
>
>
> This blacklist file has 25000 sites in it, and it seems to take Squid
> down. I don't know if you guys can provide some tips so I can tune my
> Squid proxy. Can you tell me whether I am using the wrong hardware
> configuration, whether I need more RAM, or whether I just have to change
> some settings in the squid.conf file?
>
>
> I'd appreciate your help.
>
> Regards,
>
> Pablo Romero

url_regex is a last resort, and very CPU intensive: every URL of every
request is matched against every regex in the list. Use dstdomain instead
wherever you can (i.e. instead of "url_regex site\.domain" use
"dstdomain .site.domain") and you'll find performance improves greatly.
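If the blacklist entries are simple escaped-hostname patterns (an assumption;
a file with fuller regexes would need manual review), they can be converted
mechanically into a dstdomain file and loaded with something like
acl denegar dstdomain "/opt/squid/blacklist_domains". A rough conversion
sketch (the file paths and function names below are hypothetical, not part
of Squid):

```python
# Hypothetical sketch: convert url_regex blacklist entries such as
# "site\.domain" into dstdomain entries such as ".site.domain".
# Assumes each line is a plain escaped-hostname pattern.

def regex_to_dstdomain(entry):
    """Turn one escaped-hostname regex entry into a dstdomain entry."""
    domain = entry.strip().replace("\\.", ".")
    if domain and not domain.startswith("."):
        # A leading dot makes dstdomain match the domain and all subdomains.
        domain = "." + domain
    return domain

def convert(src_path, dst_path):
    """Rewrite a url_regex blacklist file as a dstdomain file."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            converted = regex_to_dstdomain(line)
            if converted:
                dst.write(converted + "\n")
```

With dstdomain, Squid can look domains up in an internal tree instead of
running 25000 regex matches per request, which is where the speedup comes from.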

Chris
Received on Fri Jan 28 2005 - 16:36:13 MST

This archive was generated by hypermail pre-2.1.9 : Mon Mar 07 2005 - 12:59:36 MST