Re: [squid-users] Bad url sites

From: Henrik Nordstrom <henrik_at_henriknordstrom.net>
Date: Tue, 13 Oct 2009 12:54:30 +0200

Mon 2009-10-12 at 23:12 -0400, Ross Kovelman wrote:
> I use a file called bad_url.squid to list the sites I want blocked. I
> think I have reached a limit to what it can hold: when I do a reconfigure,
> it can take a few minutes for the data to be scanned, and processing power
> gets eaten up.

What kind of acl are you using?

I tested Squid-2 with dstdomain acls on the order of 100K entries some
time ago, and parsing took just a few seconds on my 4-year-old desktop.
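For reference, a minimal squid.conf sketch of that kind of setup, assuming
the blocklist lives at /etc/squid/bad_url.squid with one destination domain
per line (the path and acl name here are just placeholders):

    # bad_url.squid: one domain per line, e.g.
    #   .ads.example.com
    #   .badsite.example
    acl bad_sites dstdomain "/etc/squid/bad_url.squid"
    http_access deny bad_sites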

I ran the test again on the same box, this time with a list of 2.4M domains.
Total parsing time was 55 seconds with Squid-3 and 49 seconds with Squid-2.
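If you want to reproduce that kind of measurement, a rough sketch (the
filename and domain pattern are made up, and it assumes "squid -k parse"
on your version also loads the ACL files referenced from squid.conf):

    # generate a synthetic blocklist of 2.4M placeholder domains
    awk 'BEGIN { for (i = 0; i < 2400000; i++) printf ".host%d.example.com\n", i }' \
        > /etc/squid/bad_url.squid

    # time a full configuration parse, including reading the acl file
    time squid -k parse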

Regards
Henrik
