[squid-users] Bad url sites

From: Ross Kovelman <rkovelman_at_gruskingroup.com>
Date: Mon, 12 Oct 2009 23:12:02 -0400

I use a file called bad_url.squid to list the sites I want blocked. I
think I have reached a limit on what it can hold, because when I do a
reconfigure it can take a few minutes for the data to be scanned, and
processing power gets sucked up. I know there is DansGuardian and a few
other ways to maintain such a list, but can Squid handle a list like
that? I have about 19,000 sites to block.
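
For context, a minimal sketch of how a blocklist file like this is
typically wired into squid.conf; the path and the dstdomain ACL type
are assumptions here (dstdomain fits a file of plain hostnames, and
the actual file may be loaded differently):

    # Assumed: one domain per line, e.g. ".example.com"
    # (a leading dot also matches subdomains).
    acl bad_urls dstdomain "/etc/squid/bad_url.squid"

    # Deny requests to listed domains before any allow rules.
    http_access deny bad_urls

If the list is instead loaded as url_regex, every entry is a regular
expression checked one by one, which would likely explain the slow
reconfigure and the CPU load; dstdomain entries are parsed into a
search tree and scale to lists of this size much better.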

Thanks
