Re: [squid-users] Blacklist

From: Hendrik Voigtländer <hendrik@dont-contact.us>
Date: Sat, 14 Aug 2004 13:34:59 +0200

ilopez@jalisco.gob.mx wrote:
> Hi
> I'm using squid and squidGuard, but my boss wants to use Websense because he
> thinks that Websense has a better blacklist. I have to convince him that
> squidGuard can do it better, and I'm looking for a good site to get a better
> blacklist.
>
Hi,

I use squidguard to filter adult content only.
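For context, wiring a blacklist into squidGuard looks roughly like this (paths and the category name "adult" are just illustrative, not my actual config):

```
# Sketch of a squidGuard.conf fragment -- paths are made up.
dbhome /var/lib/squidguard/db

dest adult {
        domainlist adult/domains
        urllist    adult/urls
}

acl {
        default {
                pass !adult all
                redirect http://localhost/blocked.html
        }
}
```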

I am not really convinced by the squidGuard blacklists available, but
at least they are a good start. Some sites seem to register new domains
faster than they update their content :-)
In particular, domains under less common TLDs are missing from the lists.

What I did:
I combined all available lists together with the chastity package
available for Debian.
The next step was a personal blacklist robot; I got it from
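Merging the lists is simple enough to sketch; squidGuard just wants one domain per line, so it comes down to lowercasing and deduplicating. The directory layout and domain names below are invented for the example:

```shell
#!/bin/sh
# Sketch: merge several squidGuard "domains" files into one deduplicated list.
# All paths and domains here are made up for illustration.
mkdir -p lists/squidguard lists/chastity merged

# Stand-ins for the downloaded blacklists:
printf 'example-adult.test\nBADSITE.TEST\n' > lists/squidguard/domains
printf 'badsite.test\nanother.test\n'       > lists/chastity/domains

# Lowercase everything and drop duplicates; squidGuard expects
# one domain per line in its "domains" files.
cat lists/*/domains | tr 'A-Z' 'a-z' | sort -u > merged/domains

cat merged/domains
```

In real use you would point `cat` at the unpacked blacklist directories and then rebuild the squidGuard database from the merged file.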

http://sase.de/squid/

You have to feed the script with some p*rn meta-sites; the more, the
better. It has difficulties parsing anything other than direct
HTTP links, so take care that the site does not use an exit CGI (a lot of
them do).
The script delivers a lot of false positives and must be configured to
filter them, so take care.
I wrote a small script that does a double check by taking a quick look
with wget and parsing the result. Nothing sophisticated.
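The double check amounts to fetching the page and grepping it for keywords; few or no hits suggests a false positive. A minimal sketch (the keyword list and verdicts are made up, and a canned page stands in for the live `wget -q -O page.html "$url"` fetch so the example is self-contained):

```shell
#!/bin/sh
# Sketch of a keyword double check on a candidate blacklist entry.
# In real use the page would come from wget, e.g.:
#   wget -q -O page.html "$url"
# A canned page stands in here so the script runs without a network.
cat > page.html <<'EOF'
<html><title>Totally harmless cooking blog</title></html>
EOF

# Count case-insensitive keyword hits; the keyword list is illustrative.
# "|| true" keeps the script going when grep finds nothing (exit status 1).
hits=$(grep -c -i -E 'adult|xxx' page.html || true)

if [ "$hits" -eq 0 ]; then
        verdict="false positive"
else
        verdict="keep on blacklist"
fi
echo "$verdict"
```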

In fact, it doesn't matter how good a content filter is. There are always
ways to bypass it.

The easiest way to do this is Google, using either the cache or the translate
feature.
JAP is another good way: http://anon.inf.tu-dresden.de/index_en.html
It is nearly impossible to block. The only way is to lock down the client
machines to prevent that software from being used.

IMHO, a content filter can only be a warning to users that surfing a
certain type of site is forbidden by policy. The enforcement must be
done differently.

Regards, Hendrik Voigtländer
Received on Sat Aug 14 2004 - 05:35:18 MDT

This archive was generated by hypermail pre-2.1.9 : Wed Sep 01 2004 - 12:00:02 MDT