Web filtering

From: Jonathan Hall <jonhall@dont-contact.us>
Date: Sat, 3 Jul 1999 21:31:25 -0500 (CDT)

I'm looking for software that can filter web sites based on their
content, not just their URL. I've been looking at squirm and
squidGuard. I like squidGuard's ACL support and its ability to filter
sites by domain name, URL, or regex, but it only examines the URL,
never the actual content of the documents. Is there any software that
WILL look at the contents of a document and decide whether it should
be filtered out, either by matching a regular expression or by
counting occurrences of "tabooed" words against a maximum? Whether
it's a squid redirector or a stand-alone proxy server makes no
difference to me; I should be able to make either work in my setup.
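
To make it concrete, here's roughly the decision logic I have in mind,
as a quick Python sketch. The word list, threshold, pattern, and
function name are all made-up examples for illustration, not anything
taken from squid or squidGuard:

    import re

    # Illustrative values only -- a real deployment would load these
    # from a site-policy configuration file.
    TABOO_WORDS = ["badword1", "badword2"]
    MAX_TABOO_HITS = 5
    BLOCK_PATTERNS = [re.compile(r"forbidden\s+phrase", re.IGNORECASE)]

    def should_block(body):
        """Return true if a fetched document's content should be
        filtered out, independent of its URL."""
        # Rule 1: any blocking regex matching the body is enough.
        for pattern in BLOCK_PATTERNS:
            if pattern.search(body):
                return True
        # Rule 2: count total occurrences of "tabooed" words and
        # block once the count reaches the configured maximum.
        hits = 0
        for word in TABOO_WORDS:
            hits += len(re.findall(r"\b" + re.escape(word) + r"\b",
                                   body, re.IGNORECASE))
        return hits >= MAX_TABOO_HITS

    if __name__ == "__main__":
        print(should_block("an innocuous page"))    # False
        print(should_block("badword1 " * 10))       # True (word count)

A proxy or redirector applying this to fetched response bodies, rather
than to URLs, is exactly what I'm after.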

I've seen a package called ActiveGuardian that looks like it would do
what I want (it's not a squid redirector, however), but it doesn't
compile yet, and the author(s) seem to have more or less abandoned the
project. :-(

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  Jonathan Hall * jonhall@futureks.net * PGP public key available
 Systems Admin, Future Internet Services; Goessel, KS * (316) 367-2487
         http://www.futureks.net * PGP Key ID: FE 00 FD 51
         -= Running Debian GNU/Linux 2.0, kernel 2.0.36 =-
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
