RE: [squid-users] Restricting Porn sites

From: Henrik Nordstrom <hno@dont-contact.us>
Date: 14 Jan 2003 02:27:51 +0100

For a start, I would recommend investigating the dstdomain and
dstdom_regex acl types for more specific matches.
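For domain blocking, a plain dstdomain list is both faster and safer than url_regex. A minimal sketch of what that could look like (the acl names and file paths here are made up for illustration):

```
# Fixed-string domain matches; a leading dot also matches subdomains.
acl blocked_domains dstdomain "/etc/squid/blocked.domains"
# Regex matched against the destination host name only, not the whole URL.
acl blocked_patterns dstdom_regex -i "/etc/squid/blocked.patterns"
http_access deny blocked_domains
http_access deny blocked_patterns
```

where /etc/squid/blocked.domains would contain entries such as .example.com, one per line.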

Also, remember that any regex-type acl list uses regular expressions, not
words. Experimenting a little with egrep to familiarize yourself with
regex is recommended before writing Squid regex filter lists.
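That experiment can be as simple as piping candidate host names through egrep (all the domains below are invented for illustration):

```shell
# Feed some sample host names through a candidate pattern to see what
# it really matches (the names are made up).
printf '%s\n' www.essex.gov.uk sex.example.com www.example.com \
  | egrep -i 'sex'
# The bare pattern "sex" matches the first two lines, including the
# innocent Essex host.  Anchoring at a label boundary is stricter:
printf '%s\n' www.essex.gov.uk sex.example.com \
  | egrep -i '(^|\.)sex'
# Now only sex.example.com matches.
```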

If you use regex lists, keep them as short as possible. The regex
based acl types are fairly CPU intensive compared to the other, fixed
matches.

Do not use url_regex for matching domains unless you have a strong
reason to, as url_regex matches anywhere in the URL (including query
strings), and this can give quite surprising results from time to time
if the regex list is not very carefully crafted.
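As an illustration of why (the URL below is invented), here is a hostname-style pattern tried against a full URL, again using egrep:

```shell
# url_regex is tested against the entire URL, so a pattern aimed at
# host names can also fire on the path or query string.
echo 'http://www.example.com/search?q=sussex+hotels' | egrep -i 'sex'
# The pattern "sex" matches -- via "sussex" in the query string --
# even though the destination host is harmless.
```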

And if you use regex lists, make it a standard routine to review what
is being blocked by them, so that you catch errors early on.
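One way to do that review, as a sketch assuming Squid's native access.log format (field 4 is the result code, field 7 the URL), demonstrated on made-up sample lines rather than a real log:

```shell
# Summarise which URLs the deny rules are catching.  In practice you
# would read /var/log/squid/access.log instead of this sample data.
printf '%s\n' \
  '1042000000.000    5 10.0.0.1 TCP_DENIED/403 0 GET http://ads.example.com/x - NONE/- -' \
  '1042000001.000    5 10.0.0.1 TCP_MISS/200 2112 GET http://www.example.com/ - DIRECT/192.0.2.1 -' \
  | awk '$4 ~ /^TCP_DENIED/ {print $7}' | sort | uniq -c | sort -rn
```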

Regards
Henrik

Mon 2003-01-13 at 19:18, Robert Adkins wrote:
> Henrik,
>
> What would you recommend using in place of url_regex? I am using that
> along with another url_regex list that allows sites that contain
> "Pornographic" words in them, such as a site for Essex, England (which
> has 'sex' in it).
>
> Also, if you suggest installing another service onto the server, is that
> service more or less hardware intensive than using two url_regex files?
>
> I need to know as I am going to be adding a handful more services to the
> server that we are running and it would be nice to know some things that
> would help optimize this server.
>
> Thanks for the help!
>
> Regards,
> Robert Adkins II
> IT Manager/Buyer
> Impel Industries, Inc.
> Ph. 586-254-5800
> Fx. 586-254-5804
>
>
> -----Original Message-----
> From: Henrik Nordstrom [mailto:hno@squid-cache.org]
> Sent: Monday, January 13, 2003 6:35 PM
> To: mailinglistsquid-users@squid-cache.org; trainier@kalsec.com; Robert
> Adkins
> Cc: squid-users@squid-cache.org
> Subject: RE: [squid-users] Restricting Porn sites
>
>
>
> I don't think you really want url_regex here. Other than this it looks
> fine, assuming of course that you insert the http_access line before
> where you allow your users access.
>
> Regards
> Henrik
>
> Mon 2003-01-13 at 15:51, trainier@kalsec.com wrote:
> > I just used a regular expression parameter on an acl, then created a
> > banned.list file that stores URLs.
> > Here's the acl:
> >
> > acl banned_list url_regex -i "/etc/squid/banned.list"
> > http_access deny banned_list
> >
> > Then, of course, /etc/squid/banned.list is the file that has all the
> > URLs in it.
> > Note, I'm not using subdomains in the banned.list file, i.e.
> > http://www.google.com would be entered as google.com in the
> > banned.list file. Just good practice.
> >
> > Tim Rainier
> > UNIX/Linux Systems Administrator, Kalsec INC.
> > Web: http://www.kalsec.com
> > Email: trainier@kalsec.com
> >
> >
> >
> >
> > Brett Roberts <brett@centralmanclc.com>
> > 01/13/2003 09:28 AM
> >
> >
> > To: 'frank chibesakunda' <fchibesakunda@zesco.co.zm>,
> > squid-users@squid-cache.org
> > cc:
> > Subject: RE: [squid-users] Restricting Porn sites
> >
> >
> > Are you using a web filter eg Squidguard ?
> >
> > -----Original Message-----
> > From: frank chibesakunda [mailto:fchibesakunda@zesco.co.zm]
> > Sent: 13 January 2003 14:23
> > To: squid-users@squid-cache.org
> > Subject: [squid-users] Restricting Porn sites
> >
> >
> > hi
> >
> > How do I restrict porn sites in Squid? I tried it in Webmin twice, but
> > this kept corrupting my Squid setup, forcing me to rebuild it.
> >
> > F
> >
> >
>
>
Received on Mon Jan 13 2003 - 18:27:57 MST

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 17:12:39 MST