Re: [squid-users] [Newbie] Easy way to expressly limit destination URLs?

From: Shawn Wright <swright@sls.bc.ca>
Date: Mon, 29 Nov 2004 10:20:05 -0800

On 29 Nov 2004 at 9:36, Todd Krein wrote:

> [Sorry if this is a rehash... Didn't find an answer in the FAQ]
> Problem: I want to limit my children's web-surfing to only sites that I have pre-screened (e.g. PBSkids.org, Disney.org). All others should be blocked.
>
> I've run Linux systems long enough (>6 years) to know that the learning curve is steep for stuff as complex as Squid, so I want to ensure that it'll work before I invest [...]
> Can I easily set up a Squid proxy for my girls' computer so that, only for that one machine, only URLs that I've explicitly added to an ACL will work? Is there an easy way [...]
>
> [If you could respond to me personally as well as the mailing list, I'd appreciate it.]
> Thanks very much in advance....

Should be fairly easy to do by reversing the rules we use to deny various
lists of sites.

....
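# a dstdomain ACL reads one domain per line from the quoted file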
acl deny_essays dstdomain "c:/squid/etc/deny/essays.txt"
http_access deny deny_essays
http_access deny all
....

i.e., change the second line from deny to allow, then place the acceptable
domains in the text file, one per line (a leading dot matches the domain and
all of its subdomains). For the file format, here is a sample from one of our
deny lists; see the reversed sketch after this list:

.gradesaver.com
.classicnote.com
.123helpme.com
.4essays.com
.4freeessays.com
.4termpapers.com
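
Putting it together, here is a minimal sketch of the reversed, allow-only
setup. The file path, ACL names, and the address of the girls' machine below
are illustrative assumptions, not tested config:

....
# the one machine the whitelist applies to (address is an assumption)
acl kids_pc src 192.168.1.50
# pre-screened domains, one per line in the file
acl allowed_sites dstdomain "c:/squid/etc/allow/kids.txt"
# that machine may reach only the listed domains...
http_access allow kids_pc allowed_sites
# ...and everything else is blocked
http_access deny all
....

Squid evaluates http_access rules top to bottom and stops at the first match,
so the allow rule must come before the final deny all.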

This should work, but test it of course, as I might be missing something.
We run about a dozen rules like this, all denials in our case, and some of
the lists are quite long (>270,000 lines in one case).

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Shawn Wright, I.T. Manager
Shawnigan Lake School
http://www.sls.bc.ca
swright@sls.bc.ca