Re: [squid-users] Blocking mixed URLs

From: Christian Ricardo dos Santos <Christian@dont-contact.us>
Date: Thu, 30 Sep 2004 16:37:10 -0300

Thanks for your help...

But that way I would need to create one rule for each website permitted to each
group of users, instead of a single rule per group.

Currently we are using rules like the one below (two text files: one lists the
IP addresses and the other the websites). Right now we have 36 groups
using Squid (about 3000 users).

acl general url_regex -i "/etc/squid/data/txtgeneral.txt"
http_access allow txtlan general !download
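
For reference, the txtlan acl is built from the other text file in the same
way; its definition presumably looks something like this (the exact file name
is an assumption, only the acl name comes from the rule above):

acl txtlan src "/etc/squid/data/txtlan.txt"

where txtlan.txt lists one IP address or CIDR range per line.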

----- Original Message -----
From: "Andreas Pettersson" <andpet@telia.com>
To: "Christian Ricardo dos Santos" <Christian@telefutura.com.br>
Cc: "squid-users" <squid-users@squid-cache.org>
Sent: Thursday, September 30, 2004 4:20 PM
Subject: Re: [squid-users] Blocking mixed URLs

> Everybody can access the site www.telefutura.com.br, but nobody can
> access the website www.uol.com.br.
>
> Now if a user types one of the three strings below, access to the
> blocked website is granted (although only the page's text can be read;
> all the other links are broken).
>
> www.uol.com.br/telefutura.com.br
> www.uol.com.br/?telefutura.com.br
> www.uol.com.br/$telefutura.com.br
>
> What can I do to avoid this?

You must be more specific in the url_regex acl.

For example:
acl Good url_regex ^http://www\.telefutura\.com\.br
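
Note that url_regex is matched against the whole URL, including the
"http://" prefix, which is why the pattern must be anchored from the very
beginning. The same anchored form also works inside an acl file, one pattern
per line, so the single-rule-per-group layout can be kept; txtgeneral.txt
could then contain entries such as (the second line is purely illustrative):

^http://www\.telefutura\.com\.br
^http://www\.example\.com\.br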

An even better way is to use the dstdomain acl instead.
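
A rough sketch of the dstdomain variant, reusing the file-per-group layout
(the file path here is an assumption):

acl general dstdomain "/etc/squid/data/domainsgeneral.txt"
http_access allow txtlan general !download

where domainsgeneral.txt holds one domain per line, e.g.:

www.telefutura.com.br
.telefutura.com.br

A leading dot makes an entry match the domain and all of its subdomains.
Since dstdomain only ever looks at the host part of the URL, strings placed
in the path or query can never make it match.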

/Andreas
Received on Thu Sep 30 2004 - 14:07:40 MDT
