Re: [squid-users] New Admin

From: Ross Kovelman <rkovelman_at_gruskingroup.com>
Date: Thu, 08 Oct 2009 14:09:17 -0400

> From: Amos Jeffries <squid3_at_treenet.co.nz>
> Date: Thu, 08 Oct 2009 12:13:59 +1300
> To: <squid-users_at_squid-cache.org>
> Subject: Re: [squid-users] New Admin
>
> On Wed, 7 Oct 2009 09:51:13 -0700 (PDT), tookers <gareth_at_garethcoffey.com>
> wrote:
>> rkovelman wrote:
>>>
>>>
>>>> From: Henrik Nordstrom <henrik_at_henriknordstrom.net>
>>>> Date: Tue, 06 Oct 2009 23:29:02 +0200
>>>> To: Ross Kovelman <rkovelman_at_gruskingroup.com>
>>>> Cc: <squid-users_at_squid-cache.org>
>>>> Subject: Re: [squid-users] New Admin
>>>>
>>>> tis 2009-10-06 klockan 16:55 -0400 skrev Ross Kovelman:
>>>>
>>>>> This is what I have for http_access:
>>>>>
>>>>> http_access deny bad_url
>>>>> http_access deny all bad_url
>>>>> http_access deny manager
>>>>> http_access allow manager localhost
>>>>> http_access allow workdays
>>>>> http_access allow our_networks
>>>>>
>>>>>
>>>>> I would think bad_url would do the trick since I have acl bad_url
>>>>> dstdomain,
>>>>> correct?
>>>>
>>>> It should. At least assuming you have no other http_access rules above
>>>> this.
>>>>
>>>> But the rest of those rules look strange.
>>>>
>>>> I think you want something like:
>>>>
>>>> # Restrict cachemgr access
>>>> http_access allow manager localhost
>>>> http_access deny manager
>>>>
>>>> # Block access to banned URLs
>>>> http_access deny bad_url
>>>>
>>>> # Allow users access on workdays
>>>> http_access allow our_networks workdays
>>>>
>>>> # Deny everything else
>>>> http_access deny all
>>>>
>>>>
>>>> But I have no description of what effect workdays is supposed to have...
>>>>
>>>>
>>>> Regards
>>>> Henrik
>>>>
>>>>
>>>
>>>
>>> I made a few changes and still nothing:
>>>
>>> acl bad_url dstdomain "/xxx/xxxx/etc/bad-sites.squid"
>>> acl all src 0.0.0.0/0.0.0.0
>>> acl manager proto cache_object
>>> acl localhost src 127.0.0.1/255.255.255.255
>>> acl our_networks src 192.168.16.0/255.255.255.0
>>> acl to_localhost dst 127.0.0.0/8
>>> acl workdays time M T W H F 8:30-12:00 11:30-18:00
>>> acl SSL_ports port 443 563
>>> acl Safe_ports port 80 # http
>>> acl Safe_ports port 21 # ftp
>>> acl Safe_ports port 443 563 # https, snews
>>> acl Safe_ports port 70 # gopher
>>> acl Safe_ports port 210 # wais
>>> acl Safe_ports port 1025-65535 # unregistered ports
>>> acl Safe_ports port 280 # http-mgmt
>>> acl Safe_ports port 488 # gss-http
>>> acl Safe_ports port 591 # filemaker
>>> acl Safe_ports port 777 # multiling http
>>> acl CONNECT method CONNECT
>>>
>>> # Restrict cachemgr access
>>> http_access allow manager localhost
>>> http_access deny manager
>>>
>>> # Block access to banned URLs
>>> http_access deny bad_url workdays
>>>
>>> # Allow users access on workdays
>>> http_access allow our_networks workdays
>>>
>>> # Deny everything else
>>> http_access deny all
>>>
>>> I would think this would fulfill the request I just emailed to the group,
>>> but it doesn't.
>>>
>>>
>>>
>>> " Thanks, I made those changes although still no luck. I do save the
>>> changes
>>> and then run a ./squid -k reconfigure, not sure if I should run a
>>> different
>>> command.
>>>
>>> I do have this for work days:
>>> acl workdays time M T W H F 8:30-18:00
>>>
>>> If I can, I would like to deny those sites during "workdays" and then have
>>> them open before or after that time.
>>>
>>> Thanks"
>>>
>>>
>>>
>>
>> Hi There,
>>
>> Maybe try this....
>>
>> Change http_access deny bad_url workdays
>> To... http_access deny our_networks bad_url workdays
>>
>> It should then match when requests come from our_networks, and if the other
>> 2 acls also match you should get 'Access Denied'.
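>>
>> i.e. the relevant part of your http_access list would then read (untested):
>>
>> http_access allow manager localhost
>> http_access deny manager
>> http_access deny our_networks bad_url workdays
>> http_access allow our_networks workdays
>> http_access deny all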
>>
>> Thanks,
>> Tookers
>
> The workdays ACL definition is wrong. It will only block on Mondays between
> midnight and one second after.
>
>
>>> acl workdays time M T W H F 8:30-12:00 11:30-18:00
>
> There should be only one time range per acl line and no spaces in the day spec:
>
> acl workdays time MTWHF 8:30-12:00
> acl workdays time MTWHF 11:30-18:00
>
> Or maybe use two ACLs: one for the morning, one for the afternoon.
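>
> For example (untested; the workday_am / workday_pm names are just placeholders):
>
> acl workday_am time MTWHF 8:30-12:00
> acl workday_pm time MTWHF 11:30-18:00
>
> # then reference both wherever 'workdays' was used, e.g.:
> http_access deny bad_url workday_am
> http_access deny bad_url workday_pm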
>
> Amos
>

Thanks for all the help and for noticing that time issue. I am now left with 5
issues and then I should be done:

1) Apparently Squid thinks it's later in the day than it is. I have a time
server, can I sync it with that?
2) Even once #1 is fixed, I am not sure the time ranges I have set are working
and sites are actually being blocked during them.
3) For the sites that are blocked, could I set up a way for certain users to
access them if they know a password? e.g. victoriasecret.com is not allowed,
but let's say a manager wanted to see it and was given permission. Could a
password prompt come up that he can enter to gain access? (A rough sketch of
what I had in mind is below.)
4) Most users here use SharePoint, and I can't seem to get IE to show the
authentication prompt. Any reason why?
5) Instead of physically touching each computer, is there an easy way to have
DNS or something else send the traffic through Squid before it gets routed
out?
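
For #3, here is roughly the kind of thing I had in mind, pieced together from
the basic auth examples I have seen (untested; the ncsa_auth path, the passwd
file and the override_users name are just placeholders for whatever I would
actually use):

auth_param basic program /usr/local/squid/libexec/ncsa_auth /usr/local/squid/etc/passwd
auth_param basic realm Squid proxy
acl override_users proxy_auth REQUIRED

# placed above the existing deny, so authenticated users get past the ban
http_access allow bad_url override_users
http_access deny bad_url workdays

Would that be the right approach, or is there a better way to prompt only on
the banned sites?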

Thanks

Received on Thu Oct 08 2009 - 18:09:47 MDT
