Re: [squid-users] limiting connections

From: Carlos Manuel Trepeu Pupo <charlie.mtp_at_gmail.com>
Date: Thu, 29 Mar 2012 16:22:11 -0400

On Thu, Mar 29, 2012 at 4:03 PM, Eliezer Croitoru <eliezer_at_ngtech.co.il> wrote:
> On 29/03/2012 21:05, Carlos Manuel Trepeu Pupo wrote:
>>
>>> On Tue, Mar 27, 2012 at 1:23 PM, Eliezer Croitoru <eliezer_at_ngtech.co.il> wrote:
>>>
>>> On 27/03/2012 17:27, Carlos Manuel Trepeu Pupo wrote:
>>>>
>>>>
>>>> On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffries <squid3_at_treenet.co.nz> wrote:
>>>>>
>>>>>
>>>>> On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:
>>>>>>
>>>>>> On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries <squid3_at_treenet.co.nz> wrote:
>>>>>>>
>>>>>>> On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:
>>>>>>>
>>>>>>>> On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:
>>>>>>>>>
>>>>>>>>> On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:
>>>>>>>>>>
>>>>>>>>>> I need to limit each user to just one connection when downloading
>>>>>>>>>> files with specific extensions, but I don't know how to say that a
>>>>>>>>>> user can make just one connection to each such file, and not just
>>>>>>>>>> one connection in total to every file with these extensions.
>>>>>>>>>>
>>>>>>>>>> i.e:
>>>>>>>>>> www.google.com  # all the connections that are required
>>>>>>>>>> www.any.domain.com/my_file.rar  # just one connection to that file
>>>>>>>>>> www.other.domain.net/other_file.iso  # just one connection to this file
>>>>>>>>>> www.other_domain1.com/other_file1.rar  # just one connection to that file
>>>>>>>>>>
>>>>>>>>>> I hope you understand me and can help me, I have my boss hurrying me !!!
>>>>>>>>>
>>>>>>>>> There is no easy way to test this in Squid.
>>>>>>>>>
>>>>>>>>> You need an external_acl_type helper which gets given the URI and
>>>>>>>>> decides whether it is permitted or not. That decision can be made
>>>>>>>>> by querying the Squid cache manager for the list of active_requests
>>>>>>>>> and seeing if the URL appears more than once.
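>>>>>>>>>
>>>>>>>>> For example, one way to do that check from the shell (the URL here
>>>>>>>>> is only an example; squidclient talks to localhost:3128 by default):
>>>>>>>>>
>>>>>>>>> squidclient mgr:active_requests | grep -c "http://www.any.domain.com/my_file.rar"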
>>>>>>>>
>>>>>>>> Hello Amos, following your instructions I made this external_acl_type
>>>>>>>> helper:
>>>>>>>>
>>>>>>>> #!/bin/bash
>>>>>>>> result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
>>>>>>>> if [ $result -eq 0 ]
>>>>>>>> then
>>>>>>>>     echo 'OK'
>>>>>>>> else
>>>>>>>>     echo 'ERR'
>>>>>>>> fi
>>>>>>>>
>>>>>>>> # If the same URI is already there, I deny it. I made a few tests and
>>>>>>>> it works for me. The problem is when I add the rule to squid. I did
>>>>>>>> this:
>>>>>>>>
>>>>>>>> acl extensions url_regex "/etc/squid3/extensions"
>>>>>>>> external_acl_type one_conn %URI /home/carlos/script
>>>>>>>> acl limit external one_conn
>>>>>>>>
>>>>>>>> # where extensions have:
>>>>>>>>
>>>>>>>> \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$
>>>>>>>>
>>>>>>>> http_access deny extensions limit
>>>>>>>>
>>>>>>>>
>>>>>>>> So when I run "squid3 -k reconfigure", squid stops working.
>>>>>>>>
>>>>>>>> What can be happening ???
>>>>>>>
>>>>>>> * The helper needs to be running in a constant loop.
>>>>>>> You can find an example at
>>>>>>> http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
>>>>>>> although that is a re-writer, and you do need to keep the OK/ERR
>>>>>>> responses for an external ACL.
>>>>>>
>>>>>> Sorry, this is my first helper, so I do not understand what running in
>>>>>> a constant loop means; in the example I see something like what I do.
>>>>>> Making some tests I found that without this line:
>>>>>> result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
>>>>>> the helper does not crash. It does not work either, but it does not
>>>>>> crash, so I consider this line to be the problem in some way.
>>>>>
>>>>> Squid starts helpers, then uses the STDIN channel to pass each one a
>>>>> series of requests, reading the STDOUT channel for the results. Once
>>>>> started, the helper is expected to continue until an EOF/close/terminate
>>>>> signal is received on its STDIN.
>>>>>
>>>>> Your helper is exiting after only one request, without being asked to by
>>>>> Squid. That is logged by Squid as a "crash".
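>>>>>
>>>>> As an illustration only, the minimal persistent shape is something like
>>>>> this (a sketch, not your final logic):
>>>>>
>>>>> #!/bin/bash
>>>>> # keep reading requests from Squid on STDIN until EOF,
>>>>> # writing one answer per request to STDOUT
>>>>> while read url; do
>>>>>     # ... decide based on "$url" here ...
>>>>>     echo 'OK'    # or 'ERR'
>>>>> done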
>>>>>>>
>>>>>>> * "eq 0" - there should always be 1 request matching the URL. Which
>>>>>>> is
>>>>>>> the
>>>>>>> request you are testing to see if its>1 or not. You are wanting to
>>>>>>> deny
>>>>>>> for
>>>>>>> the case where there are *2* requests in existence.
>>>>>>
>>>>>> This is true, but the way I saw it was: "if the URL does not exist, it
>>>>>> can't be duplicated". I don't think that is wrong !!
>>>>>
>>>>> It can't not exist. Squid is already servicing the request you are
>>>>> testing about.
>>>>>
>>>>> Like this:
>>>>>
>>>>>  receive HTTP request      (count=1)
>>>>>  - test ACL (count=1 -> OK)
>>>>>  - done (count=0)
>>>>>
>>>>>  receive HTTP request a    (count=1)
>>>>>  - test ACL (count=1 -> OK)
>>>>>  receive HTTP request b    (count=2)
>>>>>  - test ACL (count=2 -> ERR)
>>>>>  - reject b (count=1)
>>>>>  - done a (count=0)
>>>>
>>>> With your explanation and code from Eliezer Croitoru I made this:
>>>>
>>>> #!/bin/bash
>>>>
>>>> while read line; do
>>>>     result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`
>>>>
>>>>     # debug: save the $URI passed to the helper to a file
>>>>     echo "$line" >> /home/carlos/guarda
>>>>
>>>>     # with your great explanation, I changed the test to 1
>>>>     if [ $result -eq 1 ]
>>>>     then
>>>>         echo 'OK'
>>>>     else
>>>>         echo 'ERR'
>>>>     fi
>>>> done
>>>>
>>>> It looks like it's going to work, but here are the remaining problems:
>>>> 1- The echo "$line" >> /home/carlos/guarda does not save anything to
>>>> the file.
>>>> 2- When I return 'OK', I can't write the rule in my .conf like I wrote
>>>> before; I have to write something like "http_access deny extensions
>>>> !limit". From all the help you guys have given me, I learned that the
>>>> name "limit" here is not really descriptive; the deny on "!limit" is
>>>> because when there is just one connection I can't block the page.
>>>> 3- With the script exactly as Eliezer typed it, the page with the
>>>> download URL stays loading infinitely.
>>>>
>>>> So, there is less work left - can you help me ??
>>>>
>>>
>>> 1. The first thing is that "squidclient -h 192.168.19.19 mgr:active_requests"
>>> can take a while in some cases.
>>> The first time I tried to run the command it took a couple of minutes
>>> for squid to send the list (1 connection).
>>> So your hanging page is probably because of this issue.
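>>>
>>> If that is the cause, one way to protect the helper could be GNU
>>> coreutils' timeout (untested here, and the 5 seconds is an arbitrary pick):
>>>
>>> # give squidclient at most 5 seconds; an empty result then counts as 0
>>> result=`timeout 5 squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`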
>>>
>>> 2. Why are you writing to a file? If it's for debugging, OK.
>>> What you need to do is use "echo $?" to get the lookup answer from the
>>> grep first.
>>> So, in pseudo-code:
>>> -----------
>>> read the uri line (just notice that there is a possibility of the same
>>> uri on two different hosts).
>>> request from squid the list of active requests and see if any of the
>>> downloads in the output matches the uri from the line read before.
>>> in case the uri exists, the output of "echo $?" (the exit code) will be 0.
>>>
>>> if it gives 1, echo OK
>>> if it gives 0, echo ERR
>>>
>>> end
>>> goto read uri...
>>
>>
>> That's correct, I write to the file for debugging, and I'm already using
>> this option, thanks.
>>
>>>
>>> -----------
>>> The reason you can't add info to the file is because the file is owned
>>> by a different user than the one executing the script for squid.
>>> So change the file permissions to 666, or change the owner and group, I
>>> think, to squid's unprivileged user.
>>> The whole thing is a simple while loop with a nice if (echo $? == 1).
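>>>
>>> In practice that means something like this (the squid user name varies
>>> by distro, on Debian it is usually "proxy"):
>>>
>>> chmod 666 /home/carlos/guarda
>>> # or, instead, hand the file to squid's unprivileged user:
>>> chown proxy:proxy /home/carlos/guarda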
>>
>>
>> Great !! That was the error !!! Just a simple permissions problem !!!
>> When these things happen to me ... grrrrrrrr
>>
>>>
>>>
>>> #!/bin/bash
>>>
>>> while read line; do
>>>     # log the $URI passed to the helper; the echo goes to the
>>>     # background in case of slow disc access (don't really know
>>>     # how much it will improve)
>>>     echo $line >> /home/carlos/guarda &
>>>
>>>     result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"|echo $?`
>>>
>>>     if [ $result == 1 ]
>>>     then
>>>         echo 'OK'
>>>         echo 'OK' >> /home/carlos/guarda &
>>>     else
>>>         echo 'ERR'
>>>         echo 'ERR' >> /home/carlos/guarda &
>>>     fi
>>> done
>>
>>
>> Every page that I try to download gives an error, and using deny_info I
>> can see that the ACL denying me is the one related to this external ACL.
>> What could this be ?? I have made a lot of tests since Tuesday and I
>> don't know what's happening !!! Another thing is this line:
>> result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c
>> "$line"|echo $?`
>> Why are you using this "|echo $?" at the end of the line, and what does
>> it do ? I know that $? returns the exit code of the last executed
>> command, but what do you need it for here ???
>>
> Instead of storing the grep result in memory, I'm storing only the exit
> code of the last command, which is the grep.
> If the grep exits having found no line with the same URI, it will give 1
> as the result.
> If it found something in the list, it will return 0.
> And that is the reason for the OK or ERR in the if condition.
> I will give you a good example that I used in a startup script.
> In order to see if a process is already running, I would use:
> ps aux |grep -v "grep"|grep -i "some process name"
> So if, let's say, I have a running process named whole_life.sh,
> I will get an exit code of "0" and I can tell that it's running, so I
> don't need to start it again (or I need to kill it).
> So it's very useful to use the exit codes.
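>
> As a small illustration, the startup check could look like this
> (whole_life.sh and its path are just the made-up names from above):
>
> #!/bin/bash
> # grep -q stays quiet; we only care about the exit code
> if ps aux | grep -v "grep" | grep -qi "whole_life.sh"; then
>     echo "already running, nothing to do"
> else
>     /path/to/whole_life.sh &
> fi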
>
> I will later try the external_acl stuff on a live system to make sure
> the whole thing is good.
>
> Regards,
> Eliezer

Thanks a lot, I will be waiting for your test !!! And thanks again for
the explanation of the command. You are right, it's better to store in
memory just one byte that contains the result code !!!

>
>>>
>>> About the ACLs, you can use the following:
>>>
>>> http_access deny external_helper_acl
>>>
>>> deny_info http://block_url_site_page/block.html external_helper_acl
>>>
>>> http_access allow localhost manager
>>> http_access allow localhost
>>> ...
>>>
>>> this will do the trick for you.
>>
>>
>> Thanks, I'm already using this.
>>
>>> Unless... squidclient is stuck with the output.
>>> Also, the echo statements that write to the file give error output
>>> that can cause trouble there.
>>>
>>> By the way, this external ACL can limit the number of concurrent
>>> connections to more than just 1 with some wc -l stuff.
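>>>
>>> Something like this, for example (an untested sketch, MAX is an
>>> arbitrary limit):
>>>
>>> #!/bin/bash
>>> # allow up to MAX concurrent requests for the same URI
>>> MAX=2
>>> while read uri; do
>>>     count=`squidclient -h 192.168.19.19 mgr:active_requests | grep "$uri" | wc -l`
>>>     # the request being tested is already in the list,
>>>     # so anything up to MAX is still fine
>>>     if [ $count -le $MAX ]; then
>>>         echo 'OK'
>>>     else
>>>         echo 'ERR'
>>>     fi
>>> done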
>>>
>>> Regards,
>>> Eliezer
>>
>>
>> I thought of that once already: pass another parameter with the number
>> of concurrent active connections to allow, to make it more flexible.
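>>
>> Maybe like this (just an idea, not tested; as far as I know extra words
>> on the external_acl_type line are passed to the helper as command-line
>> arguments, so the script could read the limit from $1):
>>
>> external_acl_type multi_conn %URI /home/carlos/script 3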
>>
>>>>>>>
>>>>>>> * ensure you have manager requests from localhost not going through
>>>>>>> the ACL test.
>>>>>>
>>>>>> I was making this wrong, localhost was going through the ACL, but I
>>>>>> just changed it !!! The problem persists. What can I do ???
>>>>>
>>>>> Which problem?
>>>>>
>>>>>
>>>>> Amos
>>>
>>> --
>>> Eliezer Croitoru
>>> https://www1.ngtech.co.il
>>> IT consulting for Nonprofit organizations
>>> eliezer <at> ngtech.co.il
>
> --
> Eliezer Croitoru
> https://www1.ngtech.co.il
> IT consulting for Nonprofit organizations
> eliezer <at> ngtech.co.il
Received on Thu Mar 29 2012 - 20:22:19 MDT

This archive was generated by hypermail 2.2.0 : Fri Mar 30 2012 - 12:00:04 MDT