Re: [squid-users] Squid stops handling requests after 30-35 requests

From: Eliezer Croitoru <eliezer_at_ngtech.co.il>
Date: Wed, 20 Nov 2013 14:41:54 +0200

Hey,

Can you try another test?
wget is fine for this, but there are a couple of options that need to be
considered.
If you are not already using it, add --delete-after
to the wget command line.

It's not related to Squid, but it helps a lot.
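For reference, the repeated-fetch test could be scripted along these lines. This is only a sketch: the proxy address squid_ip:3128, the iteration count, and the hammer helper name are assumptions, not details from the thread.

```shell
#!/bin/sh
# Fetch one URL repeatedly through the proxy and report failures.
# squid_ip:3128 is a placeholder for the actual proxy address and port.
hammer() {
    url=$1
    count=$2
    i=1
    failures=0
    while [ "$i" -le "$count" ]; do
        # --delete-after discards each downloaded body, so there is no
        # leftover index.html to rm between iterations.
        if ! wget -q --tries=1 --timeout=2 --delete-after \
                -e use_proxy=yes -e http_proxy=http://squid_ip:3128/ \
                "$url"; then
            echo "request $i failed at $(date '+%H:%M:%S')"
            failures=$((failures + 1))
        fi
        i=$((i + 1))
    done
    echo "$failures of $count requests failed"
}

# The original test looped 500 times; a short run looks like:
hammer "http://www.naukri.com/" 3
```

Logging the timestamp of each failure makes it easier to line the stalls up against entries in access.log.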
Now, if you are up to it, I would be happy to see the machine specs and OS.
Also, what is the output of "squid -v"?

Can you ping the machine at the time it gets stuck? What about a TCP ping,
or "nc -v squid_ip port"?
We also need to verify from the access logs that it's not naukri.com
thinking your client is trying to turn it into a DDoS target.
What about trying to access other resources?
What is written in this 503 response page?
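The liveness checks above could look like the following sketch; squid_ip is a placeholder for the proxy's address, and 3128 is only assumed here because it is Squid's default port.

```shell
#!/bin/sh
# Probe a possibly-hung Squid box: ICMP reachability first, then a TCP
# connect to the proxy port. Arguments: host port.
probe() {
    host=$1
    port=$2

    if ping -c 1 -W 2 "$host" >/dev/null 2>&1; then
        echo "ICMP: $host answers"
    else
        echo "ICMP: $host does not reply"
    fi

    # nc -z only tests the connect (no data sent); -w 2 bounds the wait.
    if nc -z -w 2 "$host" "$port" 2>/dev/null; then
        echo "TCP: $host:$port accepts connections"
    else
        echo "TCP: $host:$port not reachable"
    fi
}

probe squid_ip 3128
```

If ICMP answers but the TCP connect to the proxy port hangs or is refused, the box itself is up while Squid is wedged; that distinction is the point of running both checks.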

Eliezer

On 20/11/13 12:35, Bhagwat Yadav wrote:
> Hi,
>
> I enabled logging but didn't find any conclusive or decisive log
> entries that I could forward to you.
>
> In my testing, I am accessing same URL 500 times in a loop from the
> client using wget.
> Squid hung sometimes after 120 requests, sometimes after 150, as:
>
> rm: cannot remove `index.html': No such file or directory
> --2013-11-20 03:52:37--  http://www.naukri.com/
> Resolving www.naukri.com... 23.72.136.235, 23.72.136.216
> Connecting to www.naukri.com|23.72.136.235|:80... connected.
> HTTP request sent, awaiting response... 503 Service Unavailable
> 2013-11-20 03:53:39 ERROR 503: Service Unavailable.
>
>
> Whenever it hangs, it resumes after about a minute; e.g. in the example
> above, the request sent at 03:52:37 got its response at 03:53:39.
>
> Please provide more help.
>
> Many Thanks,
> Bhagwat
Received on Wed Nov 20 2013 - 12:42:07 MST

This archive was generated by hypermail 2.2.0 : Thu Nov 21 2013 - 12:00:06 MST