Re: [squid-users] downloads fail after a while in 3.1.0.14

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Thu, 03 Dec 2009 14:09:54 +1300

On Wed, 2 Dec 2009 15:10:44 -0800 (PST), Landy Landy
<landysaccount_at_yahoo.com> wrote:
> Right now I'm trying to download a file from Intel that's 432MB and it
> has failed 5 times already. It started and failed at 110MB, later started
> it again and failed at 19MB, and so on. The best it has done is 110MB.
>
> I don't know what's going on. This time squid won't cache this one since
> I have maximum_object_size 256 MB.

I expect this explains the second failure at 19MB. If Squid were able to
cache the object, I think it would get past 110MB before failing again,
since a retry could be served the already-fetched portion from cache
instead of starting from zero.

I suspect one of the timeouts (request or read) is set too low for your
bandwidth.

That, and/or the quick_abort settings, may be interacting to cause this.

It could also be the old enemies of broken ECN and TCP window sizing.
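
For reference, these are roughly the squid.conf directives I have in mind.
The numbers shown are just the stock defaults plus the 256 MB limit you
mentioned, not recommendations for your link:

 # timeouts that can cut off a slow transfer part-way
 read_timeout 15 minutes
 request_timeout 5 minutes

 # how much of a client-aborted download Squid keeps fetching
 quick_abort_min 16 KB
 quick_abort_max 16 KB
 quick_abort_pct 95

 # largest reply Squid will store in cache
 maximum_object_size 256 MB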

>
> I would increase this just for the moment to 512MB to test if it
> downloads it.
>

Are you able to capture the traffic between the client browser and Squid?

Running this on the squid box should log it to a file "$CLIENTIP.log" for
detailed analysis:
 tcpdump -i $ETH -s 0 -w $CLIENTIP.log host $CLIENTIP

NP: substitute $CLIENTIP with the client's IP address and $ETH with the
name of the interface Squid talks to the client through (e.g. eth0, eth1,
etc).
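
For example, assuming (purely for illustration) the client is at
192.168.1.50 and Squid reaches it over eth0, the command would look like:

 tcpdump -i eth0 -s 0 -w 192.168.1.50.log host 192.168.1.50

The output file is a binary pcap capture, so read it back with
"tcpdump -r 192.168.1.50.log" or open it in Wireshark rather than a text
editor.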

Amos