Re: Large HTTP downloads

From: Dancer <dancer@dont-contact.us>
Date: Mon, 12 Jan 1998 13:11:00 +1000

A prime suspect would be (for me, off the top of my pointy little head) the
cache box running short of memory.
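
A quick way to test that theory (just a rough sketch of mine, nothing
Squid-specific, and it assumes the cache box is a Linux host with the usual
/proc/meminfo layout) is something like:

#!/usr/bin/env python3
"""Rough check for a memory squeeze on the cache box.

Assumes a Linux /proc/meminfo in the "Key:  value kB" form; adjust the
field names if your kernel reports them differently.
"""

def meminfo_kb():
    """Return /proc/meminfo as a dict of field name -> kilobytes."""
    fields = {}
    with open("/proc/meminfo") as fh:
        for line in fh:
            key, _, rest = line.partition(":")
            parts = rest.split()
            if parts:
                fields[key.strip()] = int(parts[0])
    return fields

if __name__ == "__main__":
    info = meminfo_kb()
    free = info.get("MemFree", 0) + info.get("Buffers", 0) + info.get("Cached", 0)
    swap_used = info.get("SwapTotal", 0) - info.get("SwapFree", 0)
    print("roughly free memory: %d kB" % free)
    print("swap in use:         %d kB" % swap_used)
    if swap_used > 0:
        print("box is dipping into swap; Squid will crawl if that grows")

If swap use climbs while one of those big fetches is running, that's your
answer.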

D

Martin Brown wrote:

> I have just got our squiddly-diddly up and running and have a problem
> that I can't see how to fix. Everything's fine except for large file
> downloads (>1M or so) using HTTP. They gradually get slower, then time
> out somewhere between 500k and 2M. Going around the caching Squid proxy
> and straight to the firewall, the files come in fine, so it's definitely
> something to do with the Squid proxy.
>
> Any ideas?
>
> --
> ^~^~^~^~^~^~^^~^~^~^~^~^~^~^~^~^~^~^~~^~^~^~^~^~^~^~^~^~^~^~^~^~^~^~^~^~^~^~
> Martin Brown, UNIX Systems and Network Administrator
> ADI Ltd, Systems Group, Private Bag 2, Nth Ryde 2113, Australia
> Phone: +61 2 9325 1645 Fax: 9325 1600 Email: martin.brown@sg.adisys.com.au
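
To put numbers on the two paths Martin describes (straight to the firewall
vs. through the cache), a throwaway timing script helps. This is only a
sketch: the proxy address and test URL are placeholders, and it assumes a
client box with Python on it.

#!/usr/bin/env python3
"""Time the same large fetch with and without the proxy.

The proxy address and URL below are placeholders; substitute your own
cache box and a file big enough to show the slowdown (>1M).
"""
import time
import urllib.request

PROXY = "http://proxy.example.com:3128"         # placeholder cache box
URL = "http://www.example.com/big-file.tar.gz"  # placeholder large file

def timed_fetch(url, proxy=None):
    """Fetch url, optionally via an HTTP proxy, and report bytes/sec."""
    if proxy:
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy}))
    else:
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({}))    # force a direct connection
    start = time.time()
    total = 0
    with opener.open(url, timeout=300) as resp:
        while True:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            total += len(chunk)
    elapsed = max(time.time() - start, 1e-6)
    print("%-8s %d bytes in %.1fs (%.1f kB/s)"
          % ("proxy" if proxy else "direct", total, elapsed,
             total / 1024 / elapsed))

if __name__ == "__main__":
    timed_fetch(URL)          # direct, around the cache
    timed_fetch(URL, PROXY)   # through the Squid box

If the through-the-cache run starts fast and tails off, watch memory and
swap on the cache box while it runs.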

--
Note to evil sorcerers and mad scientists: don't ever, ever summon powerful
demons or rip holes in the fabric of space and time. It's never a good idea.
ICQ UIN: 3225440
