Squid and limiting bandwidth...

From: Jeff Barrow <jeffb@dont-contact.us>
Date: Wed, 7 Oct 1998 22:36:10 -0500

I'm looking for a way to limit how fast a single request retrieves data from
the remote host.

Currently, if one of our users requests a 10 meg (or larger) file from a web
site with very good connectivity, it eats up our entire T1 while
downloading that file... is there a good way of slowing it down? (Their
modem can't download that fast anyway...)

When this maxes out our T1, all other requests slow down... (Granted, it only
maxes it out for an hour at most, but it's done this to us several times...)

I'm looking for something so that when a request is larger than, say, one
meg, it slows down so that other requests get a chance to go through...
maybe down to 200 kbit/second per request... (preferably configurable)
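
Something like Squid 2's delay pools feature looks close to what I want.
Here's a minimal squid.conf sketch, assuming a build configured with
--enable-delay-pools; note that the limit is enforced per client IP address
rather than strictly per request, and the numbers are just examples:

  delay_pools 1               # define one delay pool
  delay_class 1 2             # pool 1 is class 2: aggregate + per-client buckets
  delay_access 1 allow all    # send every request through pool 1
  # aggregate bucket unlimited (-1/-1); each client bucket refills at
  # 25000 bytes/s (~200 kbit/s) and holds up to 1048576 bytes, so roughly
  # the first megabyte of a download comes through at full speed
  delay_parameters 1 -1/-1 25000/1048576

The restore/max pairs are in bytes: the 1 MB bucket gives the one-meg grace
period above, and the 25000 bytes/s refill rate works out to about 200
kbit/second once the bucket is drained.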

We're currently running Squid-1.2.beta24 + first patch on a P200 with 256 MB
of SDRAM, a 12 GB RAID-0 striped Ultra-DMA IDE disk for the cache dir, and
Linux 2.0.35... and have had NO coredumps so far, or any other problems.

Over 1.5 GB a day in requests... hit rate around 50%, saving around 25% of
HTML bandwidth.

(we're upgrading to Squid-2 soon...)