Multiple squids (was Cookie filtering)

From: Graham Toal <gtoal@dont-contact.us>
Date: Tue, 15 Apr 1997 11:07:10 -0500 (CDT)

> User proxies to junkbuster at 8080, junkbuster forwards to squid (on the
> same machine) at 3128 ...
> Restriction: junkbuster can only handle HTTP requests; for FTP and other
> protocols, users have to connect directly to squid at 3128.
>
> Look at
> http://internet.junkbuster.com/ijbfaq.html

I'd read the FAQ to answer these questions, but it's offline...

1) Does it slow access down significantly? Squid goes to great lengths
to avoid forking; if this code doesn't, it will slow things down as
badly as not running squid in the first place, no?

2) Does it clean up its processes properly or leave lots of dead ones?

Before getting squid, we ran one CERN proxy for everyone and a second
CERN proxy for the schools, which used rewrite rules to filter out a few
of the better-known porn sites. That proxy was used as a filter only
and didn't save anything to disk. Now that we're running squid, I'm
still using the CERN proxy for the schools because I don't know how
to do that with squid, at least not without running a second copy,
which I assume would use another 30MB of RAM that I can't afford.
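(For what it's worth, here's a minimal sketch of how I imagine that filtering
could be done inside a single squid.conf using the acl/http_access directives;
the network range and blocked domain below are made-up placeholders, and I
haven't tried this myself:)

    # hypothetical school network and banned domain, for illustration only
    acl schools src 192.168.10.0/255.255.255.0
    acl banned dstdomain .example-banned-site.com
    http_access deny schools banned
    http_access allow all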

What *is* the correct way to do multiple squids? Or should I look
at something like the code above and modify it as a front-end filter
to squid, because it's more lightweight code?
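(If chaining a second, lightweight squid in front of the main cache is the
answer, I imagine the front-end's config would look roughly like this; the
directive names are from squid.conf, though newer releases call it cache_peer
where Squid 1.x used cache_host, so treat this as a rough, untested sketch:)

    http_port 8081
    cache_peer 127.0.0.1 parent 3128 0 no-query default
    never_direct allow all    # never fetch directly; always hand off to the main squid
    # ...plus the same acl/http_access filtering lines as in the sketch above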

G
Received on Tue Apr 15 1997 - 09:22:47 MDT
