Re: HTTP accelerator architecture for failover

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Fri, 23 Jun 2000 02:46:54 +0200

John Econopouly wrote:

> 1. If one of my web servers goes down, can squid be configured to
> automatically reroute traffic away from that server, or would I need to
> restart squid with a new configuration?

Yes. If Squid is forwarding to a hostname whose DNS record contains
multiple IP addresses, it will automatically bypass dead servers. To
speed up the detection, set connect_timeout to a reasonably low value.
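As a rough sketch, an accelerator setup along these lines might look as
follows in squid.conf (Squid 2.x era directive names; the backend
hostname is an example and would be a name that resolves to several A
records):

```
# Listen on the public HTTP port and accelerate one backend name.
http_port 80
httpd_accel_host www.backend.example.com
httpd_accel_port 80

# Give up on an unreachable address quickly so Squid moves on
# to the next IP in the DNS record.
connect_timeout 10 seconds
```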

> 2. How do I protect against squid itself going down? Is there a way to run
> 2 squids and have them do auto-failover?

Any of the clustering techniques would do fine.

If your goal is only to build a cluster of your web servers then there
are other tools which are probably more suited for the task. For example
Linux LVS.
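For illustration only, a minimal LVS setup with ipvsadm might look like
this (the virtual IP and real-server IPs are made-up examples; this is a
sketch of the idea, not a tested configuration):

```shell
# Create a virtual HTTP service on the cluster IP, round-robin scheduling.
ipvsadm -A -t 192.0.2.10:80 -s rr

# Add two real web servers behind it, using NAT (masquerading) forwarding.
ipvsadm -a -t 192.0.2.10:80 -r 10.0.0.1:80 -m
ipvsadm -a -t 192.0.2.10:80 -r 10.0.0.2:80 -m
```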

Squid's purpose when configured as an accelerator is mainly to speed up
slow servers by caching most of the static content. It is also a good
fit if you are building a virtual web site by mapping different parts of
the URL space to different backend servers.
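The URL-space mapping described above can be done with a redirector
program (hooked in via the redirect_program directive), which reads
request lines from Squid on stdin and writes back possibly-rewritten
URLs. A minimal sketch, with hypothetical backend hostnames and prefix
table:

```python
#!/usr/bin/env python
# Sketch of a Squid redirector: map URL path prefixes to backend servers.
# The backend hostnames and the prefix table are made-up examples.
import sys

BACKENDS = {
    "/images/": "http://img-backend.example.com",
    "/cgi-bin/": "http://app-backend.example.com",
}
DEFAULT = "http://www-backend.example.com"

def rewrite(url):
    """Return the backend URL for a request to the accelerated site."""
    # Keep the path part of the URL, drop scheme and host.
    path = "/" + url.split("/", 3)[3] if url.count("/") >= 3 else "/"
    for prefix, backend in BACKENDS.items():
        if path.startswith(prefix):
            return backend + path
    return DEFAULT + path

if __name__ == "__main__":
    # Squid sends one request per line: "URL client/fqdn ident method".
    # The redirector answers each line with the (rewritten) URL.
    for line in sys.stdin:
        fields = line.split()
        if fields:
            sys.stdout.write(rewrite(fields[0]) + "\n")
            sys.stdout.flush()
```

The flush after each answer matters: Squid waits for the reply to each
request line, so a buffered redirector would stall the proxy.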

--
Henrik Nordstrom
Squid hacker
Received on Fri Jun 23 2000 - 09:13:54 MDT
