Re: [squid-users] restart url_redirector process when it dies

From: Chris Woodfield <rekoil_at_semihuman.com>
Date: Mon, 16 Mar 2009 10:48:23 -0400

To elaborate, squid should launch new url_rewrite_program instances
when the number of live children falls to 50% or less of the configured
number. So once 8 of the 15 processes die (leaving 7), squid should launch
a whole new set of 15. You'll then have 22 url_rewriter processes, and
squid will launch 15 more once those have died back down to 7. The circle
of life continues...
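
For reference, the configured number is whatever url_rewrite_children is
set to in squid.conf:

url_rewrite_children 15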

In squid 2.7, if you don't want to wait for half your helpers to die
before squid launches more, you can adjust the threshold in
helper.c:helperServerFree(). Find the following line:

if (hlp->n_active <= hlp->n_to_start / 2) {

and change the right-hand side of the comparison from "/ 2" to, say,
"* .75" or similar.
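
For example (an untested sketch; "* 3 / 4" keeps the arithmetic in
integers, and "* .75" behaves the same here since the comparison gets
promoted to floating point):

if (hlp->n_active <= hlp->n_to_start * 3 / 4) {

With 15 configured children that relaunches helpers once the active count
drops to 11, i.e. after 4 have died instead of 8.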

This can be used as a somewhat ghetto way of dealing with a rewriter
that has a memory leak - put a dead-man's counter into the rewriter
that causes the process to exit after X number of requests, and let
squid launch new ones as needed. (Not that I've ever done anything
like that myself, nosiree...)
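
If it's useful, here's a bare-bones sketch of such a rewriter in C. It
assumes the classic 2.x redirector protocol (squid writes one request per
line to the helper's stdin, and a blank reply line means "leave the URL
unchanged"); the actual rewrite logic is left as a stub:

#include <stdio.h>

#define MAX_REQUESTS 400000   /* dead-man's counter: exit before a leak can hurt */

int main(void)
{
    char line[8192];
    long handled = 0;

    /* replies must not sit in stdio's buffer, or squid will stall waiting */
    setvbuf(stdout, NULL, _IONBF, 0);

    while (fgets(line, sizeof(line), stdin) != NULL) {
        /* a real rewriter would parse the request and print a new URL;
           an empty line tells squid to pass the URL through unchanged */
        fputs("\n", stdout);

        if (++handled >= MAX_REQUESTS)
            break;    /* exit cleanly and let squid's restart logic take over */
    }
    return 0;
}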

-C

On Mar 15, 2009, at 5:14 AM, Amos Jeffries wrote:

> Dieter Bloms wrote:
>> Hi,
>> I use a url_rewrite_program, which seems to die after about 400000
>> requests.
>> Squid starts 15 processes, which are enough, but after some time one
>> process after another dies, and in the end all the processes are gone.
>> Is it possible to let squid restart a url_rewrite_program when it
>> dies?
>
> What version of Squid are you using that does not do this restart
> automatically?
> Squid only dies when ALL helpers for a needed service are dying too
> fast to recover quickly.
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
> Current Beta Squid 3.1.0.6
>
Received on Mon Mar 16 2009 - 14:48:50 MDT
