Re: [squid-users] Squid API to External Helper.

From: Adrian Chadd <adrian@dont-contact.us>
Date: Tue, 24 Jul 2007 01:20:41 +0800

On Mon, Jul 23, 2007, gonzales@linuxlouis.net wrote:
> Squid 2.6.STABLE14
>
> I've got a small dilemma. I've written an external Perl helper that returns
> "OK" or "ERR" depending on regular expressions and/or domains stored in an
> external PostgreSQL database.

Pretty simple.
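
For the record, the protocol such a helper speaks is just line-oriented
stdin/stdout: Squid writes one lookup per line, and the helper answers "OK"
or "ERR". A minimal sketch (the placeholder check and names are illustrative,
not the poster's actual code):

#!/usr/bin/perl
# Minimal external_acl_type helper: one lookup per line on stdin,
# answer OK (match) or ERR (no match) on stdout.
use strict;
use warnings;

$| = 1;    # unbuffered output, so Squid sees each answer immediately

while (my $line = <STDIN>) {
    chomp $line;
    my ($url) = split /\s+/, $line;   # first token is the %URI/%DST from squid.conf
    print is_blocked($url) ? "OK\n" : "ERR\n";
}

sub is_blocked {
    my ($url) = @_;
    # regex/domain check goes here (SQL, DBM file, in-memory hash, ...)
    return 0;
}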

> What I've noticed is that for each URL/FQDN requested, Squid passes 'every'
> URL embedded in the webpage, one by one, to the running external helpers
> (until I add the asynchronous piece that allows just one instance of the
> helper to run, forking child processes as needed). My external helper makes
> a call to a PostgreSQL database and checks each domain/URL against a single
> table of 50K entries, and this KILLS the performance and the end-user
> experience.

Yup, run multiple helpers.
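
That's a squid.conf change rather than a code change. Something along these
lines (hypothetical names and paths; children= is the relevant knob, and the
ttl options make Squid cache results so repeat lookups for the same domain
never reach the helper):

external_acl_type urlcheck children=10 ttl=300 negative_ttl=60 %DST /usr/local/bin/urlcheck.pl
acl blocked_sites external urlcheck
http_access deny blocked_sites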

> Does anyone have any suggestions as to 'how' to make this work well? Or
> Henrik, do you have any suggestions as to where I might start looking in
> the Squid code to modify 'how' URLs are passed to the external helper?

See why the PostgreSQL lookups are taking so long. Me, I'm doing this stuff
in my spare time by building DBM-type files (currently playing with TDB),
and I'm able to do rather large hash table lookups rather quickly.

SQL isn't necessarily the right tool for this.
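
As a rough illustration of that approach (DB_File/Berkeley DB stands in here
for any DBM-style store, and the file name is made up): pre-build the hash
file from the 50K-row table, and the helper then does a constant-time lookup
per request instead of a SQL query:

#!/usr/bin/perl
# Sketch: look hosts up in a pre-built DBM file instead of PostgreSQL.
use strict;
use warnings;
use Fcntl;
use DB_File;

tie my %blocked, 'DB_File', 'blocklist.db', O_RDONLY, 0644, $DB_HASH
    or die "cannot open blocklist.db: $!";

$| = 1;
while (my $line = <STDIN>) {
    chomp $line;
    my ($host) = split /\s+/, $line;
    print exists $blocked{$host} ? "OK\n" : "ERR\n";
}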

Adrian
Received on Mon Jul 23 2007 - 11:16:41 MDT