Re: idea for improving the binary tree code for ACLs

From: Gregory Maxwell <nullc@dont-contact.us>
Date: Wed, 17 Sep 1997 19:42:43 -0400 (EDT)

Maybe it would be possible to implement a sort of fast
compression/decompression such that all the URL strings held in memory
are stored in compressed form.

Most Squid machines are lightly loaded CPU-wise, so the CPU impact would
not be the big issue.

Also, this could help performance a lot: compress the URLs and use the
compressed form as the key for a binary tree. What kind of overhead would
there be in keeping the tree sorted?
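A rough sketch of that idea in Python (not Squid's actual C code; the class and method names here are made up for illustration). Because compression of a given string is deterministic within one process, the compressed bytes can serve as an exact-match key; note, though, that compressed keys do not preserve the lexical order of the original URLs, and per-string zlib can actually expand very short strings unless a preset dictionary of common prefixes is used:

```python
import zlib

class CompressedUrlSet:
    """Toy lookup structure keyed by the zlib-compressed form of each URL.

    Illustrative only: a real Squid index would be a C tree or hash table.
    """

    def __init__(self):
        # Set of compressed keys; a binary tree keyed on these bytes
        # would work the same way for exact-match lookups.
        self._keys = set()

    def add(self, url):
        # zlib.compress is deterministic for a given input, level,
        # and zlib version, so the compressed bytes are a stable key.
        self._keys.add(zlib.compress(url.encode("utf-8")))

    def contains(self, url):
        # Compress the probe URL and look up the compressed key directly;
        # no decompression is ever needed for membership tests.
        return zlib.compress(url.encode("utf-8")) in self._keys
```

The trade-off is exactly the one raised above: each lookup spends CPU compressing the probe string, in exchange for smaller keys resident in memory.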

On 17 Sep 1997, Stefan Monnier wrote:

> "Tom Minchin" <tom@interact.net.au> writes:
> > It would be nice to compact all those "http://www." prefixes into a
> > dictionary lookup.
>
> Maybe a prefix-tree would provide the best of both worlds?
>
> > Hopefully a substantial memory saving will result as we find the
>
> I doubt that memory usage will be influenced by such a change. Memory is
> mostly used to keep location info for every single page that's in the cache:
> each object in the cache uses up a few bytes in memory and there's no way
> to get around that with Squid, as far as I can tell.
>
>
> Stefan
>
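The prefix-tree Stefan suggests can be sketched briefly. In a trie, shared prefixes such as "http://www." are stored once along a common path rather than repeated in every key, which is the same saving Tom's dictionary idea aims for. A minimal illustrative version (again Python, with invented names, not Squid code):

```python
class TrieNode:
    """One node of a character-level prefix tree (trie)."""
    __slots__ = ("children", "terminal")

    def __init__(self):
        self.children = {}    # next character -> child node
        self.terminal = False # True if a stored URL ends here

class UrlTrie:
    """Stores URLs so that common prefixes share a single path."""

    def __init__(self):
        self.root = TrieNode()

    def insert(self, url):
        node = self.root
        for ch in url:
            # Reuse the existing child for this character if one exists;
            # this is where "http://www." is stored only once.
            node = node.children.setdefault(ch, TrieNode())
        node.terminal = True

    def contains(self, url):
        node = self.root
        for ch in url:
            node = node.children.get(ch)
            if node is None:
                return False
        # A prefix of a stored URL is only a match if marked terminal.
        return node.terminal
```

Lookup cost is proportional to the URL's length, independent of how many URLs are stored, so no separate sorting step is needed to keep the structure ordered.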
Received on Wed Sep 17 1997 - 17:53:12 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:37:06 MST