Re: Caching and memory compression

From: John Saunders <john@dont-contact.us>
Date: 25 Mar 1997 04:01:47 GMT

Jose Manuel Ruiz (jmruiz@cicyt.es) wrote:
> Does squid use some compression with the objects?
> If not, could it be useful?
> Perhaps I am wrong, but performing some fast compression
> could save memory (perhaps 50%) and make squid more
> reliable.
> I am thinking of something similar to gzip -1 --> gunzip. And the
> source code is available!!!

Compression won't help with images or archives, which are already
compressed, but it would save something on HTML and text files. I tried
compressing a few HTML files of between 2K and 8K in size; the 2K files
compressed by about 50% and the 8K ones by about 75%.

Using zlib would be one possibility. One catch is that, because of the
multiplexed operation of Squid, each in-transit object would need its
own storage to keep the state of the compression. Depending on how much
state information zlib requires per stream, that overhead could undo
the memory savings from compression.
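
To make the overhead concrete, here is a minimal sketch (not Squid
code; ObjectCompressor and the helper names are hypothetical) of
per-object incremental compression with zlib. Every in-transit object
carries its own z_stream, and zlib's internal deflate state at the
default settings runs to roughly a quarter of a megabyte per stream:

    /* Minimal sketch, assuming one compressor per in-transit object. */
    #include <string.h>
    #include <zlib.h>

    typedef struct {
        z_stream zs;              /* per-object compression state */
        unsigned char out[4096];  /* per-object output staging buffer */
    } ObjectCompressor;

    static int
    object_compress_init(ObjectCompressor *oc)
    {
        memset(&oc->zs, 0, sizeof(oc->zs));
        /* level 1 = fastest, as in the "gzip -1" suggestion above */
        return deflateInit(&oc->zs, 1) == Z_OK ? 0 : -1;
    }

    /* feed one chunk of the object as it arrives off the network */
    static int
    object_compress_chunk(ObjectCompressor *oc,
                          const unsigned char *data, unsigned len)
    {
        int rc;
        oc->zs.next_in = (unsigned char *) data;
        oc->zs.avail_in = len;
        do {
            oc->zs.next_out = oc->out;
            oc->zs.avail_out = sizeof(oc->out);
            rc = deflate(&oc->zs, Z_NO_FLUSH);
            if (rc != Z_OK && rc != Z_BUF_ERROR)
                return -1;
            /* ... hand oc->out to the cache store here ... */
        } while (oc->zs.avail_out == 0);
        return 0;
    }

    /* flush the remaining compressed bytes and free the state */
    static int
    object_compress_done(ObjectCompressor *oc)
    {
        int rc;
        oc->zs.next_in = Z_NULL;
        oc->zs.avail_in = 0;
        do {
            oc->zs.next_out = oc->out;
            oc->zs.avail_out = sizeof(oc->out);
            rc = deflate(&oc->zs, Z_FINISH);
            /* ... hand oc->out to the cache store here ... */
        } while (rc == Z_OK);
        deflateEnd(&oc->zs);
        return rc == Z_STREAM_END ? 0 : -1;
    }

With a few hundred objects in transit at once, the per-stream state
alone could rival the memory the compression was meant to save, which
is exactly the concern above.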

Personally, I see the main memory problem in Squid as being
fragmentation of the heap. The only real solution I can think of is
making Squid use handles to the memory rather than raw pointers, and
running a heap compaction algorithm at regular intervals. You should
then be able to turn the cache_mem setting down to a small value like
1MB and still be able to cache large objects, since any fragmentation
caused by large objects would be compacted away. Of course Squid would
still use the memory while the large object is in transit.
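
To make the handle idea concrete, here is a minimal sketch (the names
are hypothetical, not Squid code) of double indirection with a
compacting arena. Callers hold small integer handles and re-derive the
real pointer on every access, so the compactor is free to slide live
blocks together and patch the single pointer table:

    #include <stddef.h>
    #include <string.h>

    #define ARENA_SIZE  (1 << 20)   /* e.g. a 1MB cache_mem arena */
    #define MAX_HANDLES 1024

    typedef int Handle;

    static unsigned char  arena[ARENA_SIZE];
    static size_t         arena_top;            /* bump pointer */
    static unsigned char *ptrs[MAX_HANDLES];    /* handle -> address */
    static size_t         lens[MAX_HANDLES];    /* 0 = slot free */

    /* allocate from the top of the arena; -1 when out of space */
    static Handle
    h_alloc(size_t n)
    {
        Handle h;
        if (n == 0 || arena_top + n > ARENA_SIZE)
            return -1;
        for (h = 0; h < MAX_HANDLES; h++) {
            if (lens[h] == 0) {
                ptrs[h] = arena + arena_top;
                lens[h] = n;
                arena_top += n;
                return h;
            }
        }
        return -1;
    }

    /* callers must re-derive the pointer on every use; a cached
     * pointer goes stale as soon as the compactor runs */
    static void *
    h_deref(Handle h)
    {
        return ptrs[h];
    }

    static void
    h_free(Handle h)
    {
        lens[h] = 0;
    }

    /* slide live blocks down over the holes left by freed ones,
     * lowest address first, and patch the handle table */
    static void
    h_compact(void)
    {
        char   moved[MAX_HANDLES];
        size_t top = 0;

        memset(moved, 0, sizeof(moved));
        for (;;) {
            Handle h, best = -1;
            for (h = 0; h < MAX_HANDLES; h++)
                if (lens[h] != 0 && !moved[h] &&
                    (best < 0 || ptrs[h] < ptrs[best]))
                    best = h;
            if (best < 0)
                break;
            memmove(arena + top, ptrs[best], lens[best]);
            ptrs[best] = arena + top;   /* fix up the handle table */
            top += lens[best];
            moved[best] = 1;
        }
        arena_top = top;
    }

The catch, of course, is that every piece of code touching cached
objects has to go through something like h_deref() and must never hold
a raw pointer across a compaction, which would be an invasive change
to make throughout Squid.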

Cheers.
--          +------------------------------------------------------------+
        .   | John Saunders             - mailto:john@nlc.net.au (EMail) |
    ,--_|\  |                           - http://www.nlc.net.au/   (WWW) |
   /  Oz  \ |                    - 018-223-814 or 02-9477-2881   (Phone) |
   \_,--\_/ | NHJ NORTHLINK COMMUNICATIONS - Supplying a professional,   |
         v  |      and above all friendly, internet connection service.  |
            +------------------------------------------------------------+
Received on Mon Mar 24 1997 - 20:19:13 MST
