Squid logfile analysis scripts.

From: Apiset Tananchai <aet@dont-contact.us>
Date: Wed, 11 Jun 1997 15:31:09 +0700 (ICT)

Hi,

I'm trying to run an analysis script over my access.log file (~40MB). I
use the scripts from http://squid.nlanr.net/Squid/Scripts/ and followed
the instructions on the same page. I found that access-extract-urls.pl
ran for more than 6 hours without finishing (so I decided to kill it) and
drove my system load average above 1.5 (normally 0.05-0.10).

I have tried it with a smaller access.log (~2-3 MB), and
access-extract-urls.pl completes in a few minutes. I think there should
be some parameter in this script that I can tune to speed it up. If I'm
wrong, is there a maximum size of access.log that this script can handle?
Normally I rotate the logfile at midnight; the average file size is
around 40MB/day.

My system is Linux 2.0.29 on a Pentium Pro 200 with 256MB RAM and a 16GB
cache. ps reports that the Squid process is around 240MB in memory, with
cache_mem set to 32MB.

If you can suggest an alternative to this script, please tell me. I have
tried pwebstat, but I prefer the output of the above script to
pwebstat's.
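In case it clarifies what I'm after, below is a rough sketch (in Python,
just as an illustration, not the NLANR script itself; I'm assuming the
default native access.log format, where the URL is the seventh
whitespace-separated field) of the kind of single-pass URL extraction I
had in mind:

#!/usr/bin/env python3
# Rough illustrative sketch: stream a Squid native access.log once and
# count hits per URL, keeping only the counter in memory instead of the
# whole file. Assumes the native format:
#   time elapsed client code/status bytes method URL ...
import sys
from collections import Counter

def count_urls(path, top=20):
    counts = Counter()
    with open(path, "r", errors="replace") as log:
        for line in log:
            fields = line.split()
            if len(fields) >= 7:
                counts[fields[6]] += 1   # field 7 is the URL
    return counts.most_common(top)

if __name__ == "__main__":
    logfile = sys.argv[1] if len(sys.argv) > 1 else "access.log"
    for url, hits in count_urls(logfile):
        print("%8d  %s" % (hits, url))

Something like this only ever holds the per-URL counter in memory, which
is why I suspect the slowdown on the 40MB file is something tunable
rather than a hard limit of the script.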

Thanks for your help,

--
aet
Received on Wed Jun 11 1997 - 01:22:19 MDT
