Re: Squid logfile analysis scripts.

From: Geoff Buckingham <geoffb@dont-contact.us>
Date: Wed, 11 Jun 1997 12:04:28 +0100 (BST)

>
> Hi,
>
> I'm trying to run an analysis script on data from my access.log file
> (~40MB). I use the scripts from http://squid.nlanr.net/Squid/Scripts/ and
> follow the instructions on the same page. I found that access-extract-urls.pl
> takes more than 6 hours to complete (so I decided to kill it) and pushes my
> system load average above 1.5 (normally 0.05-0.10).
>
> I have tried a smaller access.log (~2-3 MB), and access-extract-urls.pl
> completes in a few minutes. I think there should be some parameter in
> this script I can tune to speed it up. If I'm wrong, is there a maximum
> size of access.log that this script can handle? Normally I rotate the
> logfile at midnight; the average file size is around 40MB/day.
>
> My system is Linux 2.0.29 on a PPro 200 with 256MB RAM and a 16GB cache.
> ps reports that the squid process is around 240MB in memory. cache_mem=32.
>
> If you can suggest an alternative to this script, please tell me. I have
> tried pwebstat, but I prefer the output of the above script to pwebstat's.
>
Are you running your analysis on the cache machine? That's definitely a
bad thing.
I've found Calamaris by Cord Beermann to be faster than the standard
scripts. We happily chomp through multiple 160MB+ logs in less than an hour
on an Ultra. Calamaris is linked to from:

http://squid.nlanr.net/Squid/Scripts/
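As a rough illustration of why a 40MB log need not take hours: a single
streaming pass that tallies URLs keeps memory proportional to the number of
distinct URLs rather than the file size. The sketch below assumes the Squid
native access.log format (whitespace-separated fields with the URL in field 7);
the filename is hypothetical, and this is not the actual
access-extract-urls.pl logic.

```python
from collections import Counter

def top_urls(lines, n=10):
    """Tally requested URLs from Squid native access.log lines.

    In the native log format the 7th whitespace-separated field is the
    requested URL; lines with fewer fields are skipped as malformed.
    """
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) >= 7:
            counts[fields[6]] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    # "access.log" is a placeholder path; iterating the file object
    # streams it line by line instead of loading 40MB at once.
    with open("access.log") as log:
        for url, hits in top_urls(log):
            print(hits, url)
```

Run against a rotated log on a separate machine, a pass like this should be
I/O-bound rather than CPU-bound.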

-- 
GeoffB
Received on Wed Jun 11 1997 - 04:08:59 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:35:30 MST