Re: Using the access.log to get 'Bad' Urls

From: Mario L. Peralta <mario@dont-contact.us>
Date: Fri, 05 May 2000 12:11:06 -0300

We process the access.log daily and weekly to get the following
statistics:
- cache hits
- volume per URL
- volume per user (we have authentication configured in squid.conf)
- URLs visited per user
Our boss is a bit paranoid, eh?
I have a couple of Perl scripts to parse access.log and generate these
statistics.
In addition, I have some Perl CGIs to view the results via the web.
If you want, I can send you all the scripts and you can tailor them to
your needs.
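In the meantime, here is a rough sketch of the kind of parsing those
scripts do, written in Python rather than Perl. It assumes Squid's
default "native" log format (timestamp, elapsed ms, client, code/status,
bytes, method, URL, user, hierarchy, type); the field positions and the
helper name are my own, not taken from the actual scripts.

```python
#!/usr/bin/env python3
"""Sketch: per-user / per-URL statistics from a Squid access.log.

Assumes the default native log format, one request per line:
  timestamp elapsed client code/status bytes method URL user hierarchy type
"""
from collections import defaultdict


def summarize(lines):
    """Return (hits, total, bytes_per_url, bytes_per_user, urls_per_user)."""
    hits = 0
    total = 0
    bytes_per_url = defaultdict(int)
    bytes_per_user = defaultdict(int)
    urls_per_user = defaultdict(set)
    for line in lines:
        fields = line.split()
        if len(fields) < 8:
            continue  # skip malformed or truncated lines
        code, size, url, user = fields[3], int(fields[4]), fields[6], fields[7]
        total += 1
        if "HIT" in code:  # e.g. TCP_HIT/200, TCP_MEM_HIT/200
            hits += 1
        bytes_per_url[url] += size
        bytes_per_user[user] += size
        urls_per_user[user].add(url)
    return hits, total, bytes_per_url, bytes_per_user, urls_per_user
```

Run weekly over the rotated log (or daily over the live one) and feed
the dictionaries to whatever report generator you like.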
Regards.

Don Rolph wrote:
>
> Typically, if we have a specific concern, we take last week's access_log (we
> rotate weekly) and run grep on it, looking for the text strings you want to
> search for.
>
> Enjoy!
>
> "Andre P. Piazza" wrote:
>
> > Hello!
> > My squid.conf allows everybody on my network to connect to any site
> > they want!
> > But I would like to use the access.log and a dictionary with all the
> > porn words (sex, bizarre, oral, nude...) to do a "crossing" of the
> > data, and get the IP address, date, and time of the people who connect
> > to a site whose URL contains a denied word from the dictionary, plus
> > the complete URL containing that word (for example:
> > www.someurl.com/photos/sex.html).
> > How could I do this?
> > Is there any script that does this? Where?
> > Thanks for any help,
> >
> > Andre Piazza
> > mailto: piazza@uel.br
>
> --
>
> Regards.
>
> Don Rolph w-rolph@ds.mc.ti.com WD3 MS10-13 (508)-236-1263
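The "crossing" Andre describes can be sketched in a few lines of Python;
this is an illustration, not one of the scripts mentioned above. The
word list is the example one from his message (in practice you would
load it from a dictionary file), and it again assumes Squid's default
native log format.

```python
#!/usr/bin/env python3
"""Sketch: cross a Squid access.log against a list of denied words,
reporting (client IP, date/time, URL) for each match.
"""
import time

# Example words from the thread; load your real dictionary from a file.
DENIED_WORDS = ["sex", "bizarre", "oral", "nude"]


def find_bad_urls(lines, words=DENIED_WORDS):
    """Return a list of (client, timestamp string, URL) for matching lines."""
    matches = []
    for line in lines:
        fields = line.split()
        if len(fields) < 7:
            continue  # skip malformed lines
        stamp, client, url = float(fields[0]), fields[2], fields[6]
        if any(word in url.lower() for word in words):
            when = time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(stamp))
            matches.append((client, when, url))
    return matches
```

A plain `grep -i -f dictionary.txt access.log` gets you the raw lines;
the script above just adds the IP/date/URL extraction.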

-- 
Weiler's Law:
        Nothing is impossible for the man who doesn't have to do it
Received on Fri May 05 2000 - 09:39:21 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:53:16 MST