[squid-users] Looking for web usage reporting solution

From: Aaron Spurlock <aarons_at_technovationdesign.com>
Date: Fri, 13 Nov 2009 09:31:35 -0700

I am looking for a web usage reporting solution that can run via sniffing, e.g. from a mirror port on a switch. I envision a solution that simply logs each URL request it sees and allows reports to be generated on the web sites internal users have visited. I've searched high and low but cannot find a "ready-made" solution, so I'm looking to put one together myself.

Most people/posts suggest using Squid/SquidGuard/DansGuardian, but those appear to be inline-only solutions, and I would prefer a sniffing solution for safety (if the machine crashes, it doesn't take down Internet access). In that sense, it would work a lot like Websense, but with reporting only, no blocking.

From a high-level pseudo-code standpoint, it would simply sniff all traffic, and when it sees a packet requesting a web page, parse the request and dump these fields into a database:

-Date
-Time
-Source IP
-Dest IP
-URL requested
-FQDN portion of web request - i.e., if the request was for
 http://www.microsoft.com/windows/server/2003, it records only
 www.microsoft.com here
-domain portion of web request - only microsoft.com in above example

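The parsing step above could be sketched roughly as follows, assuming the sniffer hands over the reassembled payload of an HTTP request packet. The function name, the sample IP addresses, and the naive "last two labels" domain heuristic are all illustrative assumptions; a real deployment would want a public-suffix list for the domain extraction.

```python
from datetime import datetime

def parse_http_request(payload, src_ip, dst_ip):
    """Extract a log record (the fields listed above) from a raw HTTP
    request payload, or return None if it isn't an HTTP request."""
    lines = payload.split("\r\n")
    parts = lines[0].split(" ")
    if len(parts) != 3 or not parts[2].startswith("HTTP/"):
        return None  # not an HTTP request line
    method, path, _version = parts
    # Find the Host header (HTTP/1.1 requires one).
    host = next((l.split(":", 1)[1].strip()
                 for l in lines[1:] if l.lower().startswith("host:")), None)
    if host is None:
        return None
    labels = host.split(".")
    now = datetime.now()
    return {
        "date": now.date().isoformat(),
        "time": now.time().isoformat("seconds"),
        "src_ip": src_ip,
        "dst_ip": dst_ip,
        "url": "http://" + host + path,
        "fqdn": host,
        # Naive heuristic: keep the last two labels (breaks on e.g. .co.uk).
        "domain": ".".join(labels[-2:]) if len(labels) >= 2 else host,
    }

# Example using the request from the list above (IPs are made up):
req = ("GET /windows/server/2003 HTTP/1.1\r\n"
       "Host: www.microsoft.com\r\n\r\n")
record = parse_http_request(req, "10.0.0.5", "207.46.19.190")
# record["url"]    -> "http://www.microsoft.com/windows/server/2003"
# record["fqdn"]   -> "www.microsoft.com"
# record["domain"] -> "microsoft.com"
```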
Using this data, I can then produce reports for the client on who went where, and when. Personally, I thought this would be a great open-source project, but I can't find anything like it already out there! It seems like a mix of Squid, NTOP, and Snort...
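The reporting side could be as simple as a GROUP BY over the logged records. A minimal sketch with SQLite (the table name, column layout, and sample rows are assumptions based on the field list in the message, and the data is made up for illustration):

```python
import sqlite3

# In-memory database with one table matching the field list above.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE web_log (
    date TEXT, time TEXT, src_ip TEXT, dst_ip TEXT,
    url TEXT, fqdn TEXT, domain TEXT)""")

# Fabricated sample rows standing in for sniffed requests.
rows = [
    ("2009-11-13", "09:15:02", "10.0.0.5", "207.46.19.190",
     "http://www.microsoft.com/windows/server/2003",
     "www.microsoft.com", "microsoft.com"),
    ("2009-11-13", "09:16:40", "10.0.0.5", "207.46.19.190",
     "http://www.microsoft.com/", "www.microsoft.com", "microsoft.com"),
    ("2009-11-13", "09:17:11", "10.0.0.7", "64.233.169.99",
     "http://www.google.com/", "www.google.com", "google.com"),
]
conn.executemany("INSERT INTO web_log VALUES (?,?,?,?,?,?,?)", rows)

# "Who went where" report: request counts per source IP and domain.
report = conn.execute("""
    SELECT src_ip, domain, COUNT(*) AS hits
    FROM web_log
    GROUP BY src_ip, domain
    ORDER BY hits DESC""").fetchall()
for src_ip, domain, hits in report:
    print(src_ip, domain, hits)
```

Date/time range filters and per-user rollups would just be additional WHERE/GROUP BY clauses on the same table.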

Thanks for any thoughts on this project!
Received on Fri Nov 13 2009 - 16:31:47 MST
