Re: [squid-users] auto blacklist users

From: ian j hart <ianjhart@dont-contact.us>
Date: Sat, 12 Jan 2008 22:02:07 +0000

On Friday 11 January 2008 22:17:16 ian j hart wrote:
> On Sunday 09 December 2007 07:03:24 Amos Jeffries wrote:
> > dhottinger@harrisonburg.k12.va.us wrote:
> > > Quoting ian j hart <ianjhart@ntlworld.com>:
> > >> On Friday 07 December 2007 23:49:35 Amos Jeffries wrote:
> > >>
> > >> [Apologies in advance if I've misunderstood anything, it's late
> > >> (early) and I'm somewhat brain dead. This time-zone thing's a killer]
> > >>
> > >>> ian j hart wrote:
> > >>> > On Friday 07 December 2007 00:58:31 Adrian Chadd wrote:
> > >>> >> So if I get this right, you'd like to log the acl list that passed
> > >>> >> or failed the user?
> > >>> >>
> > >>> >>
> > >>> >>
> > >>> >> Adrian
> > >>> >
> > >>> > Near enough.
> > >>> >
> > >>> > I want to log the aclname (or custom error page name) and the
> > >>>
> > >>> username.
> > >>>
> > >>> > I'll probably want the url in short order, followed by anything
> > >>>
> > >>> else that
> > >>>
> > >>> > proves useful.
> > >>> >
> > >>> > I want to do this for users who are denied access.
> > >>> >
> > >>> > [The more general solution you state above would probably be okay
> > >>>
> > >>> too. I
> > >>>
> > >>> > might need to add DENY/ACCEPT so I can include that in the regexp.]
> > >>> >
> > >>> > <tangent>
> > >>> > Here's an example of how this might be generally useful. I have
> > >>> > three different proxy ACLs.
> > >>> >
> > >>> > A url_regexp
> > >>> > A dstdomain list harvested from a popular list site
> > >>> > A "daily" list gleaned from yesterday's access summary
> > >>>
> > >>> Problem:
> > >>> If a student can get through all day today, what's to stop them?
> > >>
> > >> Nothing. But here's what I hope will happen. (I probably shouldn't
> > >> reveal this, but what the hey).
> > >
> > > I've missed most of this discussion, but it sounds like you may have
> > > gotten this to work. Is there a recap? I'd really like to see your
> > > squid.conf (at least the snippets that pertain to this). Are you running
> > > a transparent proxy? Do you run any kind of commercial filter? I've been
> > > struggling with this same thing. For now I catch this through my snort
> > > logs and by looking at access_logs for denied hits. I also block quite a
> > > few sites at my firewall, but it is impossible to stop. I do seem to
> > > have more support from administration than you.
> >
> > Here be mine squid.conf entry:
> >
> > external_acl_type surbl_test ipv6 ttl=5 negative_ttl=5 %SRC %DST /etc/squid6/helper/rhsbl.sh multi.surbl.org RHSBL
> > acl surbl_clean external surbl_test
> > deny_info http://treenet.co.nz/errors/squid-404.php?RBL-SURBL-%m&err=%o&url=%s surbl_clean
> > http_access deny all !surbl_clean
>
> Okay, I'm having a look at this now.
>
> I'm using 2.6.17 on FreeBSD 6.2 (i386)
>
> No problem with %s.
>
> %o works in an error page but not as used above.
>
> %m doesn't seem to be documented.
>
> > The helper is a little complex doing a DNSBL re-formatting and lookup
> > before returning OK/ERR to squid.
> >
> > FYI: SURBL is an anti-malware RHSBL which lists domains advertised in
> > Spam or known for malware distribution.
> > There is no reason particularly why the helper can't do a lookup
> > elsewhere for a locally built list via another medium.
> >
> >
> > Amos
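[For readers following along: the quoted message only describes the helper, it doesn't show it. A minimal sketch of what such an RHSBL external ACL helper could look like follows. This is a hypothetical reconstruction, not Amos's actual rhsbl.sh; the host-mangling and the use of `host` for the DNS lookup are assumptions. Squid feeds the helper one "%SRC %DST" pair per line and expects OK or ERR back; with the quoted `http_access deny all !surbl_clean` rule, OK means "clean, allow" and ERR means "listed, deny".]

```shell
#!/bin/sh
# Hypothetical sketch of an RHSBL external_acl helper (NOT the
# actual /etc/squid6/helper/rhsbl.sh). Usage: rhsbl.sh <zone>
# Squid writes "%SRC %DST" per line; we answer OK (clean) or
# ERR (listed in the RHSBL) per line.
ZONE="${1:-multi.surbl.org}"

while read src dst; do
    # Re-format the destination: drop any :port and a leading "www."
    host=$(echo "$dst" | sed -e 's/:.*//' -e 's/^www\.//')

    # Look the domain up in the RHSBL zone; a successful resolution
    # means the domain is listed (i.e. spam/malware-advertised).
    if host "${host}.${ZONE}" >/dev/null 2>&1; then
        echo ERR    # listed -> not "clean" -> request denied
    else
        echo OK     # not listed -> clean -> request allowed
    fi
done
```

As the quoted message notes, nothing ties this to DNS specifically: the same OK/ERR loop could consult a locally built list or any other backend.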

I've tried versions from STABLE14 through 18, and even 2.7-DEVEL0-20080112, with no
joy.

Instead of "fred" I get 1046317640 or something similar (address?).

Can any developer point me to the code which expands deny_info?

Thanks

-- 
ian j hart
Received on Sat Jan 12 2008 - 15:02:25 MST

This archive was generated by hypermail pre-2.1.9 : Fri Feb 01 2008 - 12:00:04 MST