[squid-users] Re: Rather tricky squid problem / requirement

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Tue, 28 Aug 2001 12:08:58 +0200

Squid is very extensible in how it authenticates users. All it requires is a
program that can read username and password pairs and return whether the
supplied password is OK or not. This program could, for example, query the web
page you already have, or duplicate the lookup mechanism used there.
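A minimal sketch of such a helper, assuming the classic Squid basic-auth
helper protocol (Squid writes one "username password" pair per line to the
helper's stdin and expects "OK" or "ERR" back on stdout, one reply per
request). The verify function here is a placeholder; it would be replaced
with whatever lookup the intranet login page already performs:

```python
import sys

def handle_line(line, verify):
    """Parse one 'username password' request line and return the reply
    Squid expects: 'OK' if the credentials check out, 'ERR' otherwise."""
    parts = line.strip().split(None, 1)
    if len(parts) != 2:
        return "ERR"  # malformed request: reject rather than guess
    user, password = parts
    return "OK" if verify(user, password) else "ERR"

def main(verify):
    # Loop forever: Squid keeps the helper running and feeds it one
    # request per line; each reply must be flushed immediately.
    for line in sys.stdin:
        print(handle_line(line, verify))
        sys.stdout.flush()

if __name__ == "__main__":
    # Placeholder verifier: swap in a query against the existing login
    # page (e.g. an HTTP POST) or the IMAP/SMB checks it already uses.
    def verify(user, password):
        return False  # deny everything until a real check is wired in
    main(verify)
```

The helper is then pointed to from squid.conf (in the Squid 2.x of that era,
via the authenticate_program directive), and Squid takes care of prompting
the browser and caching the results.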

Cookies cannot generally be used for proxy authentication. This is because
cookies are only sent to the site/domain they belong to (you cannot make a
world-wide cookie), and also because users can quite easily fake cookies if
they like (it only requires a small amount of JavaScript programming in a
local file).

--
Henrik Nordstrom
Squid Hacker
MARA Systems AB
Sweden
Jonathan Lawrence wrote:
> Hi,
>
> I've been searching for many days now for a solution to a rather tricky
> requirement which our organisation has for web caching / logging. We have
> been obliged by senior management to provide full logging for web
> access. We already have a nicely working squid setup, so we decided to
> just turn on the logging features and add user IDs to the log.
>
> This is where it gets tricky.
>
> We've got 2 internal networks which feed through 2 NAT devices into our
> external network segment (where the squid cache resides). User
> authentication, which would seem to be a problem, is already available for
> our intranet server via a cunning PHP script which uses IMAP into one
> device and SAMBA into the other (one internal network is NT, the other is
> Netware; we have a very complicated setup). So, from one web page, given a
> username + password, I can verify whether a user is acceptable or not.
>
> Now the problem. We can't get squid to do what we want: seamless
> integration with our systems. In the worst case, the web page which
> authenticates for the intranet could write to a MySQL db, and then we have
> a separate login for the web cache which uses the mysql_auth external
> authenticator for squid to do lookups. But that means we have 2 separate
> login pages for using the internet, and it relies on users logging in in
> the correct order.
>
> What we would like to do is a bit more complicated and probably a bit
> silly, but it seems to be the only way around the problem.
>
> All browsers go to the existing login page first and foremost. The cache
> is setup not to require authentication of any form for access to this
> page. This page authenticates them. It then sets a session cookie onto
> their machine. Finally they are redirected to a new page on a distinct
> host name.
>
> Squid is set to do acl lookups for this page. However we've modified acl.c
> so that instead of looking at the Authorization header it looks at the
> Cookie: header which contains a username value. This is then set as the
> ident of the user, which is then used for logging etc.
>
> And no, our NAT devices don't support IDENT so that won't work as a way of
> getting user id info.
>
> We don't actually need to validate users getting onto the internet - we
> are already doing this at the NAT point. We just want to get the user id
> into the logs.
>
> Please please please can you advise us as to whether this is a sensible
> way to go with this - or if we've been exceedingly stupid and missed
> something obvious that will help us with this problem.
>
> Many many thanks,
>
> Jon Lawrence
> Webmaster
> New College
Received on Tue Aug 28 2001 - 04:09:17 MDT
