[squid-users] Re: Rather tricky squid problem / requirement

From: Henrik Nordstrom <hno@dont-contact.us>
Date: Tue, 28 Aug 2001 12:55:19 +0200

As I said, single-sign-on is not at all easy. The main problem is how the user
identifies himself to the services he wants to use without having to supply his
username+password.

Integrating Squid into any form of username+password based authentication
system is quite easy.
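
The glue on the Squid side is only a couple of lines of squid.conf, roughly
along these lines (the helper path is just an example, and note that Squid-2.5
replaces authenticate_program with the newer auth_param directives):

  authenticate_program /usr/local/squid/libexec/my_auth_helper
  acl password proxy_auth REQUIRED
  http_access allow password
  http_access deny all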

Available options for "single-sign-on":

* ident. You have already said that this is not an option in your environment.
* cookies: won't work for a proxy. They can only work within a domain of web servers.
* NTLM automatic authentication. Requires Squid-2.5, and your users must be using
Windows, Microsoft IE, and be logged on to an NT domain.
* IP based access control: won't work, as the user's IP is not available to Squid
in your setup.

Squid needs to somehow know who the user is. For this to be possible, something
MUST tell Squid who the user is. HTTP does not do this unless you use
authentication, which requires the user to log on.

--
Henrik Nordstrom
Squid Hacker
MARA Systems AB
Sweden
Jonathan Lawrence wrote:
> Hi,
>
> Thanks for the quick response. However, I suspect that I did not really
> make the problem clear enough - we don't really want to have 2 separate
> login processes for someone who wishes to get onto the internet /
> intranet. We feel (with some evidence) that our users probably wouldn't
> tolerate a dual login process after having logged into the system.
>
> I accept that we could write a small program to call the webpage (in a
> wget style) and just rewrite our webpage to report OK or ERR. And granted,
> the cookie method was never really an ideal one.
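>
> Something like the rough sketch below is what I have in mind - the URL is
> invented, and a real helper would of course have to escape / URL-encode the
> username and password before putting them on a command line:
>
>   /* For each "username password" line from Squid, fetch our (imaginary)
>    * check page with wget and relay its one-word OK / ERR answer. */
>   #include <stdio.h>
>   #include <string.h>
>
>   int main(void)
>   {
>       char line[1024], user[256], pass[256], cmd[1024], reply[16];
>       FILE *fp;
>
>       while (fgets(line, sizeof(line), stdin)) {
>           reply[0] = '\0';
>           if (sscanf(line, "%255s %255s", user, pass) == 2) {
>               snprintf(cmd, sizeof(cmd),
>                        "wget -q -O - 'http://intranet/check.php?user=%s&pass=%s'",
>                        user, pass);
>               if ((fp = popen(cmd, "r")) != NULL) {
>                   if (!fgets(reply, sizeof(reply), fp))
>                       reply[0] = '\0';
>                   pclose(fp);
>               }
>           }
>           printf(strncmp(reply, "OK", 2) == 0 ? "OK\n" : "ERR\n");
>           fflush(stdout);   /* one answer per request, unbuffered */
>       }
>       return 0;
>   }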
>
> I suspect it is time for me to brush up my C skills and get coding! (As
> a side note, are there any good web pages with hints on how to write an
> external validator?)
>
> Thanks for your help again,
>
> Jon Lawrence.
>
> On Tue, 28 Aug 2001, Henrik Nordstrom wrote:
>
> > Squid is very extensible in how it authenticates users. All it requires is a
> > program that can read username and password pairs and return whether the supplied
> > password is OK or not. This program could for example query the web page you
> > already have, or duplicate the lookup mechanism used there.
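> >
> > A minimal skeleton of such a helper in C could look roughly like the sketch
> > below. The check_password() function is only a placeholder for whatever
> > lookup you end up plugging in (your PHP page, IMAP, SMB, a database, ...):
> >
> >   /* Squid writes one "username password" line on stdin per login
> >    * attempt and expects "OK" or "ERR" back on stdout for each one. */
> >   #include <stdio.h>
> >
> >   static int check_password(const char *user, const char *pass)
> >   {
> >       (void)user;
> >       (void)pass;
> >       return 0;            /* placeholder: reject everything for now */
> >   }
> >
> >   int main(void)
> >   {
> >       char line[1024], user[512], pass[512];
> >
> >       while (fgets(line, sizeof(line), stdin)) {
> >           if (sscanf(line, "%511s %511s", user, pass) == 2 &&
> >               check_password(user, pass))
> >               printf("OK\n");
> >           else
> >               printf("ERR\n");
> >           fflush(stdout);  /* Squid expects one unbuffered answer per line */
> >       }
> >       return 0;
> >   }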
> >
> > Cookies cannot generally be used for proxy authentication. This is because
> > cookies are only sent to the site/domain they belong to (you cannot make a
> > world-wide cookie), and users can also quite easily fake cookies if they like
> > (it only requires a small amount of JavaScript programming in a local file).
> >
> > --
> > Henrik Nordstrom
> > Squid Hacker
> > MARA Systems AB
> > Sweden
> >
> >
> >
> > Jonathan Lawrence wrote:
> >
> > > Hi,
> > >
> > > I've been searching for many days now for a solution to a rather tricky
> > > requirement which our organisation has for web caching / logging. We have
> > > been obliged by senior management to provide full logging for web
> > > access. We already have a nicely working squid setup, so we decided to
> > > just turn on the logging features and add user IDs to the log.
> > >
> > > This is where it gets tricky.
> > >
> > > We've got 2 internal networks which feed through 2 NAT devices into our
> > > external network segment (where the squid cache resides). User
> > > authentication, which would seem to be a problem, is already available for
> > > our intranet server via a cunning PHP script which uses IMAP into one
> > > device and SAMBA into the other (one internal network is NT, the other is
> > > Netware; we have a very complicated setup). So, from one webpage, given a
> > > username + password, I can verify whether a user is acceptable or not.
> > >
> > > Now the problem. We can't get squid to do what we want - seamless
> > > integration with our systems. In the worst case, the webpage which
> > > authenticates for the intranet could write to a mysql db, and we would
> > > then have a separate login for the webcache, which uses the mysql_auth
> > > external authenticator for squid to do lookups. But that means we have
> > > 2 separate login pages for using the internet, and it relies on users
> > > logging in in the correct order.
> > >
> > > What we would like to do is a bit more complicated and probably a bit
> > > silly, but it seems to be the only way around the problem.
> > >
> > > All browsers go to the existing login page first and foremost. The cache
> > > is set up not to require authentication of any form for access to this
> > > page. This page authenticates them. It then sets a session cookie on
> > > their machine. Finally they are redirected to a new page on a distinct
> > > host name.
> > >
> > > Squid is set to do ACL lookups for this page. However, we've modified acl.c
> > > so that instead of looking at the Authorization header it looks at the
> > > Cookie: header, which contains a username value. This is then set as the
> > > ident of the user, which is then used for logging etc.
> > >
> > > And no, our NAT devices don't support IDENT, so that won't work as a way of
> > > getting user ID info.
> > >
> > > We don't actually need to validate users getting onto the internet - we
> > > are already doing this at the NAT point. We just want to get the user ID
> > > into the logs.
> > >
> > > Please please please can you advise us as to whether this is a sensible
> > > way to go with this - or if we've been exceedingly stupid and missed
> > > something obvious that will help us with this problem.
> > >
> > > Many many thanks,
> > >
> > > Jon Lawrence
> > > Webmaster
> > > New College
> >
Received on Tue Aug 28 2001 - 04:57:43 MDT
