RE: [squid-users] Re: Implementation issues

From: Rick Whitley <rickwh@dont-contact.us>
Date: Thu, 22 Jul 2004 09:01:37 -0500

From the user perspective, the first thing they will see is the
disclaimer page. Clicking either the login or activation link will be
the same as accepting the conditions. Where would I modify the
authentication window?

thanks

rick...
Rom.5:8

>>> Merton Campbell Crockett <mcc@CATO.GD-AIS.COM> 7/22/2004 8:20:00 AM >>>
Out of curiosity, why is the notification that computer and network usage
will be monitored being deferred until this late in the game?

The monitoring began the moment the individual logged into the workstation
or laptop and requested to use the Windows NOS. The pop-up authentication
window can be modified to display the notification message and inform the
user that he is agreeing to be monitored by submitting his credentials.
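
A rough squid.conf sketch of that idea, for illustration only -- the LDAP
base and hostname are placeholders, and browsers differ in how much of the
realm string they actually show in the pop-up:

  # Basic auth against LDAP; the realm text is what the browser displays
  # in its credentials pop-up, so the monitoring notice can live there.
  auth_param basic program /usr/lib/squid/squid_ldap_auth -b "dc=example,dc=edu" -h ldap.example.edu
  auth_param basic realm Use of this network is monitored - submitting credentials implies consent
  auth_param basic credentialsttl 8 hours

  acl ldap_users proxy_auth REQUIRED
  http_access allow ldap_users
  http_access deny all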

At one site where I did some work, the login process was modified so that
a separate pop-up window was used. If you clicked on the accept button,
the authentication window was displayed.

Merton Campbell Crockett

On Thu, 22 Jul 2004, Chris Perreault wrote:

> Here at my work all employees sign a form stating that they realize email,
> internet usage, file storage systems, etc are for work purposes and that
> none of it is truly private. That email will not be directly read except
> for troubleshooting/maintenance issues unless there are illegal or
> non-condoned company activities taking place.
>
> All traffic encrypted? Encryption puts a burden on the servers and if all
> the traffic is being encrypted, then what exactly *is* being monitored?
>
> Thoughts? Set your PC's gateway to your Apache server, visit www.google.com,
> get your disclaimer page to come up, sign in, and then have it make you end
> up at google. If this works, insert Squid afterwards. (one piece at a time,
> see what works, what doesn't)
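
One way to check that first hop in isolation, assuming the Apache box is
reachable as an HTTP proxy rather than only as the default route (the
hostname is a placeholder):

  # Ask for a real site via the Apache box and see whether the disclaimer
  # redirect (302 plus Location) comes back instead of Google's page.
  curl -s -D - -o /dev/null -x apache-box.example.edu:80 http://www.google.com/
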
>
> Chris Perreault
>
>
> -----Original Message-----
> From: Rick Whitley [mailto:rickwh@dbu.edu]
> Sent: Wednesday, July 21, 2004 5:37 PM
> To: squid-users@squid-cache.org; Chris Perreault
> Subject: RE: [squid-users] Re: Implementation issues
>
>
> The disclaimer page gives us the opportunity to inform the users that their
> traffic is being monitored. It's the law. I am open to suggestions as to a
> better way to accomplish this. I agree it's ugly. The disclaimer page would
> have a login link that went to the 2nd proxy. On top of everything, they
> want the traffic encrypted. I need to get a workable process working first.
> What if the first stop was a web server?
>
> client -> Apache -> Proxy -> internet
>              |
>           signon
>
> Here apache would trap the traffic unless authenticated (session variable?).
> We would have apache be the gateway.
>
> Thoughts?
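
A rough mod_rewrite sketch of that trap, purely illustrative -- the cookie
name, paths and hostname are invented, and the sign-on page itself would
still have to set the cookie and bounce the user back to the URL it was
handed:

  # Apache config on the gateway box: anything arriving without the
  # "accepted" cookie is bounced to the disclaimer/sign-on page, carrying
  # the originally requested URL along so the user can be sent back to it.
  RewriteEngine On
  RewriteCond %{HTTP_COOKIE} !aup_accepted=yes
  RewriteCond %{REQUEST_URI} !^/disclaimer
  RewriteRule ^(.*)$ http://gateway.example.edu/disclaimer.html?orig=http://%{HTTP_HOST}$1 [R=302,L]
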
>
> rick...
> Rom.5:8
>
> >>> Chris Perreault <Chris.Perreault@Wiremold.com> 7/21/2004 3:55:21 PM >>>
> First attempt:
>
> client -> Proxy1 -> sign-on/disclaimer -> Proxy2 -> ldap -> Internet
>
> User clicks on a page, sending another request, which then will do
> this:
>
> client -> Proxy1 -> sign-on/disclaimer -> Proxy2 -> ldap -> Internet
>
> If proxy1 sends all traffic to the signon page, then all traffic passes to
> it. All of it. Each request is coming from the browser, not the signon
> server, so each request hits proxy1 first and gets resent to the logon
> page. Ok... so if a session variable knows the user logged in, then the
> signon server can redirect the request for the website through proxy2. All
> requests would pass through proxy1 --> signon server --> proxy2. If that's
> acceptable, it looks like it would work. You'd want to redo the session
> variable on each hit, or else you could have the user relogging in every 20
> minutes or so. If they were doing anything on the web (filling in a form or
> taking a test) and didn't submit another request for a while, their session
> could time out and they sure wouldn't be happy. It looks doable, but it
> looks ugly too.
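
If the signon box ends up being Apache, as Rick later suggests, one way to
"redo the session variable on each hit" is to re-issue the cookie on every
request that passes through -- a hedged sketch, with the cookie name, domain
and 30-minute lifetime invented. It only helps while requests keep flowing,
so the form-filling case above would still need a longer lifetime:

  # mod_rewrite can refresh a cookie as a side effect: match every request,
  # change nothing ("-"), and re-set the session cookie for another 30 minutes.
  RewriteEngine On
  RewriteCond %{HTTP_COOKIE} signed_on=yes
  RewriteRule ^ - [CO=signed_on:yes:.example.edu:30]
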
>
> Once a user has signed up, you want them to get the disclaimer page every
> time they fire up a web browser?
>
> It would be much easier to have them just sign an "internet use policy" and
> in exchange tell them their username and password:)
>
>
>
> Chris Perreault
> Webmaster/MCSE
> The Wiremold Company
> West Hartford, CT 06010
> 860-233-6251 ext 3426
>
>
> -----Original Message-----
> From: Rick Whitley [mailto:rickwh@dbu.edu]
> Sent: Wednesday, July 21, 2004 3:44 PM
> To: squid-users@squid-cache.org; Chris Perreault
> Subject: RE: [squid-users] Re: Implementation issues
>
>
> What we are thinking of doing is:
>
> client -> Proxy -> sign-on/disclaimer -> Proxy -> ldap -> Internet
>
> The 1st proxy will be open and require no auth and redirect all traffic to
> the sign-on/disclaim site. User has option to Activate account or visit
> Internet. The "Visit Internet" link will go to the 2nd proxy, which
> will have proxy_auth enabled for ldap. The client will be prompted with a
> userid and passwd dialog to be authenticated and sent to the internet.
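
For the 1st proxy, Squid's deny_info directive can handle that redirect by
itself -- a rough sketch with an invented hostname; the 2nd proxy would then
carry the usual proxy_auth/LDAP setup:

  # squid.conf on the open 1st proxy: allow nothing except the sign-on site,
  # and answer every denied request with a redirect to the disclaimer page.
  acl signon_site dstdomain signon.example.edu
  http_access allow signon_site
  http_access deny all
  deny_info http://signon.example.edu/disclaimer.html all
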
>
> Does this seem possible? I have the gateway for the segment set up to be the
> 1st proxy, so I may still have a loop issue. Is there an automated way to
> modify the gateway on client systems? Like a forced dhcp without the
> request. If we could make this behave as two separate segments (seg1 = 1st
> proxy with no auth required, seg2 = 2nd proxy with auth required), the
> client would always start in seg1 and have to request seg2. Any
> thoughts or suggestions are greatly appreciated!!
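
On the gateway question: there is no standard way to push a new default
route to clients unsolicited; with ISC dhcpd the usual approach is to change
the option for the segment and wait for leases to renew (addresses below are
invented). Shortening the lease time ahead of the change speeds up the
switchover:

  # dhcpd.conf: hand out the 1st-proxy box as the default route for this
  # segment; clients pick it up when they renew their lease.
  subnet 10.1.0.0 netmask 255.255.0.0 {
      range 10.1.0.50 10.1.255.250;
      option routers 10.1.0.1;    # the 1st-proxy / gateway box
  }
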
>
> thanks
>
> rick...
> Rom.5:8
>
> >>> Chris Perreault <Chris.Perreault@Wiremold.com> 7/21/2004 2:02:23 PM >>>
> Doh, there is a referer_regex ACL type, but I don't see that helping here
> anyways.
>
> 1) Browser wants to visit website.com and hits the proxy.
> 2) Request gets redirected to the signup_disclaimer.htm website.
> 3) User signs up and/or logs in.
> 4) The page that verifies the username/password then redirects to the
>    originally requested site with the login information stored in the
>    correct header. (via the proxy)
> 5) The proxy sees the referring site and proxies the request.
>
> A better 5) would be squid sees it has an already authenticated request, so
> it passes it through. Otherwise the user would have to log in, on the
> disclaimer page, every time they clicked a link. Open a new window/session,
> though, and then the user would have to log in again.
>
> Hopefully my ramblings help Rick out. I see a loop happening though. If
> proxy_auth is required, the user gets the pop-up window, even if they don't
> have an account yet. Once they visit the login page, that page should be
> able to write/create the Proxy-Authorization header. I don't see how to
> redirect the user if they are not auth'd yet without also redirecting them
> when they are auth'd later, so it looks like they are stuck with the logon
> prompt and only get to the disclaimer/login page when they fail to
> authenticate.
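
For reference, the header in question looks like this on the wire (the
credentials are a made-up "rick:secret" pair, base64-encoded as Basic auth
requires):

  GET http://www.example.com/ HTTP/1.0
  Proxy-Authorization: Basic cmljazpzZWNyZXQ=

The catch is that a browser only sends it after the proxy has answered 407
Proxy Authentication Required; an ordinary web page can't inject it into the
browser's later requests, which is part of why this looks like a loop.
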
>
>
> Chris Perreault
>
>
> -----Original Message-----
> From: Chris Perreault
> Sent: Wednesday, July 21, 2004 10:15 AM
> To: squid-users@squid-cache.org
> Subject: RE: [squid-users] Re: Implementation issues
>
>
>
> We may turn into such a sponsor. In reverse proxy mode we are looking for a
> solution where we need to authenticate users as far out in the DMZ as
> possible. Using ldap via basic auth results in some difficulty in reaching
> some of the content on various back end webservers which also uses basic
> auth. Over the next few days we'll determine the best solution, which may be
> having users redirected to a logon webpage, using form-based authentication
> that saves the IP, then passing the username down in a header to the back
> end webservers, thus allowing some of those servers to match the header
> username with a username in a profile database for portal type content
> delivery while still allowing basic auth type apps to do their own
> authentication.
>
> If squid knew the referring IP address (the webserver that has the
> authenticating form on it, which after submission sends the browser back to
> Squid), then all users could be directed to the logon page, and after
> hitting submit go back to the proxy, which would allow that referring IP to
> get proxied info from the web. Spoof the IP though and you get through
> without authenticating. Figuring that IP out might be difficult for an end
> user though.
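
One crude way to approximate that, sketched here with made-up paths: keep
the "signed on" client addresses in a file that the logon application
appends to, and have Squid read it as a src ACL. As noted, anything keyed
purely on source IP is spoofable, and it also breaks behind NAT:

  # squid.conf: client addresses listed in the file are allowed out
  acl signed_on src "/etc/squid/signed_on_ips"
  http_access allow signed_on
  http_access deny all

  # after the logon app appends a new address, have Squid reread its config
  squid -k reconfigure
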
>
> Chris Perreault
>
> -----Original Message-----
> From: Adam Aube [mailto:aaube01@baker.edu]
> Sent: Tuesday, July 20, 2004 8:15 PM
> To: squid-users@squid-cache.org
> Subject: [squid-users] Re: Implementation issues
>
>
> Henrik Nordstrom wrote:
>
> > On Wed, 14 Jul 2004, Rick Whitley wrote:
> >
> >> I was setting up a proxy server to do the authentication and
> >> caching, but have learned from the list that it is not going to
> >> behave the way I expected. Users should only see the initial page
> >> once. I seem to be out in left field as to how to implement this. Any
> >> suggestions?
> >
> > In theory it is possible to implement very close to what you want with
> > Squid & authentication, but so far no one has been willing to actively
> > sponsor such development.
> >
> > It is also possible to use a small proxy.pac script implementing the
> > "startup splash" screen.
>
> Something like this might be a good starting point:
>
> http://www.squid-cache.org/mail-archive/squid-users/200404/0793.html
>
> Adam
>
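
For the proxy.pac angle Henrik mentions, the general shape is below -- just
a sketch with invented hostnames; the post linked above may wire the splash
logic differently:

  // proxy.pac: let the sign-on/splash host be fetched directly so its page
  // always loads, and push everything else through the authenticating proxy.
  function FindProxyForURL(url, host) {
      if (dnsDomainIs(host, "signon.example.edu"))
          return "DIRECT";
      return "PROXY proxy.example.edu:3128";
  }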
