[squid-users] Authentication problem upgrading from squid 2 to squid 3

From: Javier Smaldone <javier_at_smaldone.com.ar>
Date: Thu, 20 Dec 2012 21:56:29 -0300

I've been using squid 2.6.STABLE5 for a long time. Now I'm upgrading
to 3.1.19 (Ubuntu 12.04). On my previous setup I used ldap_auth
(with basic authentication), and after tuning my configuration I got
it working on squid3.

But now I have a problem with some (allowed) sites that load some
(forbidden) content (such as Twitter and Facebook JavaScript, for
example): when loading such a page, the user gets prompted (again)
for the login credentials.
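
To be clear about what happens: the allowed page itself embeds resources
hosted on the banned domains, so the browser sends separate requests for
them through the proxy. For instance, the infobae article in the log below
pulls in Twitter's widget script, which reaches squid as its own request:

GET http://platform.twitter.com/widgets.js HTTP/1.1
Referer: http://www.infobae.com/notas/687652-Cromanon-todos-los-condenados-seran-detenidos-inmediatamente.html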

I've raised the log level to 9 and found some differences in the log
for exactly the same request.

Please, take a look at my config and logfile and save my life!

Thanks in advance.

--
Javier
This is the relevant part of my squid.conf file:
auth_param basic program /usr/lib/squid3/squid_ldap_auth -R -b
"dc=mycompany,dc=com,dc=ar" -D
"cn=ldaplinux,ou=ati,dc=mycompany,dc=com,dc=ar" -W /etc/squid3/secret
-f "sAMAccountName=%s" -h ldapserver
auth_param basic children 5
auth_param basic credentialsttl 2 hours
auth_param basic realm Internet access
external_acl_type adsgroup %LOGIN  /usr/lib/squid3/squid_ldap_group
-b "dc=mycompany,dc=com,dc=ar" -D
"cn=ldaplinux,ou=ati,dc=mycompany,dc=com,dc=ar" -W /etc/squid3/secret
-f "(&(objectclass=person)(sAMAccountName=%v)(memberof=cn=%a,ou=internet,dc=mycompany,dc=com,dc=ar))"
-h ldapserver -v 3
http_access allow manager localhost
http_access deny manager
acl forbidden_ip src "/var/squid/acls/noips"
http_access deny forbidden_ip
acl users.privileged external adsgroup internet.privileged
http_access allow users.privileged
[...lots of acl and http_access rules...]
acl domains.banned.re dstdom_regex "/var/squid/acls/domains.banned.re"
http_access deny domains.banned.re
# domains.banned.re includes '\.twitter\'
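
For context, my understanding of how the group check works: squid hands the
authenticated login name (%LOGIN) to the squid_ldap_group helper together
with the ACL argument, so for a user "myuser" the helper receives a line like

myuser internet.privileged

and runs the LDAP search with %v replaced by the user and %a by the group:

(&(objectclass=person)(sAMAccountName=myuser)(memberof=cn=internet.privileged,ou=internet,dc=mycompany,dc=com,dc=ar))

As the second log excerpt below shows, this lookup does run once
credentials are present.
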
For the request "GET http://platform.twitter.com/widgets.js", the
first part of the log info is always the same (and it is the expected
behaviour):
| HttpMsg.cc(445) parseRequestFirstLine: parsing possible request:
GET http://platform.twitter.com/widgets.js HTTP/1.1
Host: platform.twitter.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:16.0) Gecko/20100101 Firefox/16.0
Accept: */*
Accept-Language: es-ar,es;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Proxy-Connection: keep-alive
Referer: http://www.infobae.com/notas/687652-Cromanon-todos-los-condenados-seran-detenidos-inmediatamente.html
Proxy-Authorization: Basic XXXXXXXXXXXXXXXXXXXXXXX
| Parser: retval 1: from 0->52: method 0->2; url 4->41; version 43->50 (1/1)
| parseHttpRequest: req_hdr = {Host: platform.twitter.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:16.0) Gecko/20100101 Firefox/16.0
Accept: */*
Accept-Language: es-ar,es;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Proxy-Connection: keep-alive
Referer: http://www.infobae.com/notas/687652-Cromanon-todos-los-condenados-seran-detenidos-inmediatamente.html
Proxy-Authorization: Basic XXXXXXXXXXXXXXXXXXXXXXX
}
| parseHttpRequest: end = {
}
[...]
| parsing HttpHeaderEntry: near 'Proxy-Authorization: Basic
XXXXXXXXXXXXXXXXXXXXXXX'
| parsed HttpHeaderEntry: 'Proxy-Authorization: Basic XXXXXXXXXXXXXXXXXXXXXXX'
| created HttpHeaderEntry 0x7f6b92f4d790: 'Proxy-Authorization : Basic
XXXXXXXXXXXXXXXXXXXXXXX
| 0x7f6b7dc43150 adding entry: 40 at 7
[...]
| ACLChecklist::preCheck: 0x7f6b80288658 checking 'http_access deny
forbidden_ip'
| ACLList::matches: checking forbidden_ip
| ACL::checklistMatches: checking 'forbidden_ip'
| aclIpMatchIp: '192.168.1.1:53563' NOT found
| ACL::ChecklistMatches: result for 'forbidden_ip' is 0
| ACLList::matches: result is false
| aclmatchAclList: 0x7f6b80288658 returning false (AND list entry
failed to match)
| aclmatchAclList: async=0 nodeMatched=0 async_in_progress=0
lastACLResult() = 0 finished() = 0
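
In case it helps reading the log: the masked Proxy-Authorization value is
just the base64 of "user:password"; for the placeholder credentials
myuser:mypass that appear further below it would read

Proxy-Authorization: Basic bXl1c2VyOm15cGFzcw==

so credentials are indeed being sent with this request.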
Now, the important part: checking user credentials (and group membership).
Despite the presence of the "Proxy-Authorization" header in the
request, the log shows:
| ACLChecklist::preCheck: 0x7f6b80288658 checking 'http_access allow
users.privileged'
| ACLList::matches: checking users.privileged
| ACL::checklistMatches: checking 'users.privileged'
| aclMatchExternal: acl="adsgroup"
| authenticateAuthenticate: broken auth or no proxy_auth header.
Requesting auth header.
| Acl.cc(70) AuthenticateAcl: returning 0 sending authentication challenge.
| aclMatchExternal: adsgroup user not authenticated (0)
| ACL::ChecklistMatches: result for 'users.privileged' is 0
| ACLList::matches: result is false
| aclmatchAclList: 0x7f6b80288658 returning false (AND list entry
failed to match)
| ACLChecklist::checkForAsync: requiring Proxy Auth header.
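
If I'm reading this correctly, at this point squid replies with a challenge
along the lines of

HTTP/1.1 407 Proxy Authentication Required
Proxy-Authenticate: Basic realm="Internet access"

even though the browser already sent a Proxy-Authorization header.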
As a result, the browser asks the user for credentials again. Once
they are entered, the log shows exactly the same for the first ACL
checks, but when checking "http_access allow users.privileged" it shows:
| ACLChecklist::preCheck: 0x7f6b80288658 checking 'http_access allow
users.privileged'
| ACLList::matches: checking users.privileged
| ACL::checklistMatches: checking 'users.privileged'
| aclMatchExternal: acl="adsgroup"
| authenticateAuthenticate: header Basic XXXXXXXXXXXXXXXXXXXXXXXX.
| authenticateAuthenticate: This is a new checklist test on FD:24
| authenticateAuthenticate: no connection authentication type
| AuthConfig::CreateAuthUser: header = 'Basic XXXXXXXXXXXXXXXXXXXXXXXX'
| AuthUserRequest::AuthUserRequest: initialised request 0x7f6b7dc08ef0
| AuthUser::AuthUser: Initialised auth_user '0x7fff72cb1870' with refcount '0'.
| basic/auth_basic.cc(412) decodeCleartext: 'myuser:mypass'
| basic/auth_basic.cc(364) authBasicAuthUserFindUsername: Looking for
user 'myuser'
| basic/auth_basic.cc(532) updateCached: Found user 'myuser' in the
user cache as '0x7f6b829543c0'
| authenticateAuthUserLock auth_user '0x7f6b829543c0'.
| authenticateAuthUserLock auth_user '0x7f6b829543c0' now at '4'.
| AuthUser::~AuthUser: Freeing auth_user '0x7fff72cb1870' with refcount '0'.
| aclCacheMatchFlush called for cache 0x7fff72cb18a0
| authenticateValidateUser: Validating Auth_user request '0x7f6b7dc08ef0'.
| authenticateValidateUser: Validated Auth_user request '0x7f6b7dc08ef0'.
| authenticateValidateUser: Validating Auth_user request '0x7f6b7dc08ef0'.
| authenticateValidateUser: Validated Auth_user request '0x7f6b7dc08ef0'.
| AuthUserRequest::lock: auth_user request '0x7f6b7dc08ef0 0->1
| AuthUserRequest::lock: auth_user request '0x7f6b7dc08ef0 1->2
| AuthUserRequest::unlock: auth_user request '0x7f6b7dc08ef0 2->1
| aclMatchExternal: adsgroup = 0
| ACL::ChecklistMatches: result for 'users.privileged' is 0
| ACLList::matches: result is false
| aclmatchAclList: 0x7f6b80288658 returning false (AND list entry
failed to match)
[...]
| The request GET http://platform.twitter.com/widgets.js is DENIED,
because it matched 'domains.banned.re'
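
The deny itself is expected: dstdom_regex checks the patterns in
domains.banned.re against the destination hostname, so the '\.twitter\'
entry mentioned above matches this request:

platform.twitter.com  ->  matches '\.twitter\'  ->  http_access deny domains.banned.re

What surprises me is the extra authentication prompt on the way to that deny.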
This is just one example; the problem occurs with various requests.
It's really annoying for users to have to enter their username/password
combination again and again.
--
Javier Smaldone
http://blog.smaldone.com.ar
PGP:
Keyserver: pgp.mit.edu
Key Id: 5FDB79C5
Fingerprint: 94BF 396A E5F3 13F4 5EF2  E371 41EB FD30 5FDB 79C5