Re: [squid-users] Strange problem accessing http://Bloomberg.com

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Wed, 8 Apr 2009 13:07:13 +1200 (NZST)

> Hello,
>
> I am having a very bizarre problem and I am wondering if anyone here can
> shed some light on it.
> Our internal users are accessing the Internet via a squid v2.6-STABLE9
> proxy using a proxy.pac file.
> Their browsers (corporate dictates Internet Explorer) are configured to
> "Automatically detect proxy settings".
>
> When I open the page http://bloomberg.com using the above settings, the
> page mostly loads but the browser locks up and needs to be killed.
>
> If I configure the browser to use a statically configured proxy and
> port, then the page loads fine.
>
> The
>

<elided long trace for brevity>

>
> At this point, the page has loaded and is working fine.
> Note the TCP_DENIED entry four lines from the bottom of the second test.
> This looks like a bad URL, probably due to some bad copy-pasting of code
> by the webmasters of Bloomberg.
> As you can see, the additional lines in the session that works are
> nothing special (except for that DENIED entry).
>
> Any idea as to what could be going on here?
> My gut tells me that the "fix" lies in the IE configuration but I also
> think there should be some kind of work-around possible in squid.

I'm inclined to suspect there is something in the .PAC file breaking under
that fubar URL.

Being JavaScript, the PAC file is susceptible to URLs containing quote
characters (' and ") in their strings, if the browser is broken enough to
pass them through unencoded (' in particular seems not to get encoded
reliably).
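To illustrate what a PAC file actually does with such a URL, here is a
minimal sketch (the proxy hostname and port are made up for the example):
the url argument arrives as an ordinary string, so a quote inside it is
just data as far as the function itself is concerned; any breakage would
come from how the PAC's own string handling, or something downstream,
processes it.

```javascript
// Minimal PAC sketch. FindProxyForURL is the standard PAC entry point;
// "proxy.example.local:3128" is a hypothetical proxy, not from the thread.
function FindProxyForURL(url, host) {
    // Match the bare domain or any subdomain of it.
    if (host === "bloomberg.com" || /\.bloomberg\.com$/.test(host)) {
        return "PROXY proxy.example.local:3128";
    }
    return "DIRECT";
}

// A single quote embedded in the URL passes through as plain data here:
var verdict = FindProxyForURL("http://bloomberg.com/page?q='oops",
                              "bloomberg.com");
```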

I've also seen Squid helpers which barf on mismatched quoting in similar
ways (usually via SQL-injection-style holes).
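The failure mode in such helpers is usually string concatenation: a query
or command is built by splicing the raw URL into a quoted string, so one
unencoded quote in the URL leaves the quoting unbalanced. A hedged sketch
of the contrast (the table and column names are invented for illustration):

```javascript
// Naive: splice the raw URL into a quoted SQL string. One stray quote
// in the URL leaves the statement with unbalanced quoting.
function naiveQuery(url) {
    return "SELECT allow FROM acl WHERE url = '" + url + "'";
}

// Safer: double any embedded single quotes before splicing (real helpers
// should prefer parameter binding, but escaping shows the difference).
function escapedQuery(url) {
    return "SELECT allow FROM acl WHERE url = '" + url.replace(/'/g, "''") + "'";
}

var badUrl = "http://example.com/x?q='--";
// naiveQuery(badUrl) contains an odd number of quotes (broken statement);
// escapedQuery(badUrl) keeps them balanced.
```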

Amos
Received on Wed Apr 08 2009 - 00:07:19 MDT

This archive was generated by hypermail 2.2.0 : Wed Apr 08 2009 - 12:00:02 MDT