Re: [squid-users] Can't play youtube while logged in

From: Eliezer Croitoru <eliezer_at_ngtech.co.il>
Date: Tue, 26 Jun 2012 21:18:21 +0300

the access.log can help a lot...
try to change the "tcp_outgoing.." directives.
try to ping the destination server.
try to download the file without the proxy, both from a local machine and
from the server.
since squid works for a lot of people all over the world, the problem is
most likely either in your squid setup or in your network infrastructure
setup.
try from the bottom up: dig/nslookup, ping, curl/wget, then the proxy with
basic acls, then the proxy with your acls.
also try without any wccp.
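The bottom-up checks above can be sketched as a short shell session. The hostname and proxy address below are placeholders for your own environment, not values from this thread:

```shell
# Bottom-up connectivity checks; each step narrows where the failure is.
# The hostname and proxy address are placeholders - adjust to your setup.
host=youtube.com
proxy=http://192.168.254.2:3080   # placeholder: your squid's listen address

dig +short "$host" || true                        # 1. DNS: does the name resolve?
ping -c 2 "$host" || true                         # 2. basic reachability
curl -sI --max-time 10 "http://$host/" || true    # 3. direct HTTP fetch, no proxy
curl -sI --max-time 10 -x "$proxy" "http://$host/" || true   # 4. same fetch through squid
```

If step 3 works but step 4 stalls, the problem is in the proxy path (squid config or WCCP redirection) rather than in DNS or general connectivity.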

Regards,
Eliezer

On 6/26/2012 1:16 PM, Benjamin Kingston wrote:
> I turned off XSS protection in my browsers (Chrome and Firefox) and it
> now appears to progress beyond the crossdomain.xml, but I still see
> that the player is timing out while trying to download the actual
> video file.
>
> I took a look at the network debugging in my browser and found that
> the "waiting for o-o.prefered.ord...lscache6.c.youtube.com" message is
> due to the video file that the flash player is requesting, and the "an
> error has occurred" appears in the player after the connection times
> out after 15 minutes (chrome). I do believe I'm seeing the
> o-o.prefered.ord...lscache6.c.youtube.com (not the full domain, just
> shortened it where the ... is) appear in the squid access.log.
>
>> ... it is also possible the browser is using non-HTTP connections when
>> you are logged in. Modern browsers use any of HTTP, HTTPS, SPDY and
>> WebSockets protocols to fetch web content these days.
>
> HTTPS isn't forwarded to squid, and I'd expect that SPDY and
> WebSockets are on different ports, so they shouldn't be forwarded
> either. Elements in youtube.com that don't load over HTTP do work when
> accessing https://youtube.com, but I still have the same issue since a
> lot of youtube content is still http.
>
> And by the way, I'm using Squid 3.1.10 on RHEL/CentOS
>
> Thanks for the pointers, I appreciate the help
>
> -ben
>
> On Sat, Jun 23, 2012 at 4:39 AM, Amos Jeffries <squid3_at_treenet.co.nz> wrote:
>> On 22/06/2012 7:01 a.m., Benjamin Kingston wrote:
>>>
>>> I've been having a very difficult time getting squid to play nicely with
>>> youtube while logged in. I always get "an error has occurred" unless I
>>> log out, then youtube works just fine. I've tried numerous
>>> configuration changes to try to get it to play, and I'm looking into
>>> icap to see if it may help, but I'm sure there's something simple I'm
>>> missing. I'm not really interested in caching dynamic content or even
>>> youtube right now, I just want to at least get squid making direct
>>> requests on behalf of my clients.
>>
>>
>> Until it is clear what the unmentioned "error" is and why it occurred,
>> looking for a solution or workaround is a waste of time.
>>
>>
>>
>>>
>>> The last message in access.log is referring to a crossdomain.xml, and
>>> then the requests stop.
>>
>>
>> Aha. Good hint there. crossdomain.xml is a list of domains and URL
>> security settings the browser uses to decide whether plugins (i.e. the
>> flash player) are allowed to make requests to them.
>>
>>  * If you or any of the proxies supplying your clients with traffic are
>> playing URL-rewrite games it will break connectivity. CORS security
>> protection is designed to prevent the client being shown one URL and the
>> server receiving another. Use an HTTP redirect instead and CORS is not a
>> problem - the client browser is aware of the URL change and adjusts its
>> CORS details to match.
>>
>>
>>
>>>  In my browser (which has caching turned off) I
>>> see "waiting for o-o.prefered.ord...lscache6.c.youtube.com". As I said
>>> earlier, if I log out of my google account the videos load just
>>> fine, and I have yet to find a site that has a similar issue. I
>>> currently have a rule to force all requests to be direct, but I still
>>> have the same issue.
>>
>>
>> The problem would then appear to be in Google systems. Not your proxy.
>>
>> However Squid only logs requests once they have finished. Does that URL
>> it was waiting for show up in the proxy logs if you abort the request in
>> the browser?
>>
>>
>>
>>>
>>> I'm using a WCCPv2 infrastructure, and this is for personal use.
>>
>>
>> ... it is also possible the browser is using non-HTTP connections when
>> you are logged in. Modern browsers use any of HTTP, HTTPS, SPDY and
>> WebSockets protocols to fetch web content these days.
>>
>>
>>> Sorry if this made it out to the list already, I wasn't sure if I was
>>> successfully subscribed when I sent it the first time.
>>>
>>>
>>> Below is my current config:
>>>
>>> # Recommended minimum configuration:
>>> #
>>> acl manager proto cache_object
>>> acl localhost src 127.0.0.1/32 ::1
>>> acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
>>>
>>> # Example rule allowing access from your local networks.
>>> # Adapt to list your (internal) IP networks from where browsing
>>> # should be allowed
>>> acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
>>> acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
>>> acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
>>> acl localnet src fc00::/7       # RFC 4193 local private network range
>>> acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
>>>
>>> tcp_outgoing_address 192.168.254.2
>>> udp_outgoing_address 192.168.254.2
>>>
>>> acl SSL_ports port 443
>>> acl Safe_ports port 80          # http
>>> acl Safe_ports port 21          # ftp
>>> acl Safe_ports port 443         # https
>>> acl Safe_ports port 70          # gopher
>>> acl Safe_ports port 210         # wais
>>> acl Safe_ports port 1025-65535  # unregistered ports
>>> acl Safe_ports port 280         # http-mgmt
>>> acl Safe_ports port 488         # gss-http
>>> acl Safe_ports port 591         # filemaker
>>> acl Safe_ports port 777         # multiling http
>>> acl CONNECT method CONNECT
>>>
>>> #
>>> # Recommended minimum Access Permission configuration:
>>> #
>>> # Only allow cachemgr access from localhost
>>> http_access allow manager localhost
>>> http_access deny manager
>>>
>>> # Deny requests to certain unsafe ports
>>> http_access deny !Safe_ports
>>>
>>> # Deny CONNECT to other than secure SSL ports
>>> http_access deny CONNECT !SSL_ports
>>>
>>> # We strongly recommend the following be uncommented to protect innocent
>>> # web applications running on the proxy server who think the only
>>> # one who can access services on "localhost" is a local user
>>> http_access deny to_localhost
>>>
>>> #
>>> # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
>>> #
>>> #include /etc/squid/acl.extended.conf
>>>
>>> acl direct dstdomain .
>>>
>>> always_direct allow direct
>>
>>
>> "always_direct allow all" is better. However, all this does is prevent
>> Squid sending the request through a cache_peer and forces Squid to pass
>> it to the DNS (DIRECT) web server for the domain.
>> You have no cache_peer configured, so it has no use in your config.
>>
>>
>>
>>>
>>>
>>> # Example rule allowing access from your local networks.
>>> # Adapt localnet in the ACL section to list your (internal) IP networks
>>> # from where browsing should be allowed
>>> http_access allow localnet
>>> http_access allow localhost
>>>
>>> # And finally deny all other access to this proxy
>>> http_access deny all
>>>
>>> # Squid normally listens to port 3128
>>> http_port 192.168.254.2:3080 intercept
>>>
>>> # We recommend you to use at least the following line.
>>> hierarchy_stoplist cgi-bin ?
>>> strip_query_terms off
>>>
>>> # Uncomment and adjust the following to add a disk cache directory.
>>> cache_dir aufs /mnt/data/squid 9000 16 256
>>> cache_mem 256 MB
>>> maximum_object_size_in_memory 128 KB
>>>
>>> # Leave coredumps in the first cache dir
>>> coredump_dir /mnt/data/squid
>>>
>>> # WCCP Router IP
>>> wccp2_router 192.168.254.1
>>>
>>> # forwarding 1=gre 2=l2
>>> wccp2_forwarding_method 1
>>>
>>> # GRE return method gre|l2
>>> wccp2_return_method 1
>>>
>>> # Assignment method hash|mask
>>> wccp2_assignment_method hash
>>>
>>> # standard web cache, no auth
>>> wccp2_service dynamic 52
>>> wccp2_service_info 52 protocol=tcp priority=240 ports=80
>>>
>>> maximum_object_size 700 MB
>>> minimum_object_size 4 KB
>>>
>>> half_closed_clients off
>>> quick_abort_min 0 KB
>>> quick_abort_max 0 KB
>>> vary_ignore_expire on
>>> reload_into_ims on
>>> log_fqdn off
>>> memory_pools off
>>> cache_swap_low 98
>>> cache_swap_high 99
>>> max_filedescriptors 65536
>>> fqdncache_size 16384
>>> retry_on_error on
>>> offline_mode off
>>> pipeline_prefetch on
>>>
>>> # Add any of your own refresh_pattern entries above these.
>>> #include /etc/squid/refresh.extended.conf
>>> refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
>>>
>>> refresh_pattern .               0 20% 4320
>>
>>
>>

-- 
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer <at> ngtech.co.il
Received on Tue Jun 26 2012 - 18:18:28 MDT

This archive was generated by hypermail 2.2.0 : Wed Jun 27 2012 - 12:00:04 MDT