Re: [squid-users] Set up squid as a transparent proxy

From: Israel Brewster <israel_at_ravnalaska.net>
Date: Thu, 24 Jul 2014 14:41:07 -0800

On Jul 24, 2014, at 2:15 PM, Israel Brewster <israel_at_ravnalaska.net> wrote:

> I have been using Squid 2.9 on OpenBSD 5.0 for a while as a transparent proxy. PF on the proxy box uses rdr-to to redirect all web requests not destined for the box itself to squid running on port 3128. Squid then processes each request against a series of ACLs and either allows it or redirects it (deny_info ... all) to a page on the proxy box.
>
> I am now working on upgrading the system to OpenBSD 5.5 and Squid 3.4.2. Presumably this will require some config changes, so I'm starting off with a basic config, based on the example installed with OpenBSD but with most of the rules stripped out.
>
> acl authorized_hosts dstdomain .google.com
> acl authorized_hosts dstdomain .wunderground.com
> acl authorized_hosts dstdomain .noaa.gov
>
> http_access allow authorized_hosts
> http_access deny all
>
> http_port 3128 transparent
> http_port 3129 # to avoid errors on startup
>
> coredump_dir /var/squid/cache
>
> refresh_pattern ^ftp: 1440 20% 10080
> refresh_pattern ^gopher: 1440 0% 1440
> refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
> refresh_pattern . 0 20% 4320
>
> deny_info http://192.168.10.1/login.py all
>
> What I would expect to happen is that attempts to access google, wunderground, or noaa.gov would work (load the requested page), while all other requests would get my custom block page. What is actually happening is that requests for pages in the authorized_hosts acl get the generic squid "Access Denied" page. Other requests do get my custom page, as expected.
>
> So what did I miss? Why can't I even get basic pages to go through? And why, if squid is denying access, do I get the generic "Access Denied" page rather than the custom one I specified?
>
> At least I know my requests are being forwarded to squid by PF properly. Or so I assume, since I am getting the squid "Access Denied" page.

One additional piece of information I just discovered: when running squid in debug mode and making a request to an "allowed" page, it shows the following (for example):

2014/07/24 14:36:07| WARNING: Forwarding loop detected for:
GET / HTTP/1.1
Host: nws.noaa.gov
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:20.0) Gecko/20100101 Firefox/20.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Via: 1.1 hom-rtr-proxy.ravnalaska.net (squid/3.4.2)
X-Forwarded-For: 192.168.10.51
Cache-Control: max-age=259200
Connection: keep-alive
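
If I'm reading that right, the Via header shows this request has already passed through this same squid instance once, which fits a loop. To try to see where the looping connection is coming from, I'm going to watch PF's state table and the inside interface while loading one of the allowed pages (these are just the checks that occur to me, not anything from the squid docs, and em0 below is a stand-in for whatever $insideIF actually is):

pfctl -ss | grep 3128              # states currently redirected to squid
tcpdump -n -i em0 tcp port 80      # who is originating port-80 traffic on the inside interface

If squid's own outbound requests show up there as new states being sent back to 127.0.0.1:3128, that would confirm PF is looping them.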

This makes me think that maybe squid's own request for the page is being redirected back to squid somehow. I'm using the same line in PF that I use on the older, functional system, which is:

pass in quick on $insideIF inet proto tcp to !192.168.10.1 port www rdr-to 127.0.0.1 port 3128

So as I read it, this rule should only redirect incoming requests on the inside interface that are not destined for the machine itself. As such, any requests that squid itself makes should go through untouched.
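
In case it's relevant, one thing I may try (purely a guess on my part, based on examples I've seen for intercepting with Squid 3.x on OpenBSD, so the exact syntax may be off) is switching the rule from rdr-to to divert-to and using the 'intercept' spelling that 3.x seems to prefer over 'transparent':

http_port 3128 intercept
pass in quick on $insideIF inet proto tcp to !192.168.10.1 port www divert-to 127.0.0.1 port 3128

I'd still like to understand why the current rdr-to setup is looping, though, since the equivalent rule works fine on the old box.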


-----------------------------------------------
Israel Brewster
Systems Analyst II
Ravn Alaska
5245 Airport Industrial Rd
Fairbanks, AK 99709
(907) 450-7293
-----------------------------------------------