RE: [squid-users] Problems Using squid 2.6 as a transparent web cache

From: Donoso Gabilondo, Daniel <donoso.d_at_ikusi.es>
Date: Thu, 12 Jun 2008 13:00:02 +0200

Hello again,
Thank you very much for your help.

> I suspect you are trying to do some sort of web mashup involving Squid?
> I've found the best way to do these is to have Squid as the public
> domain gateway and do the app-linking/routing in the squid config.

I want to use Squid to cache all the resources the Linux application needs, and to download them again only if they have been modified.
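
If I have understood the documentation correctly, refresh_pattern controls when Squid considers an object stale and revalidates it. A sketch of the behaviour I am after (the pattern and times here are only an example, not what I have configured):

  # revalidate media objects instead of re-downloading them every time
  refresh_pattern -i \.(mpg|jpg|png|gif)$ 1440 80% 10080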

I have made the changes you indicated.
I am testing with Firefox, because I can't test with the Linux application at the moment. I set Squid as the proxy, but it always downloads the resource.

I saw that the store.log file is being updated with the requested resources. This is the file content:

1213266172.237 RELEASE 00 0000000F EAEEC8FE1A6E2D8434959FA6301A18A0 200 1213266171 1194446956 -1 video/mpeg 6250477/386763 GET http://192.168.240.158:8080/test/video.mpg
1213266174.770 RELEASE 00 00000010 197E8B6BA5687EDF00E293B32088D2E7 200 1213266174 1194446956 -1 video/mpeg 6250477/251763 GET http://192.168.240.158:8080/test/video.mpg

I set maximum_object_size 300000 KB because video.mpg is larger than 8 MB (10 MB exactly), but I also tried requesting small resources (images) and the results are the same.
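
To double-check, I also looked at the response headers with curl (assuming curl is available; Squid adds an X-Cache header showing HIT or MISS):

  # the second identical request should report a HIT if the object was cached
  curl -s -o /dev/null -D - -x 192.168.240.23:3128 http://192.168.240.158:8080/test/video.mpg | grep -i X-Cache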

I read the Squid configuration and by default Squid allows everything to be cached.

What am I doing wrong?

Thank you again for your help.

Daniel

 

-----Original Message-----
From: Amos Jeffries [mailto:squid3_at_treenet.co.nz]
Sent: Wednesday, June 11, 2008 15:11
To: Donoso Gabilondo, Daniel
CC: squid-users_at_squid-cache.org
Subject: Re: [squid-users] Problems Using squid 2.6 as a transparent web cache

Donoso Gabilondo, Daniel wrote:
> Hello,
> I have an application on Linux that uses HTTP resources (videos,
> images...). These resources are on another machine running an HTTP
> server (under Windows).
>
> The Linux application always downloads the resources. I installed and
> configured Squid on the Linux machine to cache these resources, but the
> Linux application always downloads them from the HTTP server. I don't
> know how to resolve the problem. I need some help, please.

I suspect you are trying to do some sort of web mashup involving Squid?
I've found the best way to do these is to have Squid as the public
domain gateway and do the app-linking/routing in the squid config.

Anyway, on to your various problems...

>
> The linux ip address is: 192.168.240.23 and the windows with http server
> ip is: 192.168.233.158
>
> This is my squid.conf file content:
>
> acl all src 0.0.0.0/0.0.0.0
> acl manager proto cache_object
> acl localhost src 127.0.0.1/255.255.255.255
> acl to_localhost dst 127.0.0.0/8
> acl SSL_ports port 443
> acl Safe_ports port 80 # http
> acl Safe_ports port 21 # ftp
> acl Safe_ports port 443 # https
> acl Safe_ports port 70 # gopher
> acl Safe_ports port 210 # wais
> acl Safe_ports port 1025-65535 # unregistered ports
> acl Safe_ports port 280 # http-mgmt
> acl Safe_ports port 488 # gss-http
> acl Safe_ports port 591 # filemaker
> acl Safe_ports port 777 # multiling http
> acl CONNECT method CONNECT
> http_access allow manager localhost
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access allow localhost
> http_access deny all

So none of the clients are allowed to make requests?
I'd expect to see a control giving the intercepted network access
through, for example:
  acl localnet src 192.168.0.0/16
  http_access deny !localnet

and drop the "deny all" down a bit....
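
The resulting order might look something like this (a sketch only; adjust the localnet range to your real subnets):

   acl localnet src 192.168.0.0/16
   http_access allow manager localhost
   http_access deny manager
   http_access deny !Safe_ports
   http_access deny CONNECT !SSL_ports
   http_access allow localhost
   http_access deny !localnet
   http_access allow localnet
   http_access deny all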

> icp_access allow all

Allow all with no ICP port configured? Looks like you can kill this.

> hierarchy_stoplist cgi-bin ?
> access_log /var/log/squid/access.log squid
> acl QUERY urlpath_regex cgi-bin \?
> cache deny QUERY
> refresh_pattern ^ftp: 1440 20% 10080
> refresh_pattern ^gopher: 1440 0% 1440
> refresh_pattern . 0 20% 4320
> acl apache rep_header Server ^Apache
> broken_vary_encoding allow apache
> coredump_dir /var/spool/squid
> cache_dir ufs /var/spool/squid 700 32 512
> http_port 3128 transparent
> icp_port 0

> cache_peer localhost.home.nl parent 8080 0 default
> acl HOME dstdomain .home.nl

> always_direct allow all
> never_direct allow all

Those lines contradict each other: 'everything MUST go direct' + 'nothing
is EVER allowed to go direct'.

You want just:
   never_direct allow HOME
   never_direct deny all
   cache_peer_access localhost.home.nl allow HOME
   cache_peer_access localhost.home.nl deny all
   http_access allow HOME

  ... the "deny all" I mentioned dropping down goes about here, AFTER the
peer access config.
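
Putting those pieces together with the names already in your config (a sketch of the relevant fragment, not the whole file):

   acl HOME dstdomain .home.nl
   cache_peer localhost.home.nl parent 8080 0 default
   cache_peer_access localhost.home.nl allow HOME
   cache_peer_access localhost.home.nl deny all
   never_direct allow HOME
   never_direct deny all
   http_access allow HOME
   http_access deny all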

>
>
> I executed these commands:
>
> iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j DNAT --to 192.168.240.23:3128
> iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128

Okay so far. What about intercepting the requests clients make directly
to your web app?
  Since the app knows it's running on port 8080 it will advertise that
port in its URLs, and since the 'clients' do not know about Squid they
will not ask for those objects over port 80.
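
One way to catch those as well (a sketch, assuming the clients' traffic to the app passes through this box on eth0 and Squid still listens on 3128):

   iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 8080 -j REDIRECT --to-port 3128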

>
>
> The cache.log content is this:
>
> 2008/06/11 11:30:52| Starting Squid Cache version 2.6.STABLE19 for
> i386-redhat-linux-gnu...
> 2008/06/11 11:30:52| Process ID 8617
> 2008/06/11 11:30:52| With 1024 file descriptors available
> 2008/06/11 11:30:52| Using epoll for the IO loop
> 2008/06/11 11:30:52| ipcacheAddEntryFromHosts: Bad IP address 'tele1'
> 2008/06/11 11:30:52| ipcacheAddEntryFromHosts: Bad IP address 'svc1'

Your hosts file has corrupt content.
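
The "Bad IP address" messages mean Squid found a hostname where it expected an address at the start of a line. Each /etc/hosts line should begin with an IP (the addresses below are only placeholders):

   192.168.240.23   tele1
   192.168.233.158  svc1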

Apart from all that, squid looks to be running fine.

Amos

-- 
Please use Squid 2.7.STABLE1 or 3.0.STABLE6