Re: [squid-users] replacing Apache httpd

From: Daniel Plappert <plappert_at_denkformat.de>
Date: Wed, 6 Apr 2011 12:26:30 +0200

Hi Amos,

thank you very much for your help!
I removed the "forceddomain" option and it almost worked. My only problem now is using subdomains. I've 2 subdomains:

sub1.example.com
sub2.example.com

and I want to delegate them to different servers. The problem is that Squid does not recognize them. If I remove the defaultsite=example.com option, I get the following error:

The following error was encountered while trying to retrieve the URL: /
Invalid URL
Some aspect of the requested URL is incorrect.
...

when calling sub1.example.com or sub2.example.com. If I add the defaultsite=example.com option, all subdomain requests are delegated to example.com (because of the defaultsite option).
Here is what I added to my conf file:

cache_peer sub1 parent 80 0 no-query originserver name=sub1
acl sub1_domain dstdomain sub1.example.com
http_access allow sub1_domain
...
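
For reference, here is a minimal sketch of the complete two-peer setup I am aiming for (the internal hostnames sub1/sub2 and port 80 are placeholders, and whether defaultsite= should stay on the http_port line is exactly my question):

http_port 80 accel vhost
# first internal server
cache_peer sub1 parent 80 0 no-query originserver name=sub1
acl sub1_domain dstdomain sub1.example.com
cache_peer_access sub1 allow sub1_domain
http_access allow sub1_domain
# second internal server
cache_peer sub2 parent 80 0 no-query originserver name=sub2
acl sub2_domain dstdomain sub2.example.com
cache_peer_access sub2 allow sub2_domain
http_access allow sub2_domain
http_access deny all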

Besides, if I remove the defaultsite option, I always get the error mentioned above. Could you let me know what I did wrong?

Regards,
Daniel

Am 03.04.2011 um 09:39 schrieb Amos Jeffries:

> On 02/04/11 02:22, Daniel Plappert wrote:
>> Hi all,
>>
>> I am new to squid, so I hope you don't feel offended if this is a beginner's question. ;-) I am trying to replace an Apache httpd server, which works as a delegating proxy. Let me briefly explain the scenario:
>>
>> internet -> Apache httpd delegator -> server[1-3]
>>
>> Because we have just one IP address to the outside, the httpd delegator forwards each request, according to its URL, to one of the internal servers, e.g. wiki.example.com is forwarded to server1 and dms.example.com to server2. This is done with virtual hosts and rewrite rules, e.g. for server1:
>>
>> RewriteRule ^(.*)$ http://wiki/$1 [L,P]
>>
>> As you can see here, the request is delegated to an internal server called wiki.
>>
>> What I am trying to do now is to replace the Apache httpd delegator with Squid. What I've done so far is to configure Squid as an accelerator and declare the corresponding peers:
>>
>> acl wiki_sites dstdomain wiki.example.com
>> http_port 80 accel defaultsite=example.com vhost
>> http_access allow wiki_sites
>
> So far, so good.
>
> Note:
> by using "defaultsite=example.com" this makes the 'broken' clients which do no send hostname properly use "example.com", which does not match your domain ACL "wiki.example.com".
>
> Result: clients which do not send "wiki.example.com" properly as the virtual domain name will not get to the wiki server.
>
> Whether this is a good behaviour is up to you. Just be aware of it.
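>
> For example (just an illustration of that case, assuming an old HTTP/1.0 client that sends no Host header at all), a request arriving as:
>
> GET / HTTP/1.0
>
> is reconstructed by Squid as "http://example.com/" because of defaultsite=, so it never matches the "wiki.example.com" ACL.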
>
>> cache_peer wiki parent 80 0 no-query originserver forcedomain=wiki name=wiki
>
> Mostly good.
>
> Use "forcedomain=" only if the peer is sightly broken and requires all traffic to arrive with that value as its public domain/host name.
>
> Squid prefers to pass the public FQDN (in this case "wiki.example.com") on to the peer so that the peer can easily and properly generate public redirects, cookies, page content URLs, etc.
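>
> As a minimal sketch (assuming the wiki peer is not broken in that way and is still reachable under the internal hostname "wiki"), the peer line without the forced domain would simply be:
>
> cache_peer wiki parent 80 0 no-query originserver name=wiki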
>
>
>> forwarded_for on
>
> "forwarded_for" is not strictly relevant, but fine.
>
>> cache_peer_access wiki allow wiki_sites
>
> Okay good.
>
>>
>> Forwarding the request works as expected, but there is one problem: server1 (the (t)wiki server) now adds a wrong base URL in the HTML header:
>>
>> <base href="http://wiki" />
>
> Bingo. The wiki server is using what it sees as the public host/domain name (the Host: header) to generate URLs. See above.
>
>>
>> This doesn't happen with the Apache delegator.
>
> Apache is sending rather broken headers to the wiki server.
> They look like this:
>
> GET http://wiki/foo.html HTTP/1.1
> Host: wiki.example.com
> ...
>
>
> Whereas Squid is sending proper HTTP headers based on the URL (as altered by forcedomain):
>
> GET /foo.html HTTP/1.1
> Host: wiki
>
>
>>
>> So, finally, my question: how is it possible to configure Squid so that the base URL is as it was before: <base href="http://wiki.example.com" />? I need the URL from the outside (internet), not the internal one (intranet).
>>
>
> With Squid you will get the same URLs publicly and internally. So traffic will hopefully all go through Squid, where you can centralize a set of ACLs for internal/external access if that actually matters.
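>
> A minimal sketch of what such centralized rules could look like (the 192.168.0.0/16 range and the internal-only host "internal.example.com" are assumed examples, not taken from your setup):
>
> # internal clients, by source address
> acl intranet src 192.168.0.0/16
> # sites published to everyone
> acl public_sites dstdomain wiki.example.com dms.example.com
> # a hypothetical intranet-only site
> acl intranet_only dstdomain internal.example.com
> http_access allow intranet_only intranet
> http_access deny intranet_only
> http_access allow public_sites
> http_access deny all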
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE9 or 3.1.11
> Beta testers wanted for 3.2.0.5
Received on Wed Apr 06 2011 - 10:26:41 MDT

This archive was generated by hypermail 2.2.0 : Wed Apr 06 2011 - 12:00:03 MDT