Re: [squid-users] Problem accessing site

From: Jacob S. <j-schroeder@dont-contact.us>
Date: Thu, 13 Mar 2003 23:41:46 -0600

On Thu, 13 Mar 2003 17:15:59 -0600
Jacob S. <j-schroeder@myrealbox.com> wrote:

> I'm currently using an acl to only allow access to a list of sites in
> an"unblock.txt" file and it's working great for most sites, but I've
> hit a snag. (I know, it's kind of a cumbersome process only allowing
> access to a select few sites, but that's what I'm needing it to do
> right now.)
>
> When I try to access http://www.joker.com I get an access denied error
> like the following:
>
> While trying to retrieve the URL: joker.com:443
>
> The following error was encountered:
>
> * Access Denied.

<snip - more details>

Thanks to Simon for his help off-list; it's working now. I had to put
the URL in as "joker.com" instead of (or in addition to) ".joker.com".
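For reference, the relevant part of my setup looks roughly like this
(the acl name and file path below are just examples, not copied
verbatim from my squid.conf):

    # Allow only the domains listed in unblock.txt, one per line
    acl unblock dstdomain "/etc/squid/unblock.txt"
    http_access allow unblock
    http_access deny all

and the unblock file now carries both forms:

    .joker.com
    joker.com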

The strange part is that after playing around a little, I noticed I can
access URLs such as http://6texans.net if I have the line ".6texans.net"
in my unblock file, but I am not able to access https://6texans.net. To
access an HTTPS site with a URL of the form https://something.tld
(minus the "www"), I can't have a "." in front of it in my unblock
file.
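In other words, to cover both the HTTP and HTTPS cases I end up with
something like this in the unblock file (same dstdomain acl as above):

    .6texans.net
    6texans.net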

It seems a little inconsistent that it works one way for HTTP and
another way for HTTPS, doesn't it? Or am I just misunderstanding how
Squid works?

Thanks,
Jacob

-----
GnuPG Key: 1024D/16377135

In a world without fences, who needs Gates?
http://www.linux.org/