[squid-users] Squid 3.4.6 is not caching anything

From: <liam_at_kzz.se>
Date: Mon, 30 Jun 2014 14:05:58 +1200

I have tried deleting the cache and setting its size to 10GB. I ran squid -z
again, and it created the directories before it froze. The maximum object
size is set to 5GB, and I have checked some sites using redbot.org to see
whether they can be cached. It says they can, and I have had the Squid proxy
running for about 48 hours now with about 50 clients connected. I have
scanned the access.log and there is not a single hit, even when the same
page is requested many times.
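
For reference, the relevant lines in my squid.conf look something like this
(the cache directory path here is just an example; my full config is in the
pastebin linked in my earlier message):

  # 10GB on-disk cache (the size argument is in MB)
  cache_dir ufs /var/spool/squid 10240 16 256

  # allow objects up to 5GB to be cached
  maximum_object_size 5 GB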

Are there some settings that I am missing in squid.conf that are stopping
the cache from working? And do you know where I can obtain an already
compiled x86 package for Debian 7 with --enable-ssl and --enable-ssl-crtd?
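
If no such package exists, I assume I would have to build it myself. From
the documentation, I believe the build would look roughly like this
(untested on my side):

  # build prerequisites on Debian 7 (package names may vary)
  apt-get install build-essential libssl-dev

  # in the unpacked squid-3.4.6 source tree
  ./configure --enable-ssl --enable-ssl-crtd
  make
  make install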

Thanks for your help so far.

2014-06-30 00:46: Eliezer Croitoru wrote:
> Well, it's a bit hard to say that Squid is not caching anything from
> this alone.
>
> It depends on what you expect, allow, and configure it to cache.
> Since Squid uses a Store-ID to identify content, you may think that two
> requests for the same "file", so to speak, will be treated as identical,
> but in many cases they are not, because the user or the client adds a
> couple of extra bits to the URL.
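>
> For example, since Squid 3.4 such URLs can be normalized with a
> Store-ID helper; a minimal sketch (the helper path and its rules are
> hypothetical) would be wired into squid.conf like this:
>
>   # helper that maps equivalent URLs to a single internal store ID
>   store_id_program /usr/local/bin/storeid_helper
>   store_id_children 5 startup=1
>
> The helper reads one URL per line on stdin and answers
> "OK store-id=<normalized-url>" (or "ERR" to leave the URL unchanged),
> so two requests that differ only in extra query parameters can be
> stored as one object.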
>
> There is also the possibility that you haven't configured Squid to
> cache large files.
> Start by giving the cache_dir a size of more than 100MB (unless the
> current size suits you), and also take a look at (and possibly change):
> http://www.squid-cache.org/Doc/config/maximum_object_size/
>
> and the other object-size-related config directives that you can see
> here:
> http://www.squid-cache.org/Doc/config/
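>
> For example (the values here are only placeholders to adjust to your
> hardware):
>
>   # memory cache size and the largest object kept in RAM
>   cache_mem 256 MB
>   maximum_object_size_in_memory 512 KB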
>
> Note that it is possible you do not yet fully grasp the complexity of
> caching; if you want, I can point you to a couple of articles or config
> directives that can help you understand more about it.
>
> I know of a nice site you can try, which caches well:
> http://www.djmaza.info/
>
> I do not know of many examples of nice, cacheable websites these days,
> given how much dynamic content and how many SSL-encrypted sites there
> are out on the web.
>
> If you have a specific site you want to analyze, you can try redbot at:
> https://redbot.org/
>
> You can enter the URL and the interface will verify whether the
> site/URL is cacheable.
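>
> A rough command-line approximation of that check (redbot's analysis
> goes much deeper) is to look at the cache-related response headers,
> for example:
>
>   curl -sI http://www.example.com/ | \
>     grep -i -E 'cache-control|expires|etag|last-modified|vary'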
>
> Eliezer
>
> On 06/29/2014 09:54 AM, liam_at_kzz.se wrote:
>> I have removed all comments from my config file:
>> http://pastebin.com/kqvNszyp
>>
>> And here is a short excerpt from my access.log - I don't think it will
>> be too helpful, though. I have removed some IP addresses and URLs.
>>
>> http://pastebin.com/bZiZ3tUN
>>
>> Note that I do get some TCP_MEM_HIT/200 sometimes.
>>
>> I have tried using Firefox 29, Internet Explorer 11 and the latest
>> version of Chrome for Debian 7 stable.
Received on Mon Jun 30 2014 - 02:06:12 MDT
