Re: [squid-users] Question on Squid internals

From: Matus UHLAR - fantomas <uhlar_at_fantomas.sk>
Date: Sun, 27 Dec 2009 19:23:17 +0100

On 23.12.09 12:47, Manjusha Maddala wrote:
> I'm totally puzzled as to how the LRU cache replacement policy has been
> implemented in Squid.
>
> Here's my situation.

[...]

> $ curl -H "User-Agent: some useragent" -H
> "Accept-Encoding: gzip,deflate" "http://some-url"
>
> Squid access.log -
>
> xxx.xxx.xxx.xxx - - [23/Dec/2009:11:03:31 -0800] "GET http://some-url
> HTTP/1.1" 200 19114 "-" "-" "-" "gzip,deflate"
> TCP_MISS/FIRST_UP_PARENT 446
>
> $ ll 001000FF
> ls: 001000FF: No such file or directory
> ............................................................
>
>
> A page that is just 3 days old encountered a cache miss, even though the
> cached page is physically present on the disk. Could somebody explain
> why this happened? The exact same URL was requested from the command
> line, and Squid removed its copy of the page after the command
> completed. If the copy had become stale, going by the refresh_pattern
> rules, a TCP_REFRESH_MISS should have been logged instead of a TCP_MISS.
> It's really odd that the copy on disk was deleted and a TCP_MISS
> was reported.

Did you check whether the page is cacheable?
Did you try fetching it with a "Cache-Control: only-if-cached" header?
It's quite possible that the page is not cacheable, so Squid won't store it,
or that the cached copy has expired...
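A minimal sketch of both checks with curl, assuming Squid is reachable as a
forward proxy on localhost:3128 (adjust the host/port, or drop -x if Squid
intercepts traffic or runs as an accelerator; "http://some-url" stands in
for the real URL):

  # Ask Squid to answer only from its cache. A cache that cannot satisfy
  # "only-if-cached" locally must respond with 504 Gateway Timeout, so a
  # 504 here means the object is not (or no longer) in the cache. Sending
  # the same Accept-Encoding as the original client matters in case the
  # object is stored with a Vary on that header.
  $ curl -s -o /dev/null -w "%{http_code}\n" \
        -x http://localhost:3128 \
        -H "Accept-Encoding: gzip,deflate" \
        -H "Cache-Control: only-if-cached" \
        "http://some-url"

  # Look at the origin server's response headers to judge cacheability:
  # Cache-Control: no-store/private/no-cache, a missing Expires,
  # Last-Modified and ETag, or "Vary: *" all make Squid unlikely to keep
  # or re-use a copy.
  $ curl -s -D - -o /dev/null "http://some-url"

If the only-if-cached request returns 504 right after a normal fetch, the
response is most likely not being stored at all, rather than being evicted
by the LRU replacement policy.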

-- 
Matus UHLAR - fantomas, uhlar@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
"They say when you play that M$ CD backward you can hear satanic messages."
"That's nothing. If you play it forward it will install Windows."