Re: [squid-users] Denying write access to the cache

From: Chris Robertson <crobertson@dont-contact.us>
Date: Fri, 23 Mar 2007 15:08:35 -0800

Guillaume Smet wrote:
> Hi all,
>
> We've been using Squid as a reverse proxy for a couple of years now.
> We're currently migrating to Squid 2.6 and we're really satisfied with
> all the enhancements in this version, especially the fact that we can
> now use a lot of features previously reserved for the proxy
> configuration.
>
> Our context is a bit special:
> - we have _a lot_ of traffic generated by robots (googlebot and so on);
> - the problem is that they visit a lot of pages on our site that we
> don't really want to keep in the cache (hardly anyone else visits them;
> for instance, the events of a very small city);
> - we want the robots to use the cache populated by the other users, so
> we can't simply deny the robots access to the cache.
>
> After this introduction, here is my question: is there any way to deny
> write access to the cache? I'd like to be able to say: all these user
> agents can read from the cache but must not write anything to it.
> I can't find anything that does this sort of thing.
>
> Thanks for any ideas.
>
> --
> Guillaume

Make an ACL that matches the URLs you don't want cached, and deny
caching based on that ACL.

acl CITY urlpath_regex city1 city2 city3
cache deny CITY

If you still want these infrequently visited URLs cached when regular
people visit them, add an ACL that matches the user-agent of the bots
and combine the two. All ACLs listed on a single cache line must match
for the rule to apply, so only bot requests for those URLs are kept out
of the cache; the bots can still read whatever regular visitors have
already stored.

acl CITY urlpath_regex city1 city2 city3
acl GOOGLEBOT browser -i googlebot
cache deny CITY GOOGLEBOT
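
For a more concrete sketch, assuming hypothetical URL paths and bot
names (the real patterns on Guillaume's site are unknown), it might
look something like this:

# Paths of the low-traffic pages; urlpath_regex matches the URL path part.
acl CITY urlpath_regex -i ^/events/smallcity ^/events/tinytown
# User-agent patterns of the crawlers that should not populate the cache.
acl ROBOTS browser -i googlebot msnbot slurp
# ACLs on one access line are ANDed: storing the reply is denied only
# when a listed robot requests one of the listed paths. The robots still
# get cache hits for objects that regular visitors already stored.
cache deny CITY ROBOTS
# Everything else is cached as usual.
cache allow all

The order of the ACL names on the deny line doesn't matter; what matters
is that all of them match before the rule applies.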

Chris
Received on Fri Mar 23 2007 - 17:08:41 MDT
