Re: [squid-users] Cache settings per User Agent?

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Wed, 8 Oct 2008 14:07:52 +1300 (NZDT)

> Hello,
>
> On Tue, Oct 7, 2008 at 8:11 PM, Henrik Nordstrom
> <henrik_at_henriknordstrom.net> wrote:
>> Best done by the origin server using the Vary header and Cache-Control:
>> max-age..
>>
>
> It can't, since that would make my Squid cache the page for normal
> users too. It should not be cached for normal requests. Robots don't
> need the most up-to-date result, and don't need personalized content etc.

Robots DO need the latest version of your page. How can people find your
new content if it's not indexed on search engines?

True about personalized content, but that's done by the web server,
right? So it can send a generic page with the correct Vary, ETag and
Cache-Control headers to robots and all other unknown agents.
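For example, the response handed to a robot or any other unknown agent
could carry headers along these lines (the max-age and ETag values are
only illustrative, not taken from your setup):

  HTTP/1.1 200 OK
  Content-Type: text/html
  Cache-Control: public, max-age=3600
  ETag: "generic-page-v1"
  Vary: User-Agent

Keep in mind that Vary: User-Agent makes Squid keep one variant per
distinct agent string, so it fragments the cache given how many different
strings are out there.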

>
> I only want to cache if, and only if, the UA is a robot. Squid will
> then answer the request from its cache and the robots will not hit my
> backend.
>
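A rough squid.conf sketch of that idea, assuming a Squid where the
'browser' ACL type matches the User-Agent header and the 'cache'
directive controls what may be stored (the agent patterns below are only
examples):

  # match common crawler User-Agent strings (patterns are illustrative)
  acl robots browser -i (googlebot|slurp|msnbot|yandex)
  # allow caching only for requests from those agents
  cache allow robots
  cache deny all

If I recall the semantics correctly, requests denied by 'cache' are
neither served from nor stored in the cache, so everything except the
listed agents would still go straight to your backend.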

Any idea how many robots there are on the web? I've found it FAR better
on bandwidth and processing to have a generic default version of a page
that unknown agents get, and to personalize only for known agents that
can be personalized accurately.
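One way the origin can express that split is with different caching
headers on the two kinds of responses (the values are only illustrative):

  Generic default page (unknown agents):
    Cache-Control: public, max-age=600

  Personalized page (known, authenticated user):
    Cache-Control: private, no-cache

Squid then stores and re-serves the generic version, while the private
one is never held by the shared cache.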

Amos
Received on Wed Oct 08 2008 - 01:07:56 MDT
