Re: [squid-users] Duplicate files, content distribution networks

From: Jack Bates <nwv96b_at_nottheoilrig.com>
Date: Sun, 24 Jun 2012 01:02:36 -0700

On 14/06/12 03:33 AM, Amos Jeffries wrote:
> On 14/06/2012 8:53 p.m., Jack Bates wrote:
>> Another idea is to exploit RFC 3230, Instance Digests in HTTP. Given a
>> response with a "Location: ..." header and a "Digest: ..." header, if
>> the "Location: ..." URL isn't already cached, then the proxy checks the
>> cache for content with a matching digest and, if a match is found,
>> rewrites the "Location: ..." header to point at the cached URL.
>>
>> I am working on a proof-of-concept plugin for Apache Traffic Server as
>> part of the Google Summer of Code. The code is up on GitHub [2].
>>
>> If this is a reasonable approach, would it be difficult to build
>> something similar for Squid?
>
> Please contact Alex Rousskov at measurement-factory.com; he was
> organising a project to develop Digest handling and de-duplication
> a while back.

Thank you, Amos, for this info; I will definitely contact Alex Rousskov.
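
For anyone following along, here is a rough sketch in Python of the
lookup-and-rewrite step described in the first quoted paragraph above.
The cache interface (find_by_url, find_url_by_digest) is hypothetical
shorthand for whatever the real proxy exposes; it is not Squid's or
Traffic Server's actual API:

    def rewrite_location(headers, cache):
        # headers: dict mapping response header names to values
        # cache: hypothetical object with find_by_url(url) -> bool
        #        and find_url_by_digest(algorithm, value) -> url or None
        location = headers.get("Location")
        digest = headers.get("Digest")  # RFC 3230, e.g. "SHA-256=X48E9qOokqq..."
        if not location or not digest:
            return headers
        if cache.find_by_url(location):
            return headers  # target already cached; nothing to rewrite
        # RFC 3230: instance-digest = digest-algorithm "=" encoded-digest.
        # Take only the first digest, for brevity (the header may list several).
        algorithm, _, value = digest.split(",")[0].strip().partition("=")
        cached_url = cache.find_url_by_digest(algorithm.lower(), value)
        if cached_url:
            headers["Location"] = cached_url  # point at the cached duplicate
        return headers

In a real plugin this logic would run in the response-header hook, and
the proxy would want to verify the cached object's digest rather than
trust an index entry blindly, but the control flow is the same as
described above.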