Re: Caching huge files in chunks?

From: Alex Rousskov <rousskov_at_measurement-factory.com>
Date: Fri, 17 Sep 2010 11:39:19 -0600

On 09/16/2010 06:21 PM, Guy Bashkansky wrote:
> Here is the problem description; what solution might Squid or other
> cache tools provide?
>
> Some websites serve huge files, usually movies or binary distributions.
> Typically a client issues byte range requests, which are not cacheable
> as separate objects in Squid.
> Waiting for the whole file to be brought into the cache takes way too
> long, and is not granular enough for optimizations.
>
> A possible solution would be if Squid (or other tool/plugin) knew how
> to download huge files *in chunks*.
> Then the tool would cache these chunks and transform them into
> arbitrary ranges when serving client requests.
> There are some possible optimizations, like predictive chunk caching
> and cold-chunk eviction.
>
> Does anybody know how to put together such a solution based on existing tools?

Caching of partial responses is allowed by HTTP/1.1 but is not yet
supported by Squid. It is a complex feature which can, indeed, be a
useful optimization in some environments. For more information, please see

     http://wiki.squid-cache.org/Features/PartialResponsesCaching

     http://wiki.squid-cache.org/SquidFaq/AboutSquid#How_to_add_a_new_Squid_feature.2C_enhance.2C_of_fix_something.3F
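Purely for illustration, here is a minimal sketch (Python, not Squid code) of
the chunk arithmetic such a tool would need: mapping a client byte range onto
fixed-size chunks fetched with their own Range headers, then trimming the
cached chunks back to the exact client range. The 8 MiB chunk size and all
helper names are assumptions of the example, not anything Squid provides.

  # Hypothetical sketch (not Squid code): map a client byte-range request
  # onto fixed-size chunks, so each chunk can be fetched upstream with its
  # own Range header and cached independently. The 8 MiB chunk size and
  # all helper names below are illustrative assumptions.

  CHUNK_SIZE = 8 * 1024 * 1024  # assumed 8 MiB per cached chunk

  def chunks_for_range(first_byte, last_byte):
      """Chunk indices and upstream Range bounds covering the inclusive
      client range first_byte..last_byte."""
      first_chunk = first_byte // CHUNK_SIZE
      last_chunk = last_byte // CHUNK_SIZE
      return [(i, i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE - 1)
              for i in range(first_chunk, last_chunk + 1)]

  def slice_from_chunks(chunks, first_byte, last_byte):
      """Assemble the exact client range from cached chunk payloads,
      given as a dict {chunk_index: bytes} covering the range."""
      data = b"".join(chunks[i] for i in sorted(chunks))
      base = min(chunks) * CHUNK_SIZE
      return data[first_byte - base:last_byte - base + 1]

  if __name__ == "__main__":
      # A client asks for bytes 10000000-20000000 of a large file; the
      # tool would fetch (or reuse) chunks 1 and 2, then trim the slice.
      for idx, lo, hi in chunks_for_range(10_000_000, 20_000_000):
          print("chunk %d -> Range: bytes=%d-%d" % (idx, lo, hi))

Aligning upstream fetches on fixed chunk boundaries means overlapping client
ranges reuse the same cached chunks, and cold chunks can be evicted
independently of the rest of the file.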

Thank you,

Alex.
Received on Fri Sep 17 2010 - 17:39:45 MDT
