Re: [squid-users] Caching huge files in chunks?

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Fri, 17 Sep 2010 17:17:34 +1200

On 17/09/10 12:21, Guy Bashkansky wrote:
> Here is the problem description, what solution might Squid or other
> cache tools provide?
>
> Some websites serve huge files, usually movies or binary distributions.
> Typically a client issues byte range requests, which are not cacheable
> as separate objects in Squid.
> Waiting for the whole file to be brought into the cache takes way too
> long, and is not granular enough for optimizations.
>
> A possible solution would be if Squid (or other tool/plugin) knew how
> to download huge files *in chunks*.
> Then the tool would cache these chunks and transform them into
> arbitrary ranges when serving client requests.
> There are some possible optimizations, like predictive chunk caching
> and cold-chunk eviction.
>
> Does anybody know how to put together such a solution based on any existing tools?

Ah the BitTorrent-to-HTTP conversion. :)

Still blocked to a large degree by Squid not fully supporting Range
requests.

A protocol server module for Squid, like the FTP, WAIS and Gopher ones,
is possibly achievable. So is an eCAP/ICAP module.
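
For what it's worth, the chunk arithmetic such a module would need is
straightforward. A minimal sketch, assuming fixed 1 MiB chunks cached as
separate objects (kChunkSize, ChunkSpan and chunksFor are made-up names
here, not anything in Squid or eCAP):

  // Illustrative only, not Squid code: maps a client byte range onto
  // fixed-size chunks that could be cached as separate objects,
  // e.g. keyed by URL + "#" + chunk index.
  #include <cstdint>
  #include <cstdio>
  #include <vector>

  static const uint64_t kChunkSize = 1 << 20;   // assumed 1 MiB chunks

  struct ChunkSpan {
      uint64_t index;    // which chunk object to look up or fetch
      uint64_t offset;   // first byte wanted inside that chunk
      uint64_t length;   // how many bytes of it to copy out
  };

  // For a client "Range: bytes=first-last" (inclusive), list the
  // chunks covering it and the slice of each to copy into the reply.
  std::vector<ChunkSpan> chunksFor(uint64_t first, uint64_t last) {
      std::vector<ChunkSpan> spans;
      for (uint64_t i = first / kChunkSize; i <= last / kChunkSize; ++i) {
          const uint64_t start = i * kChunkSize;
          const uint64_t from = (first > start) ? first - start : 0;
          const uint64_t to = (last - start < kChunkSize) ? last - start
                                                          : kChunkSize - 1;
          spans.push_back({i, from, to - from + 1});
      }
      return spans;
  }

  int main() {
      // e.g. "Range: bytes=1048000-1049000" straddles chunks 0 and 1
      for (const ChunkSpan &s : chunksFor(1048000, 1049000))
          std::printf("chunk %llu: offset %llu, length %llu\n",
                      (unsigned long long)s.index,
                      (unsigned long long)s.offset,
                      (unsigned long long)s.length);
  }

Each chunk missing from the cache would then be fetched upstream with
its own aligned Range request, so this only works against origin
servers that honour ranges in the first place.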

HTTP still has the fixed basic requirements that one request equals one
reply and, most annoyingly, that the requested ranges must not be
re-ordered or optimized: a client asking for
"Range: bytes=5000-5999,0-999" has to get the parts back in exactly
that order.
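
To make that concrete, here is the serving side under the same
assumptions; fetchRange() and sendPart() are invented placeholders
standing in for the chunk assembly and the multipart/byteranges output,
not real Squid or eCAP calls:

  #include <cstdint>
  #include <cstdio>
  #include <string>
  #include <utility>
  #include <vector>

  // Placeholder: would assemble bytes first..last from cached chunks
  // (via chunksFor() above), fetching any missing chunks upstream.
  std::string fetchRange(uint64_t first, uint64_t last) {
      return std::string(last - first + 1, 'x');
  }

  // Placeholder: would emit one part of a multipart/byteranges reply,
  // with its "Content-Range: bytes first-last/total" header.
  void sendPart(uint64_t first, uint64_t last, const std::string &body) {
      std::printf("part bytes %llu-%llu (%zu bytes)\n",
                  (unsigned long long)first, (unsigned long long)last,
                  body.size());
  }

  // One part per requested range, in the order the client listed
  // them -- never sorted or merged, even when the later range is
  // already cached and the earlier one is not.
  void serveRanges(const std::vector<std::pair<uint64_t, uint64_t>> &ranges) {
      for (const auto &r : ranges)             // preserve request order
          sendPart(r.first, r.second, fetchRange(r.first, r.second));
  }

  int main() {
      serveRanges({{5000, 5999}, {0, 999}});   // deliberately "backwards"
  }

The chunk fetches themselves could complete in any order; it is only
the assembly of the reply that is pinned down by the protocol.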

Amos

-- 
Please be using
   Current Stable Squid 2.7.STABLE9 or 3.1.8
   Beta testers wanted for 3.2.0.2