Re: [squid-users] caching websites automatically

From: Amos Jeffries <squid3_at_treenet.co.nz>
Date: Wed, 11 Feb 2009 14:18:46 +1300

Leonardo Rodrigues Magalhães wrote:
> lorenor wrote:
>> Hello,
>>
>> I'm searching for a method to cache websites automatically with squid.
>> The goal is to give squid a list of URLs and have the proxy cache
>> those sites.
>> I know only one way to cache a site. A client has to make a request.
>> But is there another way without client interaction?
>>

The big question is WHY BOTHER?

Squid will cache whatever it can as traffic passes through; things get
cached as users request them. This saves you bandwidth and disk space,
while giving follow-up clients the speed they would like.
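
Whether an object actually gets stored depends on your cache settings.
As a rough illustration only (the paths and sizes below are placeholders,
not recommendations), the relevant squid.conf knobs look like:

   # 10 GB on-disk cache in the default UFS store
   cache_dir ufs /var/spool/squid 10240 16 256
   # allow bigger objects than the default to be cached
   maximum_object_size 50 MB
   # default freshness heuristic for objects without explicit expiry
   refresh_pattern . 0 20% 4320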

>
> No, squid has no mode to do that automagically.
>
> But with some Linux clients, wget for example, you can easily do
> that!
>
> cd /tmp/garbage
>
> sites.txt should contain the URLs of the sites you want to fetch:
>
> www.onesite.com
> www.othersite.com
> www.anything.com
>
>
>
> export http_proxy=http://your.squid.ip.box:3128
> wget -i sites.txt --mirror
>
> that should fetch, in mirror style, EVERYTHING from the listed
> sites and save it under the directory where you started wget. Depending on
> the amount of data, that could take a long time to run. You can probably
> erase everything after wget finishes, but it may be smarter to keep
> the files and run the mirror again some days later, which will generate MUCH
> less traffic.
>
> and, at the end of the process, squid should have cached everything
> that is cacheable according to the sites' configurations and your caching
> parameters as well.
>
> squid has no automatic mode for doing that, but that can be easily
> done with wget.
>
>
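
For anyone who wants to script this, here is a minimal sketch of the
wget approach described above. The proxy address, directory name and
sites.txt contents are placeholders, not values from this thread:

   #!/bin/sh
   # Pre-fetch a list of sites through squid so its cache gets populated.
   # Assumes squid listens on 192.168.0.1:3128 and sites.txt holds one URL per line.
   export http_proxy=http://192.168.0.1:3128

   # Mirror every listed site through the proxy; -P keeps the local copies in one place.
   wget -i sites.txt --mirror -P /tmp/garbage

   # The local copies can be removed afterwards; squid keeps whatever was cacheable.
   # rm -rf /tmp/garbage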

Amos

-- 
Please be using
   Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
   Current Beta Squid 3.1.0.5