Re: [squid-users] Caching/spidering an entire site

From: Ow Mun Heng <Ow.Mun.Heng@dont-contact.us>
Date: Thu, 23 Dec 2004 09:57:07 +0800

On Tue, 2004-12-21 at 17:35, Kinkie wrote:
> On Tue, 2004-12-21 at 10:40 +0800, Ow Mun Heng wrote:
> > Is there any possibility of caching an entire site?
> >
> > eg: www.somesite.com
> >
> > reason is to just act sort of like a local mirror for that site instead
> > of contacting the originating server for content.
>
> Sure, just fire up a crawler that uses the proxy walking the site at
> regular intervals.
Cool. I hadn't thought about that possibility. Any crawlers you'd recommend?
I've only dealt with the ones from the Windows world (from a past life
when I was a Windows(tm) junkie).

The other concern with spidering is that it will most likely crawl deeper
into the site than I want it to.

I think wget can do spidering as well; let me check.
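
From a quick look at the man page, something along these lines ought to
prime the cache (just a sketch; I'm assuming Squid is listening on its
default localhost:3128, and the depth and hostname would need adjusting):

    # Send wget's requests through the Squid proxy
    # (assumes the proxy is at localhost:3128)
    export http_proxy=http://localhost:3128/

    # Recursive fetch, limited to 2 levels deep and to the one host,
    # so the crawl doesn't wander off-site. --delete-after throws the
    # local copies away once fetched, since we only want them in the cache.
    wget --recursive --level=2 --no-parent \
         --domains=www.somesite.com \
         --delete-after http://www.somesite.com/

Putting that in a cron job would cover the "walk the site at regular
intervals" part as well.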

Thanks

--
Ow Mun Heng
Gentoo/Linux on D600 1.4Ghz 
98% Microsoft(tm) Free!!
Neuromancer 09:55:31 up 48 min, 4 users, load average: 0.44, 0.34, 0.64
Received on Wed Dec 22 2004 - 19:20:56 MST
