Re: Can I do this with squid?

From: Richard Huveneers <richard@dont-contact.us>
Date: 26 Aug 1996 17:08:31 GMT

In article <Pine.HPP.3.91.960826134216.15343B-100000@hydra.scitec.com.au>, johns@rd.scitec.COM.AU (John Saunders) writes:
>Rather than add this to squid, I think it would be better to have another
>program that takes a list of URLs and fetches them, through squid, during
>slow periods. This other program could be run from cron. This way the
>squid cache would get pre-loaded by this other program. Thinking about it,
>it wouldn't be hard to program. Open a socket to the proxy port and write
>a GET request passing the URL from the list. Then any data that gets read
>is dumped into the bit bucket. If you wanted to get fancy you could make
>the program scan the returned HTML document and fetch any referenced
>documents or images that are local to that server (limited to a maximum
>depth, otherwise it could get interesting :-).
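
For anyone who wants to roll their own, here is a rough sketch of the loop
John describes: open a socket to the proxy, write a proxy-style GET with the
full URL on the request line, and discard whatever comes back. The proxy
host/port arguments and the URL list file are assumptions for illustration
only, not part of any existing tool:

    /*
     * prefetch.c -- sketch of a cache pre-loader (illustrative only).
     * For each URL in a file, send one GET through the proxy and
     * throw the reply away; the side effect we want is that the
     * proxy has the object cached afterwards.
     *
     * Usage: prefetch proxyhost proxyport urlfile
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <netdb.h>

    static void fetch(const char *host, int port, const char *url)
    {
        struct hostent *hp;
        struct sockaddr_in sin;
        char buf[4096];
        int s, n;

        if ((hp = gethostbyname(host)) == NULL) {
            fprintf(stderr, "unknown host: %s\n", host);
            return;
        }
        memset(&sin, 0, sizeof(sin));
        sin.sin_family = AF_INET;
        sin.sin_port = htons((unsigned short)port);
        memcpy(&sin.sin_addr, hp->h_addr_list[0], hp->h_length);

        if ((s = socket(AF_INET, SOCK_STREAM, 0)) < 0) {
            perror("socket");
            return;
        }
        if (connect(s, (struct sockaddr *)&sin, sizeof(sin)) < 0) {
            perror("connect");
            close(s);
            return;
        }

        /* Proxy-style request: the full URL goes on the request line. */
        n = sprintf(buf, "GET %s HTTP/1.0\r\n\r\n", url);
        write(s, buf, n);

        /* Read the reply and dump it into the bit bucket. */
        while ((n = read(s, buf, sizeof(buf))) > 0)
            ;
        close(s);
    }

    int main(int argc, char **argv)
    {
        char line[2048];
        FILE *fp;

        if (argc != 4) {
            fprintf(stderr, "usage: %s proxyhost proxyport urlfile\n", argv[0]);
            return 1;
        }
        if ((fp = fopen(argv[3], "r")) == NULL) {
            perror(argv[3]);
            return 1;
        }
        while (fgets(line, sizeof(line), fp) != NULL) {
            line[strcspn(line, "\r\n")] = '\0';   /* strip trailing newline */
            if (line[0] != '\0')
                fetch(argv[1], atoi(argv[2]), line);
        }
        fclose(fp);
        return 0;
    }

Run it from cron during off-peak hours and the cache gets warmed for free.
The fancy version (parsing the returned HTML for local links, with a depth
limit) is left out here; that is where the existing tool comes in: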

Such a util already exists. It's called 'geturl'. Please note that there are
several utils named 'geturl' floating around on the net; I'm talking about a
C version with lots of features, including proxy and limited-depth support.

I don't have the URL here, so if anybody is interested I'll dig it up from my
mailbox at work...

Regards, Richard.
Received on Mon Aug 26 1996 - 10:20:37 MDT
