Re: Caching pages ready for later use?

From: Alex Rousskov <rousskov@dont-contact.us>
Date: Thu, 11 Dec 1997 10:01:54 -0600 (CST)

On Fri, 12 Dec 1997, Dancer wrote:

> A fine thought. You could make a specific refresh_pattern for them to try to hold
> them longer, and maybe use wget or the client program that comes with squid in a
> simple script to load them up beforehand.

This does not work very well for me, though. It is a lot of work to collect
all the URLs needed for a presentation that uses selected portions of other
large sites. Wget fetches either one page (not enough) or everything in the
site tree (too much). I do not think there is anything out there that is
smart enough to fetch exactly what you need, because there is no formal way
to specify that!

One way to deal with that would be to browse the presentation once and then
"grep" out of Squid's access.log all the URLs your client requested. This
way you can semi-automatically build a set of URLs to feed to wget or
another client emulator.
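For example, here is a rough sketch, assuming Squid's native access.log
format (client address in the third field, URL in the seventh); the client
address 10.0.0.5 and the proxy name proxy.example.com:3128 are made up for
illustration:

    # Pull out the URLs that one client requested, de-duplicated.
    awk '$3 == "10.0.0.5" {print $7}' /var/log/squid/access.log | sort -u > urls.txt

    # Re-fetch the list *through the proxy* so the objects end up in the cache.
    http_proxy=http://proxy.example.com:3128/ wget -q -i urls.txt -O /dev/null

Pointing wget at the proxy (via http_proxy) is the important part; fetching
the URLs directly would warm nothing.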

The same approach could be used to build a custom refresh pattern (otherwise
it is practically impossible to generalize all the URLs by hand)...
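As a sketch, assuming the usual refresh_pattern syntax (regular expression,
minimum age in minutes, percent of the object's age, maximum age in minutes)
and a made-up site, an entry built from the grepped URLs might look like:

    # Treat objects under this prefix that carry no explicit expiry
    # information as fresh for at least a day and at most a week.
    refresh_pattern ^http://www.example.com/slides/ 1440 100% 10080

If I remember right, Squid uses the first refresh_pattern that matches, so
such entries should go before any catch-all pattern in squid.conf.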

This all makes sense only if you are giving the same presentation many
times, I guess. Otherwise, it is probably easier to browse through the
presentation just before you need it.

Alex.
Received on Thu Dec 11 1997 - 08:05:43 MST
