Re: [SQU] Interesting Question

From: Robert Collins <robert.collins@dont-contact.us>
Date: Wed, 7 Mar 2001 09:28:46 +1100

Thanks Joe,

As an addendum:
a) with squid, don't forget to run "squid -z" after clearing the cache
(see the commands sketched below),
b) you'll need to find some way to add new pages without turning
offline_mode off during school hours.
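
For (a), something along these lines (a sketch only; the cache directory
and paths are just examples, adjust them to match your own squid.conf):

    squid -k shutdown
    rm -rf /cachedir/*
    squid -z          # recreate the cache swap directories
    squid             # start squid back up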

for wget:
a) wget can rewrite absolute page references within the local site to be
relative (the -k / --convert-links option).
b) to get a single page and all its graphics, you'll want to limit the
recursion to one level, -r -l 1 (just the links directly off that page);
see the example below.
c) a potential problem is dynamically generated URLs - wget won't run
javascript or whatever, so those links won't be sucked down.
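
For example (a sketch; the URL is made up, and exact options can vary a
little between wget versions):

    # one page plus everything linked directly off it, with absolute
    # links rewritten to relative ones so it browses properly locally
    wget -r -l 1 -k http://www.example.com/lesson/page.html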

And as Joe said, both have their problems, but either should work fairly
well with some setup effort and testing.
Rob

----- Original Message -----
From: "Joe Cooper" <joe@swelltech.com>
To: "Devin Teske" <devinteske@hotmail.com>
Cc: <squid-users@ircache.net>
Sent: Wednesday, March 07, 2001 4:18 AM
Subject: Re: [SQU] Interesting Question

> One of the methods I suggested was to pre-fill the cache, and then go
> into offline mode.
>
> This /may/ require modification of the no_cache settings in order to
> cache the results of cgi scripts, and other things which are generally
> not cachable. Experience will have to tell you what to do there, as
> I've never personally done anything like what you want.
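>
> For instance, the stock squid.conf usually carries something along
> these lines, which needs to be loosened or commented out before cgi
> results will be cached at all:
>
>     acl QUERY urlpath_regex cgi-bin \?
>     no_cache deny QUERY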
>
> In short, here are the steps to a pre-fill:
>
> Clear your cache (rm -rf /cachedir/*, or just format the partition).
> Start Squid. Visit the pages that are needed for the class or
> whatever.
> Turn on offline_mode in the squid.conf file.
> Restart Squid.
> Browse those pages. offline_mode will prevent any other pages from
> being visited.
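>
> For example, once the pages have been visited (a sketch; this assumes
> squid.conf lives in /etc/squid and the squid binary is on your PATH):
>
>     # set "offline_mode on" in /etc/squid/squid.conf, then either
>     squid -k reconfigure
>     # or stop and start Squid for a full restart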
>
> If all content is static, and none of the content has aggressive expiry
> times, this will work fine. And it is probably the easiest for your
> teachers to use. You could put up a small cgi script on each system,
> that when called will put the cache into offline mode. And another to
> empty the cache and put it back into online mode. Then the teachers
> could click a button to start filling and a button to allow the
> offline browsing.
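>
> As a very rough sketch of such a script (the file names here are made
> up, and in practice it has to run with enough privilege to signal Squid):
>
>     #!/bin/sh
>     # go-offline.cgi: copy a prepared offline config into place and
>     # tell Squid to re-read it
>     echo "Content-type: text/plain"
>     echo ""
>     cp /etc/squid/squid.conf.offline /etc/squid/squid.conf
>     squid -k reconfigure
>     echo "The cache is now in offline mode."
>
> The matching go-online script would copy the normal config back, shut
> Squid down, clear the cache directory, run "squid -z", and start Squid
> again.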
>
> Next is Robert's suggestion for using wget to create a local mirror.
> Also a good option, but also with some potential problems to be worked
> around.
>
> With wget, you can do what is called a recursive web suck (option
> -r)...by default this will suck down copies of every link down to 5
> levels of links (so each link on each page will be pulled down into a
> local directory). You can then browse this local directory (you could
> even put it onto a local webserver if you needed it to be shareable
> across the whole school). The potential problems include absolute links
> in the pages (http://www.yahoo.com/some/page...will jump out of your
> local mirror...whereas some/page will not).
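>
> A command along those lines (a sketch; the URL and the web server
> document root are only examples):
>
>     # mirror up to the default 5 levels deep, converting absolute
>     # links to relative ones, and without climbing to parent dirs
>     wget -r -k -np http://www.example.com/lesson/
>     # then, to share it across the school, drop the mirror under the
>     # local web server's document root
>     cp -r www.example.com /var/www/html/lesson-mirror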
>
> Note that both have their problems...but both will do what you want
> with a little work. There is no magic button to push to limit the
> internet in such a strict way. Because resources are so very
> distributed, it is very hard to pick one 'page' and say let's only
> allow this one page.
> That page possibly links to and pulls from a hundred servers. Maybe
> not...but it could.
>
> Devin Teske wrote:
>
> >>>>> Forwarded to Squidusers
> >>>>
> >
> >> Joe's pointer (squid cache retention times) & mine (wget from a full
> >> access account to make mirrors) will work.
> >>
> >> Squid ACLs, Squid redirectors, WILL NOT.
> >
> >
> > Can you explain in more detail how I would implement either Joe's or
> > Rob's scenario? How would they both work?
> >
> > Thanks,
> > Devin Teske
>
>
> --
> Joe Cooper <joe@swelltech.com>
> Affordable Web Caching Proxy Appliances
> http://www.swelltech.com
>
> --
> To unsubscribe, see http://www.squid-cache.org/mailing-lists.html
>
>

--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html