RW wrote:
> On Fri, 21 Sep 2007 07:36:05 -0600
> Blake Grover <blake@eznettools.com> wrote:
> 
>> We are working on a new project where we will distribute Linux
>> machines in different areas that will be connected to the Internet.
>> But these machines might not always have an Internet connection.  We
>> would like these machines to show certain web pages from a web server
>> in a loop. For example, I have 7 pages that cycle from one to the
>> next every 7 - 10 seconds.  But if the Internet connection goes down
>> we want Squid to keep showing the loop of HTML pages until the
>> connection gets restored, and then Squid could update the pages in
>> the cache.
> 
> 
> You could write a script to switch squid into offline mode when the
> connection goes down, but there will always be race condition problems
> with this.
> 
> Have you considered running local webservers instead?
> 
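For reference, the offline-mode switch RW mentions could be driven by
something as small as the (untested) sketch below; the probe host,
squid.conf path and poll interval are only placeholders.  It rewrites the
offline_mode directive and asks Squid to reconfigure.  As RW says, though,
there is always a window where Squid's idea of the link state is wrong.

  #!/usr/bin/env python3
  # Untested sketch: probe a well-known host and flip Squid's offline_mode
  # whenever the link state changes.  All names and paths are placeholders.
  import re, socket, subprocess, time

  SQUID_CONF = "/etc/squid/squid.conf"    # assumed location
  PROBE_HOST = ("www.example.com", 80)    # any host only reachable online
  POLL_SECS  = 30

  def link_is_up():
      """True if a TCP connection to the probe host succeeds."""
      try:
          socket.create_connection(PROBE_HOST, timeout=5).close()
          return True
      except OSError:
          return False

  def set_offline_mode(offline):
      """Rewrite the offline_mode directive and tell Squid to reload."""
      wanted = "offline_mode %s" % ("on" if offline else "off")
      with open(SQUID_CONF) as f:
          conf = f.read()
      conf, n = re.subn(r"^offline_mode\s+\S+", wanted, conf,
                        flags=re.MULTILINE)
      if n == 0:
          conf += "\n" + wanted + "\n"
      with open(SQUID_CONF, "w") as f:
          f.write(conf)
      subprocess.call(["squid", "-k", "reconfigure"])

  offline = None
  while True:
      up = link_is_up()
      if offline is None or offline == up:    # first pass, or state changed
          offline = not up
          set_offline_mode(offline)
      time.sleep(POLL_SECS)
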
What I'd do is check whether the following works (note: I have not 
tested any of this):
  - use a deny_info override for that particular error page;
  - have the new error page refresh to the next slide-show page in the 
    sequence.
If that works, any pages broken during the downtime will simply be 
skipped in favour of the pages that do work; a rough config fragment is 
sketched below.
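The squid.conf side might look something like this.  The ACL name, kiosk
hostname and local port are invented for illustration, and whether
deny_info is actually consulted when the upstream fetch fails (rather than
on an http_access denial) is exactly the part that needs testing.

  # squid.conf fragment (names are placeholders)
  acl slideshow dstdomain kiosk.example.com

  # Either point deny_info at a page served by a local helper daemon ...
  deny_info http://127.0.0.1:8000/next slideshow

  # ... or at a custom error template in Squid's errors directory whose
  # body is nothing but a refresh to the next slide, e.g.
  #   <meta http-equiv="refresh" content="7;url=http://kiosk.example.com/slide2.html">
  # deny_info ERR_NEXT_SLIDE slideshow
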
You will most likely need a small HTTP daemon/script to provide the new 
deny_info page and to keep track of which slide was meant to be next.
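Something along these lines (untested; the slide URLs, port and refresh
interval are placeholders) could handle that part by answering every
request with a refresh page that points at the following slide:

  #!/usr/bin/env python3
  # Minimal helper daemon sketch: cycles through the slide URLs and serves
  # a tiny HTML page that refreshes to whichever slide is next.
  from http.server import BaseHTTPRequestHandler, HTTPServer
  from itertools import cycle

  SLIDES = cycle(["http://kiosk.example.com/slide%d.html" % n
                  for n in range(1, 8)])
  REFRESH_SECS = 7

  PAGE = ('<html><head>'
          '<meta http-equiv="refresh" content="%d;url=%s">'
          '</head><body>Loading next slide...</body></html>')

  class NextSlide(BaseHTTPRequestHandler):
      def do_GET(self):
          body = (PAGE % (REFRESH_SECS, next(SLIDES))).encode("ascii")
          self.send_response(200)
          self.send_header("Content-Type", "text/html")
          self.send_header("Content-Length", str(len(body)))
          self.end_headers()
          self.wfile.write(body)

  if __name__ == "__main__":
      HTTPServer(("127.0.0.1", 8000), NextSlide).serve_forever()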
Amos
Received on Fri Sep 21 2007 - 20:16:38 MDT