RE: Web Content Suck

From: Dave J Woolley <DJW@dont-contact.us>
Date: Thu, 29 Jul 1999 12:39:43 +0100

> From: Henrique Abreu [SMTP:abreu@newsite.com.br]
>
>
> Does Squid have an option to automatically "suck" (prefetch) the
> content of web sites at predefined times, to save time for
> connected users?
>
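
        For what it's worth, as far as I know Squid itself has no
        option to fetch pages on a schedule; the usual approach is an
        external script, run from cron, that requests a list of URLs
        through the proxy so the replies land in the cache. A minimal
        sketch (the proxy address and URL list below are made up):

import urllib.request

# Hypothetical proxy address and URL list; adjust for your setup.
# Port 3128 is Squid's default.
PROXY = "http://localhost:3128"
URLS = [
    "http://www.example.com/",
    "http://www.example.com/news.html",
]

# Route every request through the Squid proxy so each response
# is stored in (or refreshed in) the cache.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY})
)

for url in URLS:
    try:
        with opener.open(url, timeout=30) as resp:
            resp.read()  # read the body so the transfer completes
            print(url, resp.status)
    except OSError as exc:
        print(url, "failed:", exc)
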
        You should note that there has been a recent request for a
        feature to prevent users from doing exactly this, and that
        some sites have a policy forbidding it (although a crawler
        that properly processes robots.txt will skip such sites on
        its own).
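
        If you do script this, checking robots.txt first is easy; a
        sketch using Python's standard robotparser (the site URL and
        user-agent name below are made up):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

# Only fetch URLs the site's policy allows for our (made-up) agent.
if rp.can_fetch("prefetch-script", "http://www.example.com/news.html"):
    print("allowed by robots.txt")
else:
    print("disallowed by robots.txt")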

        Also, even sites that do permit crawling often object if
        requests for successive pages arrive less than a few minutes
        apart (treat 30 seconds as an absolute minimum).
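
        Spacing the requests out in such a script is just a pause
        between fetches; a sketch (the URL list is made up, and the
        30-second floor is the figure above):

import time
import urllib.request

MIN_DELAY = 30  # seconds between requests to one site; the absolute floor

URLS = [
    "http://www.example.com/a.html",
    "http://www.example.com/b.html",
]

for i, url in enumerate(URLS):
    if i:
        time.sleep(MIN_DELAY)  # keep successive requests spaced apart
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()
            print(url, resp.status)
    except OSError as exc:
        print(url, "failed:", exc)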
