RE: Squid enhancement thought

From: Dave J Woolley <DJW@dont-contact.us>
Date: Wed, 2 Jun 1999 12:49:41 +0100

> From: Reuben Farrelly [SMTP:reuben-squid@mira.net]
>
> There are client programs which, certainly under Windows, accomplish the
> same task as the one being suggested. I would suggest that they may be a
> better approach in situations where this sort of behaviour is required.
> That way, the client pays the cost of the extra speed, rather than the
> Squid maintainer.
>
        Many content providers object violently to these,
        especially when they don't honour robots.txt and don't
        provide a recognizable User-Agent string (so that they
        can be rejected!). IMDB is, I believe, one site which
        takes an active stance against them.

        The problem is that they waste bandwidth at the content
        provider by accessing pages, often dynamically generated ones,
        that are never going to be read.

        Any look-ahead system must honour robots.txt and, in practice,
        must also honour the equivalent robots meta information, even
        if that results in a protocol layering violation (looking
        inside the content).

        It must also identify accesses made in this way in a manner
        that makes it easy for the content provider to reject them.
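
        For example (again only a sketch, with an invented agent
        string), every look-ahead request could carry a distinctive
        User-Agent so that a provider can filter on it:

import urllib.request

def prefetch(url):
    """Fetch a page speculatively, clearly labelled as a look-ahead
    request so the content provider can reject it if it wishes."""
    req = urllib.request.Request(url, headers={
        "User-Agent": "squid-lookahead/0.1 (speculative prefetch)",
    })
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")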

        To be useful to the end users, it would then need to fall back
        to a conventional mode and remember that the site had rejected
        it, as otherwise the site might well block it by IP address at
        its incoming firewall.
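
        Something along these lines (a sketch, assuming the site
        signals its refusal with a 403 or similar) would let the cache
        remember the rejection and quietly drop back to normal
        behaviour:

import urllib.error
from urllib.parse import urlparse

rejected_hosts = set()   # sites that have refused look-ahead requests

def try_prefetch(url):
    """Prefetch only from sites that have not rejected us; remember
    any refusal and fall back to conventional, on-demand fetching."""
    host = urlparse(url).hostname
    if host in rejected_hosts:
        return None                      # conventional mode for this site
    try:
        return prefetch(url)             # from the sketch above
    except urllib.error.HTTPError as err:
        if err.code == 403:
            rejected_hosts.add(host)     # remember the rejection
        return None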

        Generally, given that cache usage among end users is not very
        high, the likely effect of adding this capability to an ISP
        cache is that sites will start rejecting any proxied requests,
        forcing those users who do use caches to disable them.
Received on Wed Jun 02 1999 - 06:32:38 MDT
