Re: "Squid" failed to cache from FTP sites that does not support RESUME

From: Dancer <dancer@dont-contact.us>
Date: Thu, 26 Aug 1999 07:17:47 +1000

Idan Feigenbaum wrote:
>
> I was trying to download, using a download program that HAD resume
> ability, from several sites, both with and without the proxy.
>
> When downloading (without the proxy) from a site that supported RESUME,
> the program showed me that resume was supported, and when I aborted the
> download and then tried to resume it, all was fine - the download started
> immediately and completed successfully.
> When downloading (without the proxy) from a site that didn't support
> RESUME, the program showed me that resume was not available, and of
> course, when I aborted and tried to RESUME the download, it started from
> the beginning.
> NOW, when I was downloading (WITH SQUID) from a site that supported
> RESUME, the program showed me that resume was supported, and when I
> aborted the download and tried to resume it, all was fine - the download
> started immediately and completed successfully.
>
> BUT, when I was downloading (WITH SQUID) from a site that DIDN'T support
> RESUME, the program STILL showed me that RESUME was SUPPORTED,

Which is either correct or incorrect, depending on your point of view.
If the program is fetching an FTP file through an HTTP proxy it has no
way to tell if the FTP server supports resume or not. As for the HTTP
side, I'll bet you a dollar that it's either just assuming that
byte-ranges work, or it's shoving a byte-range request at the proxy,
even if it does not yet have any portion of the file.
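To illustrate that second possibility, here is a minimal Python sketch (the function name and URLs are illustrative, not part of any real client) of the two proxy-style requests such a program can emit: a plain GET for a fresh download, and a GET with a Range header that simply assumes byte-ranges work:

```python
# Sketch of what a resume-capable client sends through an HTTP proxy
# for an FTP URL. Names and URLs here are illustrative placeholders.

def resume_request_line(url: str, bytes_on_disk: int) -> list[str]:
    """Build the proxy-style request lines a client emits.

    With an HTTP proxy, the full FTP URL goes in the request line;
    a resuming client adds a Range header for the remainder.
    """
    req = [f"GET {url} HTTP/1.1"]
    if bytes_on_disk > 0:
        # Resume: ask only for the part of the object we don't have yet.
        req.append(f"Range: bytes={bytes_on_disk}-")
    return req

# Fresh download: no Range header at all.
print(resume_request_line("ftp://ftp.example.com/pub/file.tar.gz", 0))
# Resumed download: Range picks up where the partial file ended.
print(resume_request_line("ftp://ftp.example.com/pub/file.tar.gz", 1000))
```

If the proxy honours the range, it answers 206 Partial Content; if it sends 200 with the whole object, the client has to discard the first bytes itself, which matches the slow "resume" described below.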

> when I aborted
> the download, and tried to RESUME it - it took a lot of time until squid
> reached the point that I requested to resume from. That's logical to me,
> but even if I download the whole file at once, and then try to download it
> again, it won't give it to me from the cache - for some reason it releases
> this file from the cache.
>
> Am I clear enough ?

I think so. I _have_ seen this sort of behaviour before. I can think of
three possible causes:
* Squid isn't caching the FTP files properly, where it should be.
* The program is making the request with some header criteria that
prevent squid from delivering from the cached copy.
* The program is using byte-range requests unnecessarily (for the
entire object, for example), and confusing the whole shooting match.

>
> -----Original Message-----
> From: Dave J Woolley [mailto:DJW@bts.co.uk]
> Sent: Wednesday, August 25, 1999 2:25 PM
> To: 'squid-users@ircache.net'
> Subject: RE: "Squid" failed to cache from FTP sites that does not
> support RESUME
>
> > From: Idan Feigenbaum [SMTP:idanfei@ibm.net]
> >
> > Could someone tell me if it's possible to configure Squid to
> > also cache files from FTP sites that do not support the
> > REST (resume) feature?
> >
> What is your evidence that REST makes a difference?
>
> I've not investigated caching of FTP in squid, but
> cacheability is much more likely to depend on the format
> of directory listings than on support for restarts.
>
> Incidentally, I don't know any sites that support the
> REST feature according to the specification - all make
> assumptions about Unix-like file systems which the FTP
> RFC doesn't permit, and don't support the required
> checkpointing of the data stream.
Received on Wed Aug 25 1999 - 16:26:52 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:48:06 MST