bad ftp URLs make cache messy

From: Richard Blanchet <Richard.Blanchet@dont-contact.us>
Date: Tue, 11 Mar 1997 23:06:05 +0100

  I'm using Squid 1.1.8, and when errors occur on FTP URL transfers, the cache
gets messy, keeping the mis-transferred URL in its cache.

  I went through the list digests and found out about the client program that
lets you force a reload in the cache... I tried it... it worked... but it's a
one-shot process, as you have to do that for each bad URL.
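
  (For reference, here is roughly what that one-shot reload looks like, wrapped
in a tiny Python sketch. The client utility's -h/-p/-r options, the host/port,
and the example URL are assumptions about a typical Squid 1.1.x install, so
treat this as a sketch rather than an exact recipe.)

    import subprocess

    # Hypothetical one-shot reload of a single bad URL via Squid's bundled
    # "client" utility; the -r (force reload) flag and the host/port below
    # are assumptions and may differ on your installation.
    subprocess.run(["client", "-h", "localhost", "-p", "3128", "-r",
                    "ftp://ftp.example.com/pub/some-broken-file.tar.gz"],
                   check=True)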

  So this can't be a solution when you are facing several dozen users
complaining that they can't retrieve URLs (that someone else messed up in
the cache).

  You could track down the specific errors in the log files and then write
some kind of robot to reload the messy FTP URLs... even though it would be
cleaner to just remove them from the cache! I'm thinking of something like the
sketch below.
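
  (A minimal sketch of that robot, assuming the Squid 1.1.x native access.log
layout with the result tag in the fourth field and the URL in the seventh, and
assuming the bundled client utility accepts -h/-p/-r as above. The log path and
host/port are placeholders.)

    import subprocess

    ACCESS_LOG = "/usr/local/squid/logs/access.log"   # placeholder path
    BAD_TAGS = ("ERR_LIFETIME_EXP", "ERR_NO_CLIENTS_BIG_OBJ")

    # Collect FTP URLs whose transfers ended with one of the bad error tags.
    bad_urls = set()
    with open(ACCESS_LOG) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 7:
                continue
            result, url = fields[3], fields[6]    # assumed native log layout
            if url.startswith("ftp://") and any(tag in result for tag in BAD_TAGS):
                bad_urls.add(url)

    # Force the cache to refetch each offending object so the bad copy stops
    # being served (a true "remove from cache" would of course be nicer).
    for url in sorted(bad_urls):
        subprocess.run(["client", "-h", "localhost", "-p", "3128", "-r", url],
                       check=False)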

  Is there a better solution to this problem?
  Or should it go on the Squid group's ToDo list?

  Thanks for comments,

Richard

PS: This messy stuff happens on ERR_LIFETIME_EXP and ERR_NO_CLIENTS_BIG_OBJ
errors...

--
"You may say I'm a dreamer,... but I'm not the only one..."     J.L.
Received on Tue Mar 11 1997 - 14:15:16 MST
