> transferring large amounts of data and automating the processing of
> at least some of it, without involving a 3rd party
> 
> "Large amounts" can be "small" like 100MB --- or over 50k files in 12GB,
> or even more.  The mirror feature of lftp is extremely useful for such
> things.
> 
> I wouldn't ever want to have to mess around with web pages to figure
> out how to do this.  FTP is plain and simple.  So you see why I'm
> explicitly asking for a replacement which is at least as good as ftp.

  How about "wget"?  It can handle both ftp and http, and it can be
scripted.  And with "-N" (timestamping) it will only download a file if
it has changed since the last download at your site.
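  For instance, a rough sketch (the host and path are only
placeholders):

    # recursive mirror; skip files that are not newer than local copies
    wget --mirror --no-parent ftp://ftp.example.com/pub/data/

  "--mirror" is shorthand for "-r -N -l inf --no-remove-listing", so
re-running the same command later only fetches what has changed on the
server.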

  Then there's always "sneakernet".  To quote Andrew Tanenbaum from 1981:

> Never underestimate the bandwidth of a station wagon full of tapes
> hurtling down the highway.

-- 
Walter Dnes <waltd...@waltdnes.org>
I don't run "desktop environments"; I run useful applications
