Hi Tim,

On Sat, 1 Jun 2024 18:57:00 +0200,
Tim Rühsen <tim.rueh...@gmx.de> wrote:

> Wget sets the remote time when using the -N / --timestamping option.

Hm, that is related to comparing local and remote timestamps to decide
whether to re-download a file (as I read in the man page). How is that
related to wget not storing the timestamp from FTP, while it does so
from HTTP without any option? Isn't -N just a follow-up feature that
relies on the local timestamps being correct in the first place?
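
Just to make sure we are talking about the same thing, here is the kind
of plain download I mean, without any options (host and file names made
up, of course):

  $ wget https://www.example.org/foo.tar.gz
  $ wget ftp://ftp.example.org/bar.tar.gz
  $ ls -l foo.tar.gz bar.tar.gz

Here foo.tar.gz ends up with the server's Last-Modified time as its
mtime, while bar.tar.gz just gets the time of the download … at least
that is what I observe here.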

>  > Is a fix in wget 1.x something to be considered at this point in time?  
> 
> Yes, if we find a volunteer to make up a patch.

OK. I may put that on the list of things to do if I ever run out of
things to do … but seriously: this should be fairly easy, as wget can
already get and parse the date from FTP … or might this incur an
additional command exchange for listing the file instead of just
getting it, as opposed to HTTP, where Last-Modified is always sent in
the response?
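
As far as I understand it (no actual traces here, just my reading of
the protocols), the information is available in both cases, it just
arrives differently:

  HTTP: part of the normal response to the GET
    < HTTP/1.1 200 OK
    < Last-Modified: Sat, 01 Jun 2024 16:57:00 GMT

  FTP: an extra command around the RETR, for example MDTM
    > MDTM foo.tar.gz
    < 213 20240601165700

So for FTP it might indeed mean one more round trip per file, but that
seems a small price for getting the timestamp right.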

>  > FTP support seems to be dropped altogether from Wget 2 (correct?)  
> 
> Not dropped, but it is not implemented. The number of FTP users is 
> relatively small and there are plenty of FTP clients (e.g. wget for 
> recursive mode). Also, FTP seems to be dying, so why implement another 
> FTP client at all!?

This is exactly because wget and curl (and others that I'm not as
familiar with) abstract that detail away for me. I don't want to talk
HTTP or FTP myself … I don't even want separate code paths for
differing URLs (which would mean parsing URL schemes). I have scripts
that download resources from URLs. Whether a URL points to HTTP
(v1/2/3) or FTP, or any other scheme that happens to work, is something
I don't want to deal with. As long as there is a single FTP server out
there (or I might run a local one, for reasons unknown ;-) and wget 1
has the functionality, it is odd not to implement it in the successor.

I understand that people don't want to deal with the protocol … that is
why we use time-tested tools that implemented it ages ago ;-)

This is especially important since Firefox dropped support for FTP, one
sign of the slow death of the protocol … with the browser refusing to
browse or download FTP resources, this is exactly the use case for
pulling out the command-line tool, even if one normally stays in the
interactive browser.

For now I have switched to curl in the downloader script at hand … but
there was a bit of a learning curve in that curl needs extra options
for wget's standard behaviour of exiting with an error if the server
gives a negative response (like a 403).
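
For reference, what I ended up with is roughly the following (flags
from memory, so please double-check before copying):

  # wget: error exit on a 403 and the remote timestamp come for free
  $ wget https://www.example.org/foo.tar.gz

  # curl: -f/--fail for the non-zero exit on HTTP errors,
  #       -R/--remote-time to keep the server's timestamp,
  #       -O to name the local file after the remote one
  $ curl -f -R -O https://www.example.org/foo.tar.gz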


Alrighty then,

Thomas

-- 
GPG public key 60D5CAFE: https://thomas.orgis.org/public_key
Fingerprint: D021 FF8E CF4B E097 19D6  1A27 231C 4CBC 60D5 CAFE
And despite all of you, I'm still doing it. Yes, I do write Perl code.
