On 12/12/19 1:25 PM, michel.kempene...@telenet.be wrote:
> Hi,
>
> I run into a particular problem when I'm trying to download a bunch of
> URLs I grouped together in file "input.txt" like this:
>
> wget -nv -a log.txt -P .\Images\ -i input.txt
>
> Some of these files are huge, hence take a long time to download.
> As a consequence, they will not appear in the same sorting order in the
> download folder as in the input folder, and that's a problem, as this
> order has its importance.
Since wget works sequentially, why do you think the order of downloads has anything to do with file size? If 'Images' is a fresh, empty directory *and* all files download OK, the order in the directory is the same as the order in input.txt. At least a sane file system should keep that order (is NTFS sane?).

What may be irritating you: 'dir' and 'ls' apply a certain sort order by default. E.g. here on GNU/Linux, 'ls' sorts the output alphabetically by name. 'ls -rc' prints in reverse order by change time (oldest first, then newer files), which seems to be what you want.

In short, wget likely is not your problem. Find out what the problem really is and you can find a mitigation.

As a 'dumb' work-around, save your files into a temp directory, then move them to Images\ in the order of occurrence in input.txt.

Regards, Tim
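That work-around could look roughly like the sketch below (names like 'tmpdl' and the sample URLs are illustrative, not from the original mail). Real usage would first run `wget -nv -a log.txt -P tmpdl -i input.txt`; here the downloads are faked with `touch` so the ordering step itself can be demonstrated:

```shell
#!/bin/sh
# Sketch of the work-around: download everything into a temp dir,
# then move the files into Images/ in the order the URLs occur in
# input.txt, so each file's timestamp follows the input order.

mkdir -p tmpdl Images

# Fake input.txt and "downloads" for demonstration only.
# In real use, run instead:  wget -nv -a log.txt -P tmpdl -i input.txt
printf 'http://example.com/b.jpg\nhttp://example.com/a.jpg\n' > input.txt
touch tmpdl/a.jpg tmpdl/b.jpg

while IFS= read -r url; do
    f=${url##*/}                  # basename of the URL (wget's default name)
    if [ -f "tmpdl/$f" ]; then
        mv "tmpdl/$f" "Images/$f"
        touch "Images/$f"         # stamp the file in input.txt order
    fi
    sleep 1                       # keep timestamps clearly distinct
done < input.txt

ls -rc Images                     # oldest first: input.txt order
```

This assumes each URL's basename is the filename wget saves (its default behaviour); if wget renames files (e.g. duplicate basenames become `file.1`), the mapping from URL to filename needs more care. After the loop, `ls -rc Images` lists the files in input.txt order.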