Hi,

On 23/08/16 14:23, Darshit Shah wrote:
> Hi,
>
> Nope, as far as I'm aware, there is absolutely no plan to add such a
> feature to Wget. Wget is primarily a tool for downloading web resources,
> not for maintaining a sync. Sure, you can use it to archive a website,
> but to keep it in sync, there are other, more specialised utilities such
> as `rsync`.
>
> However, if someone were to contribute a patch, along with relevant test
> cases, we might consider such a feature.
Just dropping an idea. There is surely demand for such a feature out there,
but my inner voices keep telling me that we should not stray too far from
what wget was intended to be - as you say, a tool for downloading web
resources.

How about a feature that logs (in a CSV, for instance) which files were
downloaded and where they were saved? You could then easily parse that log
with something like Python and delete any local files it no longer mentions;
see the rough sketch at the bottom of this mail. Jookia took a similar
approach to log URLs that were rejected [1].

[1] http://lists.gnu.org/archive/html/bug-wget/2015-07/msg00094.html

> * [email protected]
> <[email protected]> [160823 11:18]:
>> Hi all,
>>
>> I'm using wget to mirror (recursively) a remote directory over
>> http. I think wget is very nice for this job, but unfortunately it
>> lacks an option to delete files (locally) that were downloaded earlier
>> but are no longer available on the remote server. Right now wget just
>> ignores these files and leaves the local copy unmodified. Is there any
>> chance that this option will be added to wget? I've found posts online
>> by people with this very same problem since at least 2004...
>>
>> Thanks
>
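For illustration, here is a rough Python sketch of what I have in mind. It
assumes the hypothetical log is a CSV with two columns, url and local_path,
written by wget for every file it saved in the run, and that the mirror
lives under ./mirror - the log format and the file names are made up, since
nothing like this exists in wget today. You would run it once a
`wget --mirror` run has finished.

#!/usr/bin/env python3
# Rough sketch only: assumes wget wrote a hypothetical CSV log with
# columns "url,local_path" for every file it saved in this run, and
# that the mirror lives under MIRROR_DIR. Neither exists in wget today.
import csv
import os

LOG_FILE = 'wget-downloads.csv'   # hypothetical log produced by wget
MIRROR_DIR = 'mirror'             # local root of the mirrored site

# Collect the local paths the log says were downloaded in this run.
downloaded = set()
with open(LOG_FILE, newline='') as f:
    for row in csv.DictReader(f, fieldnames=['url', 'local_path']):
        downloaded.add(os.path.normpath(row['local_path']))

# Walk the local mirror and delete anything the log does not mention,
# i.e. files that are presumably gone from the remote server.
for root, _dirs, files in os.walk(MIRROR_DIR):
    for name in files:
        path = os.path.normpath(os.path.join(root, name))
        if path not in downloaded:
            print('removing stale file:', path)
            os.remove(path)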