Hi everyone! I'm using wget to check whether some files are downloadable, and I also use it to determine the file sizes. Yesterday I noticed that wget ignores the --spider option for ftp addresses: it should only show the file size and other parameters, but instead it started downloading the file :( That's too bad. Can anyone fix it? My only workaround was to shorten the run time with the supported options so that the download gets aborted. That's a user's solution, though; now a programmer's one is needed.
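For http addresses the size check works fine, and I script it roughly like this (get_size is just my own helper name, and the canned response below stands in for real wget -S output):

```shell
# Pull the Content-Length out of wget's --server-response output.
# In real use this would be fed from:
#   wget --spider --server-response http://host/file 2>&1 | get_size
get_size() {
  grep -i 'content-length' | tr -d '\r' | awk '{print $2}' | head -n1
}

# canned sample of a server response, for illustration only
printf '  HTTP/1.1 200 OK\n  Content-Length: 1048576\n' | get_size   # prints 1048576
```

It's exactly this that breaks for ftp:// URLs, because wget starts the transfer instead of stopping at the listing.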
The other problem is that wget doesn't correctly replace hrefs in downloaded pages: it uses the hostname of the local machine to replace the remote hostname, and there's no option to supply any other base URL (the -B option is for another purpose). If anyone is interested, I can describe the problem in more detail. If it won't be fixed, I'll write a perl script to replace the base URLs after wget downloads the pages I need, but that's not the best way.

Best regards,
dEth
mailto:[EMAIL PROTECTED]

PS Is there any list of known bugs?
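PPS Here is a rough sketch of the post-processing pass I have in mind, so it's clear what I mean. All the names are made up for the example: wget has rewritten the remote host to the local one in the saved page, and I substitute the real base URL back in:

```shell
# Fake a page the way wget leaves it: the href points at the local host
# ("localhost" here) instead of the server the page came from.
cat > page.html <<'EOF'
<a href="http://localhost/docs/index.html">docs</a>
EOF

# Replace the wrong base URL with the real remote one (example host).
sed 's|http://localhost/|http://ftp.example.com/|g' page.html > fixed.html

grep -c 'ftp.example.com' fixed.html   # prints 1
```

This works, but it's fragile (it doesn't know which URLs wget actually rewrote), which is why a real fix inside wget would be much better.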