On Mon, 11 Aug 2003, dEth wrote:

> Hi everyone!
>
> I'm using wget to check whether some files are downloadable, and I
> also use it to determine the size of the file. Yesterday I noticed
> that wget ignores the --spider option for ftp addresses.
> It should have shown me the file size and other parameters, but it
> began to download the file :( That's too bad. Can anyone fix it? My
> only idea was to limit the run time with the supported options so
> that the download would be aborted. That's a user's solution; now a
> programmer's one is needed.

http://www.google.com/search?q=wget+spider+ftp
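
For http URLs the spider check works as you describe; adding -S
(--server-response) makes wget print the headers, which should include
Content-Length if the server sends it. A rough sketch (the URLs are
just placeholders):

  $ wget --spider -S http://example.com/pub/somefile.iso
  # prints the server's response headers, including the size if it is
  # reported, without saving the file

  $ wget --spider ftp://example.com/pub/somefile.iso
  # with 1.8.x this is the case you describe: the option is ignored
  # and a real download starts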

> The other problem is that wget doesn't correctly replace hrefs in
> downloaded pages (it uses the hostname of the local machine to
> replace the remote hostname, and there's no way to supply any other
> base URL; the -B option is for another purpose). If anyone is
> interested, I can describe the problem in more detail. If it isn't
> fixed, I'll write a perl script to replace the base URLs after wget
> downloads the pages I need, but that's not the best way.

would this option help?:

GNU Wget 1.8.1, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...
..
  -k,  --convert-links      convert non-relative links to relative.
..
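
Something like this, for example (the URL is made up, just to show the
idea):

  $ wget -r -p -k http://remote.example.com/docs/
  # -r/-p fetch the pages and their requisites; once the retrieval is
  # finished, -k rewrites the links in the saved pages so they point
  # at the local copies (links to files that weren't downloaded are
  # left pointing at the original host)

Note that the conversion only happens after the whole run completes, so
an aborted download can leave links unconverted.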
