Hello, I use wget to keep a current set of files
from the website I voluntarily help with.
I've just uploaded a lot of files and was going
to try to download the current status,
but I couldn't find which wget commands I should
be using.
Has something happened to the man page? Is it
broken?
How am I
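For the "keep a current set of files" use case, the usual approach is timestamped recursive retrieval. A minimal sketch, with http://example.org/site/ standing in for the real site (a placeholder, not from the original messages):

```shell
# -N        re-download a file only if the remote copy is newer
# -r -l inf recursive retrieval, unlimited depth
# -np       never ascend to the parent directory
wget -N -r -l inf -np http://example.org/site/
```

Re-running the same command later refreshes only the files that changed on the server.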
"M. K. Green" <[EMAIL PROTECTED]> writes:
> Hello, I use wget to keep current set of files from the website I
> voluntarily help with. I've just uploaded a lot of files and was
> going to try to download the current status when I couldn't find
> which wget commands I should be using.
>
> Has something happened to the man page? Is it broken?
"Bazuka" <[EMAIL PROTECTED]> writes:
> If I am running Wget overnight to crawl some sites (say about 50,000
> URLs) and it crashes/hangs up for some reason after retrieving half
> of them, is it possible to restart it from the point where it
> crashed (instead of downloading everything again ) ?
The `-nc' option should do what you want.
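A minimal sketch of such a restart, with http://example.org/ as a placeholder URL: with `-nc' (--no-clobber), any URL whose file already exists locally is skipped, so re-running the identical command after a crash effectively resumes where the first run died.

```shell
# First run crashes partway through; run the same command again.
# Files saved by the first run are skipped, not re-downloaded.
wget -r -nc http://example.org/
```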
My understanding of the -nc option is that it doesn't overwrite existing
files. What does it do when downloading "index.html" files? There might be
many copies of that file... so does it not overwrite the previous copy?
I have modified Wget slightly so it writes the URL and file info to a
database and then deletes the actual file (with --delete-after). Can I
use -nc option in that case?
On 27 Jun 2001, at 9:43, "Bazuka" <[EMAIL PROTECTED]> wrote:
> I have modified Wget slightly so it writes the URL and file info to a
> database and then deletes the actual file (with --delete-after). Can I
> use -nc option in that case ?
Yes, but it will download the deleted files again, because -nc only
checks for files that still exist on disk.
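Since --delete-after removes the very files -nc would check, one workaround is to resume from the URL log instead of from disk. A sketch, assuming the modified Wget appends each completed URL to a file (all-urls.txt and done.txt are hypothetical names, not part of the original setup):

```shell
# Demo inputs standing in for the real crawl list and completion log.
printf '%s\n' http://a/1 http://a/2 http://a/3 > all-urls.txt
printf '%s\n' http://a/1 http://a/2            > done.txt

# comm -23 keeps lines unique to the first file, i.e. URLs not yet done.
# comm requires sorted input.
sort all-urls.txt > all.sorted
sort done.txt    > done.sorted
comm -23 all.sorted done.sorted > remaining.txt
cat remaining.txt

# Then restart the crawl on only the remaining URLs:
# wget --delete-after -i remaining.txt
```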
Ian> Also, both versions of the character constant '"' and '\"' are valid,
Ian> so if the compiler barfs on any of the above it must be faulty. I
Ian> suggest a bug report to the maintainers of this compiler is in order.
The problem's not with the compiler. Indeed, the Mac OS X compiler is gcc.