From: R Kimber

> Yes there's a web page.  I usually know what I want.

   There's a difference between knowing what you want and being able to
describe what you want so that it makes sense to someone who does not
know what you want.

> But won't a recursive get get more than just those files? Indeed, won't
> it get everything at that level? The accept/reject options seem to
> assume you know what's there and can list them to exclude them.  I only
> know what I want. [...]

   Are you trying to say that you have a list of URLs, and would like to
use one wget command for all instead of one wget command per URL? 
Around here:

ALP $ wget -h
GNU Wget 1.10.2c, a non-interactive network retriever.
Usage: alp$dka0:[utility]wget.exe;13 [OPTION]... [URL]...
[...]

That "[URL]..." was supposed to suggest that you can supply more than
one URL on the command line.  Subject to possible command-line length
limitations, this should allow any number of URLs to be specified at
once.
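
   For example (untested here, and the URLs are only placeholders;
substitute your own):

ALP $ wget http://example.com/a.html http://example.com/b.html

wget should fetch each URL in turn, in a single run.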

   There's also "-i" ("--input-file=FILE").  No bets, but it looks as if
you can specify "-" for FILE, and it'll read the URLs from stdin, so you
could pipe them in from anything.
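
   So, assuming a file "urls.txt" with one URL per line (the file name
is only an example), something like:

ALP $ wget -i urls.txt

or, where a UNIX-ish shell with pipes is available:

$ cat urls.txt | wget -i -

ought to do the job.  (Untested here, as usual.)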

------------------------------------------------------------------------

   Steven M. Schweda               [EMAIL PROTECTED]
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547
