On 2002-04-03 08:50 -0500, Dan Mahoney, System Admin wrote:

> > > 1) referrer faking (i.e., wget automatically supplies a referrer
> > > based on the, well, referring page)
> >
> > It is the --referer option, see (wget)HTTP Options, from the Info
> > documentation.
> 
> Yes, that allows me to specify _A_ referrer, like www.aol.com.  When I'm
> trying to help my users mirror their old angelfire pages or something like
> that, very often the link has to come from the same directory.  I'd like
> to see something where when wget follows a link to another page, or
> another image, it automatically supplies the URL of the page it followed
> to get there.  Is there a way to do this?

Somebody already asked for this, and AFAICT there's no way to do
it: wget has no option to set the Referer header dynamically to the
page it followed a link from.
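That said, the effect can be approximated by driving wget from the shell, passing --referer explicitly per fetch. Below is a minimal sketch, not a built-in wget feature: the page URL and image names are made-up placeholders, and the actual wget invocation is stubbed with echo so the generated command lines can be inspected before running them for real.

```shell
# Per-page referer workaround (sketch). Replace the echo stub with a
# real wget call once the command lines look right.
fetch() { echo wget --referer="$1" "$2"; }

# Hypothetical page whose inline images require a same-site referer.
page_url="http://www.angelfire.com/someuser/index.html"

for img in pic1.gif pic2.gif; do
  # Supply the referring page's URL when fetching each linked image.
  fetch "$page_url" "http://www.angelfire.com/someuser/$img"
done
```

For a real mirror you would first extract the image URLs from the fetched page (e.g. with grep/sed) instead of listing them by hand.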

> > > 3) Multi-threading.
> >
> > I suppose you mean downloading several URIs in parallel.  No, wget
> > doesn't support that.  Sometimes, however, one may start several wget
> > in parallel, thanks to the shell (the & operator on Bourne shells).
> 
> No, I mean downloading multiple files from the SAME uri in parallel,
> instead of downloading files one-by-one-by-one (thus saving time on a fast
> pipe).

This doesn't make sense to me. When downloading from a single
server, the bottleneck is generally either the server or the link;
in either case, there's nothing to gain by attempting several
simultaneous transfers. Unless there are several servers behind the
same IP and the bottleneck is the server, not the link?
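If parallel transfers do help in a given setup, the shell trick mentioned earlier in the thread is enough: start one wget per URL in the background with & and wait for them all. A minimal sketch, with wget stubbed out as a shell function so the control flow can be seen without network access (drop the stub line for real use; the URLs are placeholders):

```shell
# Stub standing in for the real wget binary; remove for actual use.
wget() { echo "fetching $2"; }

# Launch one background transfer per URL, then wait for all of them.
for url in http://example.com/a.iso http://example.com/b.iso; do
  wget -q "$url" &
done
wait   # block until every background job has finished
```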

-- 
André Majorel <URL:http://www.teaser.fr/~amajorel/>
std::disclaimer ("Not speaking for my employer");
