On Fri, 11 Feb 2005 20:49:54 +0000, Paul Smith wrote:
>
> Is there some way of downloading at once all the files at an HTTP
> address? For instance, consider the following address:

 man wget

<quote> 
 `-r' 
 `--recursive' 
 Turn on recursive retrieving. 

 GNU Wget is capable of traversing parts of the Web (or a single 
 HTTP or FTP server), depth-first following links and directory 
 structure. This is called recursive retrieving, or recursion. 

 When retrieving an FTP URL recursively, Wget will retrieve all
 the data from the given directory tree (including the subdirectories 
 up to the specified depth) on the remote server, creating its mirror 
 image locally. 
</quote>
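
 The "specified depth" mentioned there is set with `-l' (`--level');
 left alone, wget defaults to five levels of recursion, so for a
 single directory listing you'd usually rein it in, e.g.:

             wget -r -l 1 http://www.gkmweb.com/amarok/10.1/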

 So, either
             wget -r ..... http://www.gkmweb.com/amarok/10.1/
 or
             wget -r ..... ftp://www.gkmweb.com/amarok/10.1/

...whichever suits your purpose.
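
In practice a few extra switches help when mirroring a single HTTP
directory -- something along these lines (the -A pattern assumes the
directory holds RPMs; drop it to take everything):

             wget -r -np -l 1 -nH --cut-dirs=2 \
                  -A '*.rpm' http://www.gkmweb.com/amarok/10.1/

  -np           don't climb to the parent directory
  -nH           don't create a www.gkmweb.com/ directory locally
  --cut-dirs=2  strip the amarok/10.1/ components from saved paths
  -A '*.rpm'    accept only files matching the pattern
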
Jonesy
-- 
  | Marvin L Jones       | jonz         |  W3DHJ   |  linux
  |  Gunnison, Colorado  |  @           |  Jonesy  |    OS/2   __
  |   7,703' -- 2,345m   |   config.com |  DM68mn              SK


