I need to download ~15G of data from a web site. Following a PLUG mailing
list thread from 2008, I tried this syntax:
wget -r --accept *.* http://ph-public-data.com/

What was quickly returned was the listing of the top-level
ph-public-data.com/ directory:
about/  contact/  document/  file/  whatsnew/

What I want are all the files from the multiple pages listed in the table
on that home page. What syntax do I use to descend into each sub-directory
and download all the files within it? Is there a better command than wget
for this task?
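From reading the man page, I suspect something along these lines is what I
need, but I haven't confirmed it works on this site (note the quotes around
the accept pattern, so the shell doesn't expand it first):

```shell
# Sketch only: recursively fetch everything below the start URL.
# -r          recursive retrieval
# -np         --no-parent: don't climb above the start directory
# -l inf      no depth limit (the default -r depth is 5)
# -A '*'      --accept pattern, quoted so the shell leaves it alone
wget -r -np -l inf -A '*' http://ph-public-data.com/
```

Is that roughly right, or am I missing an option?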

TIA,

Rich
