Even in this case, how is it possible to discriminate between files?
I mean, why can I download some files but not others that have
similar features? It sounds strange to me...
Thanks anyway
G


On Wednesday 10 May 2006 15:15, Curtis Hatter wrote:
> On Tuesday 09 May 2006 06:18, you wrote:
> > Hi all,
> > I have a problem: I'm trying to download an entire directory from a site
> > and I'm using the command "wget -r -I <directory_name> <site_name>".
> > It seems to work, but at a certain point it stops, and I'm sure some
> > files are missing that I can download manually; I'm also sure they
> > are files like the others that wget can download. Any clue about
> > that? Thanks
>
> Have you checked to see if they have a robots.txt file that may restrict
> the download? If it does, you'll have to turn off robots with
> '-e robots=off' on the command line.
>
> Curtis
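
For reference, a minimal sketch of the combined invocation; the directory
path and site below are hypothetical placeholders, not taken from the
original thread, so substitute your own:

  wget -r -e robots=off -I /target/directory http://example.com/

Here -I (--include-directories) takes a comma-separated list of directories
to follow during the recursive download, and -e robots=off tells wget to
ignore the robots.txt exclusions for that run.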

-- 
To me, the cultured person is not the one who knows when Napoleon was born,
but the one who knows where to look for that information in the one moment
of his life when he needs it, and in two minutes.
Umberto Eco
