If you run wget with -d, you get a lot of (sometimes) useful debug info.
In this case you'll see something like

  Not following http://club.foto.ru/images/dot.gif because robots.txt forbids it.

and so on.
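
To see those messages yourself, you can capture the debug output while
fetching the page (a sketch; wget writes its debug output to stderr, so
redirect that to a file):

  wget -d -p "http://club.foto.ru/school/lesson_part.php?part_id=53&" 2> wget-debug.log
  grep "Not following" wget-debug.log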
As a matter of fact, you'll notice that a robots.txt has been downloaded,
containing:
User-agent: *
Disallow: /banner/
Disallow: /banners/
Disallow: /images/
Disallow: /photo/
Disallow: /photo-sc/
Disallow: /preview/
Disallow: /preview-sc/
Disallow: /small/
Disallow: /small-sc/
Disallow: /thumb/
Disallow: /counter.rb
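
(You can inspect those rules yourself by fetching the file directly and
printing it to stdout:

  wget -q -O - http://club.foto.ru/robots.txt

which makes it easy to check whether a given path is disallowed.)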

So this is one of the rare cases when you may want to run wget with -e
"robots=off" in order to override the settings configured by the webmaster
of that site.
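
For your case that would be something like (a sketch, using the URL from
your mail):

  wget -e robots=off -p "http://club.foto.ru/school/lesson_part.php?part_id=53&"

-e simply executes a .wgetrc-style command from the command line, so the
permanent equivalent would be a

  robots = off

line in your .wgetrc.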

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

> -----Original Message-----
> From: Vladimir B [mailto:[EMAIL PROTECTED]
> Sent: Sunday, August 31, 2003 10:40 AM
> To: [EMAIL PROTECTED]
> Subject: cannot download images with -p option
> 
> 
> Hello,
> I have wget 1.8.2. I'm trying to download a web page with command:
> wget -p "http://club.foto.ru/school/lesson_part.php?part_id=53&"
> I want it to download with images, but it downloads only this 
> file and 3 more files: common.js, robots.txt and style.css
> I've removed my .wgetrc, but nothing has changed :-(
> What's wrong? Can you help me?
> Thanks a lot.
> -- 
> Sincerely yours,
> Vladimir    
> 
