On 08 Feb, Richard Porter <[email protected]> wrote:
> I rather like the way NetSurf does a full save when you can't download 
> all the images because of lack of memory. I came across a Japanese site 
> with 186 images on one page, most of them big ones. I quit as many 
> apps as I could but even so there wasn't enough room.

> I did a full save and noticed that the index file held the absolute 
> URLs for images that weren't downloaded. I was able to delete the tags 
> for images I already had and load the page again to get the next 
> tranche. Three such operations were necessary before I got all the 
> images.

> Would it be possible to have a full save in which all the images were 
> stored on the hard disc even if they couldn't be rendered at the same 
> time? Or perhaps a "Save target as..." option on a link which would do 
> the same? As far as I can see "Download target" will only save the 
> object itself i.e. one image or the page source.

It is easier to download the web page and its images with wget:

wget -r <url>
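Note that `-r` recurses into linked pages as well. If you only want a single page plus the images and stylesheets it needs, wget's `-p` (`--page-requisites`) flag may be closer to what you're after, and `-k` (`--convert-links`) rewrites the links so the saved copy renders locally. A sketch (the URL is just a placeholder):

```shell
# Fetch one page together with the images/CSS it requires,
# and rewrite links so the local copy displays correctly:
wget -p -k http://example.com/page.html
```

With `-r` you may also want `-l 1` to limit the recursion depth, otherwise wget can wander far beyond the page you started from.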

-- 
Stefaan Claes, Hove, Antwerpen, Belgium, Europe, <[email protected]>
