On Sun, 2012-03-04 at 08:13 +1300, Phill Coxon wrote:
> Hi there.
> 
> I want to download a copy of a small site so I have the layout / design 
> to use as a source of ideas for a similar project I'm going to create in 
> the near future.
> 
> The reason I want to capture it locally is that it's for a short term 
> event (ending in two days) and the site may go offline or change after 
> that. It would be great to have a copy of it as it is now.
> 
> I'm trying to use wget to mirror the site, but all of the site images 
> are cached on Amazon S3 and wget will not download them as local copies.
> 
> The site is: http://paleosummit.com
> 
> Any suggestions on how to use wget or curl to download the images stored 
> on Amazon S3 with links updated for local storage / viewing?
> 
> Thanks!
It depends on whether you want the WordPress site itself, in which
case a plugin like BackupBuddy will easily deliver you the code /
database in a single zip file, or a static copy of the website, in
which case wget --mirror will probably get you sorted.
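
For the S3-hosted images, the catch is that wget stays on the
starting host by default, so it skips anything served from another
domain. Spanning hosts should pick them up. A rough sketch (untested;
the s3.amazonaws.com domain is a guess, so check the image URLs in
the page source for the actual S3 hostname):

    wget --mirror --page-requisites --convert-links --adjust-extension \
         --span-hosts --domains=paleosummit.com,s3.amazonaws.com \
         http://paleosummit.com/

--page-requisites pulls in images, CSS and the like; --span-hosts
plus --domains lets wget follow the S3 links without wandering off
across the whole web; and --convert-links rewrites the URLs so the
local copy works for offline viewing.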

hth,

Steve

-- 
Steve Holdoway BSc(Hons) MNZCS <[email protected]>
http://www.greengecko.co.nz
MSN: [email protected]
Skype: sholdowa


