Hi there.
I want to download a copy of a small site so that I have its layout /
design on hand as a source of ideas for a similar project I'll be
creating in the near future.
The reason I want a local copy is that the site is for a short-term
event (ending in two days), and it may go offline or change after
that. It would be great to capture it as it is now.
I'm trying to use wget to mirror the site, but all of the site's images
are hosted on Amazon S3, and wget will not download them as local copies
(I'm guessing because they live on a different host than the pages
themselves).
The site is: http://paleosummit.com
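
For reference, this is roughly the command I have been running (a
sketch from memory; the exact flags may have differed slightly):

    # roughly my attempt; flags reconstructed from memory, not verified
    wget --mirror --page-requisites --convert-links --adjust-extension \
         http://paleosummit.com

It pulls down the HTML pages fine but skips the S3-hosted images.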
Any suggestions on how to use wget (or curl) to also fetch the images
stored on Amazon S3, with the links rewritten for local storage / viewing?
Thanks!
_______________________________________________
Linux-users mailing list
[email protected]
http://lists.canterbury.ac.nz/mailman/listinfo/linux-users