I forgot to ask whether you were looking to make local copies of whole
websites or just to save a webpage's important content. If it's the
latter, the recommendation I gave is the best I've come across, since it
saves the content online; if it's the former, wget with the right
parameters works well, as do a number of other tools.
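For example, something along these lines (a minimal sketch; example.com
is a placeholder, and you may want to tune the wait time for the site):

    wget --mirror --convert-links --page-requisites \
         --adjust-extension --no-parent --wait=1 \
         http://www.example.com/

--mirror turns on recursion with timestamping, --convert-links rewrites
links so the copy browses locally, and --page-requisites grabs the
images/CSS each page needs.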

Thanks,
------------------------------------------
Ali Mesdaq (CISSP, GIAC-GREM)
Security Researcher II
Websense Security Labs
http://www.WebsenseSecurityLabs.com
------------------------------------------

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Mesdaq, Ali
Sent: Tuesday, February 12, 2008 2:55 PM
To: hardware@hardwaregroup.com
Subject: Re: [H] Capturing websites

Clipmarks for Firefox works kinda well. It saves content on the server,
which is good, but only in chunks of roughly 1k words or characters.
https://addons.mozilla.org/en-US/firefox/addon/1407 


Thanks,
------------------------------------------
Ali Mesdaq (CISSP, GIAC-GREM)
Security Researcher II
Websense Security Labs
http://www.WebsenseSecurityLabs.com
------------------------------------------

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Anthony Q.
Martin
Sent: Tuesday, February 12, 2008 12:27 PM
To: hardware@hardwaregroup.com
Subject: [H] Capturing websites

Anyone know of a tool (free is nice) to capture an entire website?

Not interested in stealing, mind you. I just need to preserve the info
there so that I can look at it after the website disappears. 

Doesn't Acrobat (not the reader) do that?

Thanks.


