On Mon, Feb 7, 2011 at 4:53 PM, Juan Cortes <[email protected]> wrote:
> Hope all is well guys!
>
> I know there's such a tool but I can't find it.
> Basically I want to extract all of a website's content and save it as a
> file per page.
I don't get exactly what you mean by "extract content". Would wget -m work? Or are you looking to extract the text minus the formatting?

--
Ben Jackson - Mayhemic Labs
[email protected] - http://www.mayhemiclabs.com - +1-508-296-0267
"Assume that what is in the power of one man to do, is in the power of another"

_______________________________________________
Pauldotcom mailing list
[email protected]
http://mail.pauldotcom.com/cgi-bin/mailman/listinfo/pauldotcom
Main Web Site: http://pauldotcom.com
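For what it's worth, a mirror run along the lines of `wget -m` might look something like the sketch below; `example.com` is just a placeholder for the target site, and the extra flags are optional conveniences rather than anything the original poster asked for:

```shell
# Mirror a site as one local file per page (example.com is a placeholder).
# --mirror            recursive download with timestamping (shorthand for -r -N -l inf --no-remove-listing)
# --convert-links     rewrite links so the local copy browses offline
# --adjust-extension  save pages with .html extensions
# --no-parent         don't ascend above the starting directory
# --wait=1            pause a second between requests, to be polite
wget --mirror --convert-links --adjust-extension --no-parent --wait=1 http://example.com/

# If the goal is the text minus formatting, dumping each page through
# lynx is one option:
lynx -dump http://example.com/ > page.txt
```

Note that `--adjust-extension` needs a reasonably recent GNU wget (older versions called it `--html-extension`).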
