On Sun, Aug 31, 2008 at 12:02 PM, Philip <[EMAIL PROTECTED]> wrote:
> I'm looking for a tool that spiders a site and downloads every page
> in the domain that it finds linked from a particular URL (and from
> the URLs linked in turn), creating a local copy that can be
> manipulated offline as static HTML.
>
> Is there such a tool for Linux (better still, Debian)?

It sounds like you want wget's mirror mode:

$ wget --mirror -k <base URL>

--mirror turns on recursive retrieval with infinite depth (plus
timestamping, so reruns only fetch pages that have changed), and -k
(--convert-links) rewrites the links in the downloaded pages so they
work locally.
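
If you also want the images and stylesheets each page needs, something
like this should give you a self-contained local copy
(http://example.com/start/ is just a placeholder for your starting URL):

$ wget --mirror -k -p -E --no-parent http://example.com/start/

-p (--page-requisites) pulls in inline images and CSS, -E
(--html-extension) saves pages with a .html suffix so they open cleanly
offline, and --no-parent keeps wget from wandering above the starting
directory.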

-- 
Michael A. Marsh
http://www.umiacs.umd.edu/~mmarsh
http://mamarsh.blogspot.com
http://36pints.blogspot.com

