On Wed, Sep 12, 2007 at 10:15:50PM +0100, Claus Reinke wrote:
>
>
> >Aren't there ways of downloading whole websites onto your
> >local machine? Kind of "follow all links from <url> but stop
> >when going outside of site <site>". I'm hopelessly ignorant,
> >but this seems like something that must exist.
>
> yes, and I was seriously considering writing a script for wget
> to do that,
FWIW, depending on exactly what you want to do, something like
wget -w 1 -r -l <depth> -np <url>
perhaps with a reject list (-R) if things like page histories end up in the
wrong place, may do what you want. The problems I've snipped still apply,
of course.
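
For concreteness, a fuller invocation along these lines might look as
follows; the start URL, the depth given to -l, and the -R pattern are only
placeholders, not something I've tested against the actual site:

    # -w 1   wait one second between requests (be polite to the server)
    # -r     recursive retrieval, following links from the start page
    # -l 3   limit the recursion depth (3 is an arbitrary choice)
    # -np    never ascend above the starting directory
    # -R     reject list, e.g. to skip page-history links (pattern is a guess)
    wget -w 1 -r -l 3 -np -R '*action=history*' http://example.org/wiki/StartPage

Note that recursive wget stays on the starting host unless you pass -H, so
the "stop when going outside of site <site>" part comes for free.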
Thanks
Ian
_______________________________________________
Cvs-ghc mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/cvs-ghc