Let's not forget wget.

I imagine Getleft is a lot like it, but wget comes with almost every distro
that has come out in the last few years. It can do complete mirrors of
remote sites, but be warned that pages that use JavaScript to open new pages
will fail. Unless Getleft is superhuman, it probably won't fare any better,
but I've never used that one.
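
For example, to mirror a manual like the one mentioned below, something along
these lines should do it (a rough sketch; check the man page, since the exact
flags vary a bit between wget versions):

  # recursive mirror, rewrite links for local viewing, don't wander up the tree
  wget --mirror --convert-links --no-parent http://jgo.local.net/LinuxGuide/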

wget is also really handy for d/ling hard-to-get links, because it has
infinite-retry and timeout capabilities. It works well over FTP too, and has
a tiny memory and CPU footprint. The glory of the command line!
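
Something like this is what I mean (the host and filename are made-up
placeholders):

  # retry forever, time out a stalled connection after 30 seconds,
  # and resume a partial download if there is one
  wget --tries=0 --timeout=30 --continue ftp://ftp.example.com/pub/big-manual.tar.gz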

Derek Stark
IT / Linux Admin
eSupportNow
xt 8952

PS: The list is MUCH faster lately.

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Haim Ashkenazi
Sent: Friday, January 26, 2001 1:53 AM
To: [EMAIL PROTECTED]
Subject: Re: [expert] d/ling docs from the web


Hi

I mainly use 2 tools for this. Search freshmeat.net for 'Getleft' and
'htmldoc'. The first can download a whole web page, including pictures, and
the second can convert the pages to PostScript or PDF (either from the hard
disk or directly from the web).
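
For example, once the pages are on disk, something like this should turn
them into a single PDF (the filenames are just placeholders; see
'htmldoc --help' for the options your version supports):

  htmldoc --webpage -f manual.pdf page1.html page2.html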


On Thu, Jan 25, 2001 at 06:11:19PM -0800, Homer Shimpsian wrote:
>
> I know I can't be the first person to want this.  I've been searching
> since the web was created.
>
>
> Does anyone know of a way to d/l and concatenate all the different web
> pages in an online manual to enable one to print the sucker?
>
>
> like this site:
> http://jgo.local.net/LinuxGuide/
>
>
> I imagine the difficulty in programming such a thing is when there are
> links on the page that are not part of the manual.  A TSR that allowed you
> to highlight the relevant links wouldn't be too impossible, right?
>

Have Fun
--
Haim

