Re: [expert] d/ling docs from the web

2001-01-25 Thread Haim Ashkenazi

Hi

I mainly use two tools for this: search freshmeat.net for 'Getleft' and 'htmldoc'. The
first can download a whole web page, including pictures, and the second can convert the
pages to PostScript or PDF (either from the hard disk or directly from the web).
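
For example, if you've already pulled the pages down to disk, something like

    htmldoc --webpage -f manual.pdf *.html

should give you a single PDF (use --book instead of --webpage if you want it to build
chapters from the headings). I'm writing the flags from memory, so check 'htmldoc --help'
first.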


On Thu, Jan 25, 2001 at 06:11:19PM -0800, Homer Shimpsian wrote:
> 
> I know I can't be the first person to want this.  I've been searching
> since the web was created.
> 
> Does anyone know of a way to d/l and concatenate all the different web pages
> in an online manual to enable one to print the sucker?
> 
> Like this site:
> http://jgo.local.net/LinuxGuide/
> 
> I imagine the difficulty in programming such a thing is when there are links
> on the page that are not part of the manual.  A TSR that allowed you to
> highlight the relevant links wouldn't be impossible, right?

Have Fun
-- 
Haim




Re: [expert] d/ling docs from the web

2001-01-25 Thread Kelley Terry

On Thursday 25 January 2001 07:11 pm, Homer Shimpsian wrote:
> I know I can't be the first person to want this.  I've been searching
> since the web was created.
>
> Does anyone know of a way to d/l and concatenate all the different web
> pages in an online manual to enable one to print the sucker?
>
> Like this site:
> http://jgo.local.net/LinuxGuide/
>
> I imagine the difficulty in programming such a thing is when there are
> links on the page that are not part of the manual.  A TSR that allowed you
> to highlight the relevant links wouldn't be impossible, right?


I've just used Konqueror with a split window to drag, drop, and copy each page
of other on-line manuals.  I started with yours: I created a new folder called
linux_guide and copied each link into it, until I got to the one that says
"Download everything".  You can copy that page too, or open it and follow the
instructions to download the entire manual in different formats.
If you copy every link into the folder, the folder can then be opened with a
browser and used just like the original.  If you're online and the browser
recognizes a URL, it will follow it; if it can't resolve the URL (say, one that
points back to the original host), it won't work.  I haven't had that happen to
me yet.  Also, graphics are left out of the finished manual this way - I've
tried to get the graphics separately but haven't figured out a way yet.  Let me
know if you find a way.

Kelley Terry <[EMAIL PROTECTED]>




RE: [expert] d/ling docs from the web

2001-01-26 Thread D. Stark - eSN

Let's not forget wget.

I imagine Getleft is a lot like it, but wget comes with almost every distro
that has come out in the last few years. It can make complete mirrors of
remote sites, but be warned that pages which use JavaScript to open new pages
will fail. Unless Getleft is superhuman, it probably won't fare any better, but
I've never used it.

wget is really handy for downloading hard-to-get links, because it has infinite
retry and timeout options. It works well over FTP as well, and has a tiny
memory and CPU footprint. The glory of the command line!
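
For example (flags from memory, so check the man page), something like

    wget -r -l 2 -k -np -t 0 -T 30 http://jgo.local.net/LinuxGuide/

would grab the manual two links deep: -r/-l control the recursion depth, -k
rewrites the links so the copy browses offline, -np stops it climbing above the
manual's directory, and -t 0 / -T 30 give you the infinite retries and a
30-second timeout I mentioned.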

Derek Stark
IT / Linux Admin
eSupportNow
xt 8952

PS: The list is MUCH faster lately.






RE: [expert] d/ling docs from the web

2001-01-26 Thread Homer Shimpsian

Yeah, I saw that "Download everything" link as well, but you don't always have
that luxury.


So, you're basically cutting and pasting?


I'm thinking of something like one of those web-cache programs where you can
point it at a site, set how many levels (links) deep you want to go, and it
will download everything for offline browsing.  Thinking about it, I guess that
would suit my purposes, and while it would be nice to have a single HTML file
for various reasons like transferring it around, I suppose I could just zip the
whole shebang.  (It's going to create a folder for each of those chapters.)
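
(Untested guess on my part, but I imagine something along the lines of

    wget -r -l 3 -k http://jgo.local.net/LinuxGuide/
    zip -r linuxguide.zip jgo.local.net/

is roughly what I'm describing.)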










Re: [expert] d/ling docs from the web

2001-01-26 Thread Haim Ashkenazi

Well, I couldn't agree more about wget; it's my favorite for downloading
everything. The only advantage of Getleft over wget is that it quickly and
easily lets you choose which of the files linked from a page (pictures, zip
files, etc.) you want to download.
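
You can get part of the way there with wget's accept/reject lists, e.g. (from
memory, so double-check the man page)

    wget -r -l 1 -A jpg,gif,png,zip http://jgo.local.net/LinuxGuide/

but Getleft lets you tick the files off interactively, which is quicker.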



-- 
Haim