Are there any generalized utility programs that will grab a web page, extract the text, and convert it to a text (or fill-in-the-blanks) file for printing?
I'm getting ready to work on some Python code to do that for printing the Slackware users' manual, but it would be nice to have a real tool.
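Something along these lines is what I had in mind -- just a rough sketch using only the Python standard library, and the URL at the bottom is only a placeholder, not necessarily where the manual actually lives:

#!/usr/bin/env python3
"""Fetch a web page and dump its visible text so it can be printed."""
from html.parser import HTMLParser
from urllib.request import urlopen


class TextExtractor(HTMLParser):
    """Collect text content, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Only keep text that is outside script/style and not pure whitespace.
        if not self._skip and data.strip():
            self.parts.append(data.strip())


def page_to_text(url):
    """Download the page at `url` and return its text content as one string."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.parts)


if __name__ == "__main__":
    # Placeholder URL -- point this at the page you actually want to print.
    print(page_to_text("http://www.slackware.com/book/"))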
html2jpg creates JPEGs (basically screenshots) of web pages: http://freshmeat.net/projects/html2jpg/
html2ps converts HTML to PostScript: http://freshmeat.net/projects/html2ps/
html2pdf converts HTML to PDF: http://freshmeat.net/projects/html2pdf/
