On Sun, 28 Feb 1999, Ben Handley wrote:

> I am much the same, but would like to do it with web pages so that i can
> read them offline later. How hard is it to get our server to dial up
> automatically at about 5 in the morning, then get some pages and everything
> linked to from them (even better, without following external links to ads
> etc), and then for the other computers to get these pages later on in the
> day, and for everything to be destroyed the next time around? Is there some
> program that will help me with this?

I suppose you could use cron to schedule the dialup, and if all you want is
the text on the pages you could use lynx with the -dump option. For
example, here is what I do to get my newspaper every morning:

lynx -dump http://www.nacion.co.cr | mail ti2dll
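For the fuller setup Ben describes (fetch whole pages plus what they link to,
refresh each morning), something like the crontab entry below might work with
wget instead of lynx. The ppp-up/ppp-down script names, the mirror directory,
and the URL are just placeholders for whatever your own dialup scripts and
setup use, so treat this as a sketch rather than a tested recipe:

```shell
# crontab entry: minute hour day-of-month month day-of-week command
# Runs at 5:00 every morning. ppp-up/ppp-down are placeholder names for
# your own dialup connect/disconnect scripts.
0 5 * * * /usr/local/bin/ppp-up && \
    rm -rf /var/spool/mirror && \
    wget --recursive --level=1 --convert-links --no-parent \
         --directory-prefix=/var/spool/mirror \
         http://www.nacion.co.cr/ ; \
    /usr/local/bin/ppp-down
```

The rm -rf takes care of destroying the old copy each time around, and wget
stays on the starting host by default, which keeps it from chasing external
ad links. The other machines could then read the mirrored pages out of that
directory over NFS, or through a local httpd pointed at it.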

-----
//\/\ario //\/\elendez- TI2DLL
[EMAIL PROTECTED]
http://desvelo.cjb.net
Happily running Linux since 10/28/1998 (Y2K compatible! :)
