On Sun, 28 Feb 1999, Ben Handley wrote:

> I am much the same, but would like to do it with web pages so that I can
> read them offline later. How hard is it to get our server to dial up
> automatically at about 5 in the morning, then fetch some pages and
> everything linked from them (even better, without following external links
> to ads etc.), and then for the other computers to get these pages later in
> the day, and for everything to be destroyed the next time around? Is there
> some program that will help me with this?
> 
> Thanks,
> Ben
> 
Have a look at wget. It downloads web sites recursively and can be
configured to stop at a certain depth or at site boundaries (by default it
stays on the starting host, so external links to ads etc. aren't followed).
Destroying everything is as simple as rm -r (or am I missing something?)
Put the whole lot in a shell script, e.g.:

#!/bin/bash
cd /mirrors || exit 1           # keep all mirrored sites in one place
rm -rf slashdot.org             # or whatever site you're mirroring; -f so a
                                # missing directory doesn't abort the script
ppp-on                          # or whatever you use to start your dialup link
sleep 60                        # crude wait for the link to come up; ppp-on
                                # usually returns before PPP is established
wget -m http://slashdot.org/    # -m turns on mirroring (recursive retrieval
                                # with timestamping); and of course pick
                                # another site
ppp-off                         # or whatever you use to stop your dialup link
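
If you'd rather not pull the whole site, wget can be told how deep to go.
A minimal sketch (check your wget man page; -l sets the recursion depth,
-np stops it climbing above the start page, -k converts the links so the
saved pages work offline):

wget -r -l 2 -np -k http://slashdot.org/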


Run the script as a cron job.
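
For the 5 a.m. part, a crontab entry along these lines would do it (the
script path is just an example; point it at wherever you save the script):

0 5 * * * /usr/local/bin/mirror-site.sh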

Frank
