On Wed, 17 Oct 2001, Thomas M. Albright wrote:

> On Wed, 17 Oct 2001, Marc Nozell wrote:
>
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > On Wednesday 17 October 2001 05:23 pm, Thomas M. Albright wrote:
> > > OK, I can't afford the Red Hat Network, but I'd like to keep my
> > > system as up-to-date as possible.
> >
> > Red Hat gives everyone a free account to keep one system updated. You
> > could register one of your systems and then be sure to apply the
> > updates on all your other machines. Do an occasional comparison of
> > 'rpm -qa' to make sure you haven't forgotten anything.
> >
> > Of course it gets more tricky if you are at different release levels,
> > don't always use RPM, use different hardware platforms, etc.
>
> And therein lies the rub. My server [tarogue.net, a 6.2 system] has the
> account. My workstation at work is a 7.0 system, my laptop is 7.1, and my
> systems at home are three 6.2 systems and two 7.1 systems. They are all
> pretty much custom set-ups; therefore, no two are alike.
>
> The single system with the account has no X installed, and therefore no
> X programs. It needs to have the account because it's the most public
> machine I have. I have two other machines that actually touch the
> internet directly, but they're both bare-bones firewall/gateway set-ups.
>
> Currently I ftp everything from ftp://updates.redhat.com/ relevant to
> all of my systems and architectures, burn a couple of CDs, and take 'em
> home. The problems are overwriting existing files, accumulating
> old/obsolete files, and (especially) finding the time to do these
> things.
>
> If I could just cron an rsync, one at the office and one at home, I
> could then use the local LAN to move files to the machines that need
> 'em.
>
> Now that my story has been expanded, can anyone help?
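[For reference, the cron'd rsync described above might be sketched like this. The mirror hostname, module path, and local destination directories below are placeholders, not anything from the original post, and the mirror chosen would have to actually offer rsync:]

```shell
# Hypothetical crontab entries (installed with `crontab -e`).
# mirror.example.org and the paths are placeholders -- substitute a
# real mirror that exports the updates tree over rsync.
# Pull 6.2 and 7.1 updates nightly; --delete removes local copies of
# packages that have disappeared upstream (the old/obsolete files).
30 3 * * * rsync -av --delete rsync://mirror.example.org/updates/6.2/en/os/ /var/spool/rh-updates/6.2/
35 3 * * * rsync -av --delete rsync://mirror.example.org/updates/7.1/en/os/ /var/spool/rh-updates/7.1/
```

[The `--delete` flag is what addresses the "building up old/obsolete files" complaint: the local tree is made to match the mirror exactly on each run.]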
If I remember correctly, Red Hat only allows rsync to the mirrors (and
then it's from different servers). I've got to think that you could use
ncftp/ncftpget to do this, as it will skip files of the same size/date,
so you won't get duplicates. As for deleting the old files (i.e., when a
new update gets released), you could script that: get the output of ls
from the Red Hat server and delete whatever you have that they don't. I
did something like this a while ago when I was building Ximian on Alpha,
as they didn't offer rsync either.

--rdp

--
Rich Payne
[EMAIL PROTECTED]
www.alphalinux.org

**********************************************************
To unsubscribe from this list, send mail to [EMAIL PROTECTED]
with the following text in the *body* (*not* the subject line)
of the letter: unsubscribe gnhlug
**********************************************************
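[The "delete whatever you have that they don't" step suggested above can be scripted roughly as below. This is a sketch under assumptions: in real use the remote listing would come from an FTP directory listing of updates.redhat.com (e.g. via ncftp's listing output, not shown here); to keep the comparison logic visible and self-contained, the remote listing and local mirror are simulated with throwaway files.]

```shell
# Sketch: prune local packages that no longer appear in a remote listing.
# The package names and paths below are made up for demonstration.
workdir=$(mktemp -d)
mkdir -p "$workdir/local"

# Simulated local mirror: one current package, one obsoleted package.
touch "$workdir/local/pkg-1.0-1.i386.rpm" "$workdir/local/pkg-1.0-2.i386.rpm"
# Simulated remote listing (in real use: an ls from the FTP server).
printf 'pkg-1.0-2.i386.rpm\n' > "$workdir/remote.lst"

# comm(1) needs sorted input; -23 prints lines only in the first file,
# i.e. local files that the server no longer carries.
ls "$workdir/local" | sort > "$workdir/local.lst"
sort "$workdir/remote.lst" > "$workdir/remote.sorted"
comm -23 "$workdir/local.lst" "$workdir/remote.sorted" |
while read -r f; do
    rm -f "$workdir/local/$f"
done

ls "$workdir/local"    # prints pkg-1.0-2.i386.rpm
```

[The same comparison works whatever tool fetches the listing; only the source of `remote.lst` changes.]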