Gerald Brandt wrote:
>
> > The backuppc archive host setup is (a) somewhat hard to automate, and
> > (b) gives a complete tar image per host. You not only lose the history
> > here but you have to store multiple copies of anything redundant. If
> > you back up many machines storing copies of the same large files, it is
> > possible that a single archive export will be larger than the whole
> > stored (and pooled) history of the machines. On the other hand, you
> > don't need any special tools to recover the exported tar copy.
> >
>
> Hi,
>
> That's pretty much what I believed happened. Offsite backups are one of
> BackupPC's weaknesses, and I was comfortable with the shortcomings of
> the archive.
>
> If anyone has something that is better, faster, cleaner, etc., I'm all
> ears. But I've been following the threads and haven't seen anything
> that would work for me yet.
Depending on the size of your filesystem, you might be able to make an
image copy onto an external drive or RAID. Or create the filesystem on a
RAID array with an extra 'missing' device that you periodically add and
let re-sync. You'd want at least two spare devices to rotate, so that one
copy can always stay offsite while the other is attached to the live
array.

--
  Les Mikesell
   lesmikes...@gmail.com
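
As a rough sketch of that 'missing' device rotation (not anything from the
thread itself): assume the BackupPC pool sits on an md RAID1 array that was
created with one member listed as 'missing', e.g.
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 missing.
The device names /dev/md0 and /dev/sdb1 below are only placeholders, and a
small Python wrapper around mdadm could drive the add / re-sync / remove
cycle something like this:

#!/usr/bin/env python3
# Sketch only: rotate an external RAID1 member in and out of the md array
# that holds the BackupPC pool.  Device names are placeholders, not taken
# from the original thread.
import os
import subprocess
import time

ARRAY = "/dev/md0"      # md array created with one member 'missing' (assumption)
OFFSITE = "/dev/sdb1"   # external drive partition that travels offsite (assumption)

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def resync_finished():
    # md reports "idle" here once the new member is fully rebuilt
    path = os.path.join("/sys/block", os.path.basename(ARRAY), "md", "sync_action")
    with open(path) as f:
        return f.read().strip() == "idle"

# 1. Attach the offsite disk; md starts copying the live data onto it.
run("mdadm", ARRAY, "--add", OFFSITE)
time.sleep(10)          # give the rebuild a moment to start

# 2. Wait until the mirror is in sync.
while not resync_finished():
    time.sleep(60)

# 3. Mark the member failed and remove it so the drive can be unplugged and
#    taken offsite; the array drops back to its degraded (but live) state.
run("mdadm", ARRAY, "--fail", OFFSITE)
run("mdadm", ARRAY, "--remove", OFFSITE)

With two such drives you'd just alternate which one gets plugged in, so one
complete copy is always somewhere else while the other is re-syncing.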