Boniforti Flavio wrote:
> On 08.04.09 22:11, "Les Mikesell" <lesmikes...@gmail.com> wrote:
>
>> They are compressed if you leave the default setting enabled in your
>> configuration, and all duplicate copies of the same file (whether found
>> on the same target or not) are hardlinked to one pooled instance. That
>> means that backing up multiple instances of similar hosts or keeping a
>> long history of backups will only use additional space for the
>> differences (plus a small amount for the directory structure).
>
> OK, so your suggestion is actually to leave them compressed, right?
> If so, what may I do to avoid re-syncing of many GB of data?
I'm not sure I understand. The only way BackupPC will know the files
already exist is if they are present in a previous full run, stored in
the mangled pool format (compression doesn't matter). If you are doing a
remote copy over a slow link, it might be worth setting up what looks
like a local copy of the same data on a nearby machine, running the
first full against that via the ClientNameAlias feature, and then
reconfiguring to point at the real target. On a local LAN I'd just let
it copy again.

One thing you can do to help get started is to set up some includes or
excludes so that only a small portion is transferred, letting a run
complete. Then you can add more back in for the next pass.

-- 
Les Mikesell
lesmikes...@gmail.com

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
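[Editor's note: the two approaches above can be sketched as entries in a
BackupPC per-host configuration file. This is a hedged illustration, not
text from the thread: the hostnames and paths are placeholders, and the
file location shown is only the conventional one.]

```perl
# Hypothetical per-host config, e.g. /etc/BackupPC/pc/myhost.pl.
# Hostnames and paths below are placeholders, not from the thread.

# Approach 1: seed the first full backup from a reachable stand-in
# machine holding a copy of the same data. Once the pool is populated,
# remove this line so later backups go to the real (slow-link) target;
# matching files will then be found in the pool instead of re-sent.
$Conf{ClientNameAlias} = 'local-seed-host';

# Approach 2: limit the first run to a small subset so a full backup
# can actually complete, then widen this list on later passes.
$Conf{BackupFilesOnly} = ['/etc', '/home/small-project'];

# Per the advice above, leave pool compression at its default rather
# than disabling it (a positive CompressLevel enables compression).
$Conf{CompressLevel} = 3;
```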