John Rouillard wrote at about 20:55:14 +0000 on Wednesday, April 8, 2009:
> On Wed, Apr 08, 2009 at 03:29:12PM -0500, Carl Wilhelm Soderstrom wrote:
> > On 04/08 10:19 , Boniforti Flavio wrote:
> > > OK, so your suggestion is actually to leave them compressed, right?
> > > If so, what may I do to avoid re-syncing of many GB of data?
> >
> > BackupPC uncompresses the files in the course of comparing the old ones
> > to the new ones, or else compares stored checksums. Either way, it is
> > very efficient about not re-transferring files that haven't changed.
>
> If I understood the OP, he already has these files on his backup server
> from another backup system. He wants to import these files into
> BackupPC's filestore rather than have BackupPC transfer them all over
> again.
>
> I have wanted to do the same, and a BackupPC_compress to complement
> BackupPC_zcat would be a real win.

It's quite easy to do. I have written various stubs that do some or all of that.
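For illustration, here is a minimal sketch of what such a BackupPC_compress stub could do, assuming the plain (non-checksum-cached) pool format is an ordinary zlib deflate stream, i.e. the thing BackupPC_zcat reverses. The function name `backuppc_compress` is hypothetical; a real tool should write through BackupPC::FileZIO so the output matches your installation's on-disk format exactly (including the rsync checksum-caching variant, which appends extra data).

```python
import zlib

def backuppc_compress(src_path, dst_path, level=3):
    """Hypothetical stub: compress a plain file into a zlib deflate
    stream, the assumed format of a non-checksum-cached BackupPC pool
    file.  Level 3 mirrors BackupPC's default $Conf{CompressLevel}."""
    compressor = zlib.compressobj(level)
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(65536)        # stream in 64 KiB blocks
            if not chunk:
                break
            dst.write(compressor.compress(chunk))
        dst.write(compressor.flush())      # emit any buffered tail
```

Anything written this way can be sanity-checked by inflating it back (with BackupPC_zcat, or plain zlib) before linking it into the pool.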
> The BackupPC_compress would compress the file as though it had been
> processed into the ROOT/pc/hostname/... tree. Then I could just copy the
> file into place, manually create a NewFileList file, and run
> BackupPC_link over the new tree to get the files into the pool (or use
> the donated BackupPC_fixlinks).
>
> For extra credit, a:
>
>     BackupPC_import -H hostname -s share directory
>
> that takes a file tree located at directory and imports that file tree
> as though it were a backup done for the share "share" on hostname would
> be great as well. Maybe a Google Summer of Code idea?

More like Afternoon of Code - I mean, it's not very hard.

> In the absence of those commands:
>
> http://backuppc.wiki.sourceforge.net/How+to+import+data+for+a+backup
>
> or using Google to search for "backuppc import" should provide some
> ideas.
>
> --
> -- rouilj
>
> John Rouillard       System Administrator
> Renesys Corporation  603-244-9084 (cell)  603-643-9300 x 111
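The copy-into-place / NewFileList / BackupPC_link recipe quoted above can be sketched as follows. Everything here is an assumption to verify against your installation: the name `stage_for_link` is made up, the pc/HOST/NUM layout follows BackupPC 3.x, file names inside a backup use BackupPC's "f"-prefix mangling, and the real NewFileList entries written by BackupPC_dump also carry a pool digest, which this sketch omits.

```python
import os
import shutil

def stage_for_link(compressed_file, topdir, host, backup_num, rel_path):
    """Hypothetical helper: copy an already-compressed file into the
    pc/ tree and record it in a NewFileList so BackupPC_link can pool
    it.  rel_path is assumed to already use BackupPC's name mangling
    (e.g. 'f%2fshare/fetc/fpasswd')."""
    backup_dir = os.path.join(topdir, "pc", host, str(backup_num))
    dest = os.path.join(backup_dir, rel_path)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copy2(compressed_file, dest)
    # Assumed name and format: one line per new file.  The entries that
    # BackupPC_dump actually writes also include the pool digest.
    nfl_path = os.path.join(topdir, "pc", host, f"NewFileList.{backup_num}")
    with open(nfl_path, "a") as nfl:
        nfl.write(rel_path + "\n")
    return dest
```

After staging, you would run BackupPC_link for the host as the backuppc user (or use the donated BackupPC_fixlinks) so the staged files get hard-linked into the pool.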
------------------------------------------------------------------------------
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/