Question #163372 on Duplicity changed:
https://answers.launchpad.net/duplicity/+question/163372

virtual joe gave more information on the question:
I killed the verify also. It was taking forever as well...

I think I'm going to assume that it's not going to work, and I'll have to
settle for rdiff-backup, since I really don't see anything I've done wrong.
Perhaps, with the 4.3 million files on this system, rdiff-backup uses
hardlinks to do its magic or something.
I'm backing up the entire Linux system with it, including its Samba
share... I know there are bound to be 'small' changes to some files, but
not 49 GB or more of them... that's where I killed the duplicity
incrementals, at 49 GB (split into 26 MB volumes).
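For reference, a minimal sketch of how such an incremental might be driven; the paths and target URL below are hypothetical placeholders, and the 26 MB volumes mentioned above correspond to duplicity's `--volsize` setting (roughly 25 MB by default):

```shell
# Hedged sketch: SRC and TARGET are hypothetical placeholders, not the
# poster's actual setup.
SRC=/                          # whole system being backed up
TARGET=file:///mnt/backup      # local duplicity target (could be sftp://...)

# Only attempt the run if duplicity is actually available.
if command -v duplicity >/dev/null 2>&1; then
    # --volsize sets the size (in MB) of each backup volume;
    # --dry-run reports what would be backed up without writing volumes,
    # which is a cheap way to see how large an incremental would be.
    duplicity incremental --dry-run --volsize 50 "$SRC" "$TARGET"
else
    echo "duplicity not installed; skipping"
fi
```

A `--dry-run` pass like this can show whether duplicity really believes 49 GB of data changed before committing to writing the volumes.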

I would love to hear from others with half a terabyte of data or more
and millions of files, to see what kind of speed they're getting on
their incrementals and how big they are...

Duplicity alone is very handy for smaller backup jobs, I think, but
maybe I'll end up doing a duplicity backup of an rdiff-backup repository
and then rsyncing that encrypted duplicity copy of the rdiff-backup
directory over to the remote server...
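The plan above could be sketched roughly as follows; this is only an outline, assuming all three tools are installed, and every path and host below is a hypothetical placeholder:

```shell
# All paths/hosts are hypothetical placeholders for illustration.
SRC=/srv/data                         # data to protect
RDIFF_REPO=/backup/rdiff-repo         # local versioned mirror kept by rdiff-backup
DUP_OUT=file:///backup/duplicity-out  # encrypted duplicity volumes
REMOTE=user@remote:/backups/          # offsite destination for rsync

if command -v rdiff-backup >/dev/null 2>&1 &&
   command -v duplicity    >/dev/null 2>&1 &&
   command -v rsync        >/dev/null 2>&1; then
    rdiff-backup "$SRC" "$RDIFF_REPO"          # step 1: local mirror + history
    duplicity "$RDIFF_REPO" "$DUP_OUT"         # step 2: encrypt the repository
    rsync -a /backup/duplicity-out/ "$REMOTE"  # step 3: ship volumes offsite
else
    echo "one or more tools missing; skipping sketch"
fi
```

One design note on this layering: the rsync step only ever sees opaque, already-encrypted duplicity volumes, so the remote server never needs to understand either repository format.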

Would there be any issues with this plan? Is it a bad idea for some
reason? Could duplicity mess up the rdiff-backup structure because the
repository might contain hardlinks, or is there any other reason it
would be a bad idea?
thanks
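On the hardlink worry specifically: one way to check whether a directory tree (such as an rdiff-backup repository) actually contains hardlinked files is to look for regular files with a link count above 1. A self-contained demonstration on a scratch directory:

```shell
# Demo on a scratch directory; point "find" at the real repository to check it.
dir=$(mktemp -d)
echo data > "$dir/a"
ln "$dir/a" "$dir/b"   # hardlink: both names now share one inode

# Regular files with more than one hard link:
find "$dir" -type f -links +1

# Count them (both names of a linked pair are reported):
hardlinked=$(find "$dir" -type f -links +1 | wc -l | tr -d ' ')
echo "$hardlinked hardlinked entries"

rm -rf "$dir"
```

If the count comes back zero on the real repository, hardlinks are not a factor; and as far as I know duplicity does not preserve hardlinks anyway, so linked files would simply be stored as independent copies rather than corrupting anything.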

-- 
You received this question notification because you are a member of
duplicity-team, which is an answer contact for Duplicity.

_______________________________________________
Mailing list: https://launchpad.net/~duplicity-team
Post to     : [email protected]
Unsubscribe : https://launchpad.net/~duplicity-team
More help   : https://help.launchpad.net/ListHelp
