> ___
> BackupPC-users mailing list
> BackupPC-users@lists.sourceforge.net
> List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
> Wiki: http://backuppc.
t, the mount should fail and BackupPC would error out
because its directory tree would not be present.
Perhaps it's not as pretty as the scripted solution you're using, but
I think it should "just work" otherwise.
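For what it's worth, that "just works" behavior can also be made explicit with a small pre-backup check; here is a sketch in Python, with the pool path being an assumption (adjust to your setup):

```python
import os

# Hypothetical pre-backup sanity check; the pool location is an
# assumption, not a BackupPC default you should rely on.
POOL_DIR = "/var/lib/backuppc"

def pool_is_mounted(path: str) -> bool:
    # os.path.ismount() returns False for a missing path, so an
    # unmounted or absent pool directory fails the check cleanly;
    # the listdir() guard also catches an empty mount point.
    return os.path.ismount(path) and len(os.listdir(path)) > 0
```

Wiring something like this into a pre-dump hook would turn the implicit "directory tree not present" failure into an explicit refusal to start.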
--
If you dedupe on the chunks, you
will most definitely see pain in the backup and restore processes
compared to the existing mechanism of deduping at the file level.
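The cost difference is easy to see in a sketch: chunk-level dedup means one hash and one pool-index lookup per chunk instead of per file (the chunk size here is an arbitrary assumption for illustration):

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # assumed chunk size, purely illustrative

def chunk_digests(data: bytes, size: int = CHUNK_SIZE) -> list[str]:
    # Chunk-level dedup: every chunk gets hashed and looked up in
    # the pool index, versus a single hash/lookup per file today.
    # That extra index traffic is where the backup/restore pain
    # comes from.
    return [hashlib.sha256(data[i:i + size]).hexdigest()
            for i in range(0, len(data), size)]
```

A 1 GB file becomes ~16,000 index lookups at this chunk size, where file-level pooling needs exactly one.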
--
Michael Barrow
michael at michaelbarrow dot name
essing on the backed up files, you'll take a greater
hit. In the long run, I'm not sure it's worth the additional cost of
impacting recovery time objectives and putting additional abuse on
the environment during backups.
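A rough illustration of that tradeoff, using zlib purely as a stand-in for whatever compression the backup applies (the payload and levels are arbitrary assumptions):

```python
import time
import zlib

# Heavier compression costs more CPU at backup time, and the
# restore pays the decompression cost again; this just makes the
# level-vs-time tradeoff visible on sample data.
payload = b"fairly repetitive backup data " * 4096

for level in (1, 6, 9):
    t0 = time.perf_counter()
    out = zlib.compress(payload, level)
    elapsed = time.perf_counter() - t0
    print(f"level {level}: {len(out)} bytes in {elapsed * 1000:.3f} ms")
```

On real mixed data the size win at the higher levels is usually much smaller than on repetitive sample data, while the CPU cost still grows, which is the recovery-time concern above.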
--
Michael Barrow
michael at michaelbarrow dot name
On Nov 14, 2007, at 11:41 AM, Michael Barrow wrote:
On Nov 14, 2007, at 10:19 AM, Gene Horodecki wrote:
Hi there.. I just did my first big backup with backuppc and to be
honest the results were a little disappointing.. It's taken 6 hours
now at approximately 80% CPU to back up my
If you turn off compression now, the next backup will snag all of the
files again and write them into the uncompressed pool.
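A toy model of why that happens (an assumption for illustration, not BackupPC's exact on-disk scheme): compressed and uncompressed copies are pooled in separate content-addressed trees, so after flipping the setting, every file hashes into a pool where it doesn't exist yet and gets stored all over again:

```python
import hashlib
import os

# Toy pooling model: "pool" holds uncompressed files, "cpool"
# compressed ones, both keyed by content hash. A lookup in one
# tree can never be satisfied by a copy in the other.
def pool_path(topdir: str, data: bytes, compressed: bool) -> str:
    digest = hashlib.md5(data).hexdigest()
    tree = "cpool" if compressed else "pool"
    return os.path.join(topdir, tree, digest[:2], digest)
```

Identical content thus maps to two different paths depending on the compression flag, which is why the next backup re-copies everything.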
--
Michael Barrow
michael at michaelbarrow dot name