On 12/14 04:25, Robin Lee Powell wrote:
> Not with large trees it isn't. I have 3.5 million files, and more
> than 300GiB of data, in one file system. The last incremental took
> *twenty one hours*. I have another backup that's 4.5 million files,
> also more than 300 GiB of data, also in one file system. The full
> took 20 hours; it hasn't succeeded at an incremental yet. That's
> over full 100BaseT, if not better (I'm not the networking person).
Pardon me if this has been mentioned already and I missed it, but I
presume you've already gone over both ends of the connection and
checked for possible optimizations?

- only one backup running at a time
- a faster ssh cipher (blowfish or arcfour)
- no LVM on your disks
- various RAID optimizations
- avoid swapping

I've had similar problems at times; I have had to subdivide a machine
into several separate backups because one part or another of it would
flake out and fail the whole backup. You may want to experiment with
the subdivisions being shares that are all backed up under one host
profile, or even separate host profiles. Each approach has its
advantages and disadvantages.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com

------------------------------------------------------------------------------
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
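
[Editor's note: as a rough sketch of the cipher and share-splitting
suggestions above, something like the following could go in a per-host
config.pl. The directory names are hypothetical examples, and this
assumes an rsync-over-ssh transfer method with an sshd that still
accepts the weaker ciphers; verify against your own setup.]

    # Ask ssh for a cheaper cipher (arcfour) to cut CPU overhead on
    # the transfer; this is BackupPC's stock RsyncClientCmd with
    # "-c arcfour" added.
    $Conf{RsyncClientCmd} = '$sshPath -c arcfour -q -x -l root $host $rsyncPath $argList+';

    # Split one huge filesystem into several smaller shares so a
    # failure in one subtree doesn't fail the whole backup.
    # Directory names below are made-up examples.
    $Conf{RsyncShareName} = ['/home/a-m', '/home/n-z', '/var'];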