I'm running into a problem with backups on my larger machines. About 90% of 
the time, the backup ends with the following message:

backup failed (Tar exited with error 256 () status)

I believe I read somewhere that this is caused by a file changing during the 
backup, probably in combination with the latency introduced by backing up 
across the network.
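
For reference, as far as I can tell the 256 is the raw wait status; the actual 
tar exit code lives in the high byte, so 256 works out to tar exiting with 
status 1, which GNU tar uses for non-fatal warnings such as "file changed as 
we read it". A quick Python sketch (purely illustrative) of that decoding:

    # Illustrative only: how a raw wait status of 256 maps to the exit code.
    # Perl's $? and Python's os.wait() use the same encoding: exit code in
    # the high byte, signal information in the low byte.
    import os

    raw_status = 256               # the number from the BackupPC error message
    exit_code = raw_status >> 8    # same as os.WEXITSTATUS(raw_status) on POSIX
    print(exit_code)               # prints 1 -> GNU tar's non-fatal warning status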

The reason I went with tar in the first place is that I had read rsync's memory 
use grows with the size of the file list, and this box has 256MB of RAM.

My question at this point is what the best approach to fixing this would be. I 
have been running BackupPC since the beginning of the year, so I have at least 
two fulls and a week's worth of incrementals; at least in theory, the number of 
files being rsynced should not be overly large. Should I convert just my 
"problem children" to rsync, should I convert everything over to rsync, or is 
there a workaround for tar?
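
One workaround I've seen mentioned (untested here, so treat it as a sketch) is 
to wrap the client-side tar invocation in a small script, presumably pointed to 
via $Conf{TarClientCmd}, that remaps exit status 1 back to 0 so the "file 
changed as we read it" case no longer makes BackupPC mark the backup as failed, 
while real errors (2 and up) still get passed through. The wrapper below is in 
Python just for illustration:

    #!/usr/bin/env python
    # Hypothetical tar wrapper for the BackupPC tar transfer method:
    # run tar with whatever arguments BackupPC passes, but report
    # success when tar exits 1 (files changed while being read).
    # Genuine errors (exit status 2 or higher) pass through unchanged.
    import subprocess
    import sys

    rc = subprocess.call(["tar"] + sys.argv[1:])
    sys.exit(0 if rc == 1 else rc)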

Thanks,
--b
