I'm experiencing something and I just want to check that I understand what is happening.
I back up Linux to Linux with XferMethod = rsync. Occasionally a full backup of one of my large systems fails for one reason or another. When that happens, I exclude some big directories, run the full again, and then let the subsequent incrementals pick up the previously excluded directories.

However, and this is the part I'm not completely sure of, each subsequent incremental copies the originally missing files over and over and declares them "pooled" rather than "same". I think this is the way it is designed, but it is a problem in this case, because copying these large files from large directories has a strong performance impact on the client machine (which annoys my users).

Is this indeed what is happening? And is there anything I can do about it, short of running an unscheduled full backup?

Tony Schreiner
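For reference, the temporary exclusion described above would normally be done with $Conf{BackupFilesExclude} in the per-host config; this is just an illustrative sketch, and the directory paths are hypothetical:

```perl
# Hypothetical per-host config, e.g. /etc/BackupPC/pc/<hostname>.pl.
# Keys are share names (for rsync, the share is typically a path like '/');
# values are lists of paths to skip during the backup. The paths below
# are made-up examples of "big directories" to exclude temporarily.
$Conf{BackupFilesExclude} = {
    '/' => [
        '/var/bigdata',      # hypothetical large directory
        '/home/archive',     # hypothetical large directory
    ],
};
```

Once a full backup completes with the excludes removed again, later backups should be able to match those files as "same" rather than re-transferring them on every incremental.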
