Unfortunately, yes.

What you might want to do is put some of the larger directories in the 
BackupFilesExclude setting for that client.  Then, do a full backup.  
After that backup succeeds, remove one of the excluded directories and 
trigger another backup.  Rinse, repeat.
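
As a rough sketch, in the client's per-PC config file it would look 
something like this (the share name and directory paths below are just 
placeholders; use your actual rsyncd module name and your largest 
directories):

    # Exclude the biggest directories from the first full backup;
    # the hash key is the share/module name ('*' would apply to all shares).
    $Conf{BackupFilesExclude} = {
        'myshare' => [
            '/some/large/dir1',
            '/some/large/dir2',
        ],
    };

After each successful full, delete one entry from the list so the next 
backup picks up that directory.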

This way you will populate the set of files that BackupPC knows about, 
and subsequent backups will skip them quickly and move on to the 'new' 
files, transferring only those.  Once the whole machine has had all its 
files backed up once, even a slow connection is pretty dependable for 
backups.

In my experience, attempting a single very long backup over a slow 
connection is doomed to fail repeatedly for various reasons (loss of 
connection, reboot, network hiccup, etc.).

Hope that helps,
JH

Yves Trudeau wrote:
> Hi,
>     we are experimenting with BackupPC and we are backing up a 30 GB 
> share over the Internet with rsyncd.  This morning, after more than 
> 30 hours of transfer, the remote host was accidentally rebooted and 
> the connection was lost for a few minutes.  BackupPC restarted the 
> backup very nicely, but the contents of the "new" folder seem to be 
> lost and all the files are being transferred again.  Is this the 
> normal behavior of BackupPC?  We use 3.0.0beta3.
>
> Yves
