I've got it set very high, something like 700000.
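For reference, this is the setting in BackupPC's config.pl (commonly /etc/BackupPC/config.pl or __TOPDIR__/conf/config.pl, depending on the install; the path here is an assumption):

```
# Per-client timeout in seconds: BackupPC aborts the transfer if the
# backup takes longer than this. 700000 seconds is roughly 8 days.
$Conf{ClientTimeout} = 700000;
```

This can also be overridden per host in that host's own config file.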

Here is the error from the XferLOG:
same     644 1002/1002   40960 www/xxx/docs.stage/files/national/news/project_costing_survey.doc
same     644 1002/1002  122930 www/xxx/docs.stage/files/national/news/project_costing_survey.pdf
create d 755 1002/1002    4096 www/xxx/docs.stage/files/new_south_wales
create d 755 1002/1002    4096 www/xxx/docs.stage/files/new_south_wales/communique
same     644 1002/1002   78826 www/xxx/docs.stage/files/new_south_wales/communique/aca_communique_03-03-05.pdf
same     644 1002/1002   51831 www/xxx/docs.stage/files/new_south_wales/communique/aca_communique_05-01-05.pdf
Read EOF:
Tried again: got 0 bytes
Child is aborting
Parent read EOF from child: fatal error!
Done: 14625 files, 1369509991 bytes
Got fatal error during xfer (Child exited prematurely)
Backup aborted (Child exited prematurely)


On 3/9/06, Craig Barratt <[EMAIL PROTECTED]> wrote:
Stephen Vaughan writes:

> Is there any way to get BackupPC to continue the backup regardless of
> errors? I had it backing up a 2 GB db file, and during the transfer the
> file was modified; BackupPC recognised this and aborted the backup.
>
> Remote[1]: send_files failed to open
> misc/backups/netchant-db.blobdump: No such file or directory
> misc/backups/netchant-db.dump: md4 doesn't match: will retry in phase
> 1; file removed
> [ skipped 11751 lines ]
> Can't write 32840 bytes to socket
> [ skipped 3882 lines ]
> Read EOF:
> Tried again: got 0 bytes
> Child is aborting
> Parent read EOF from child: fatal error!
> Done: 14625 files, 1369509991 bytes
> Got fatal error during xfer (Child exited prematurely)
> Backup aborted (Child exited prematurely)

It is continuing: there are 11751 files mentioned before
the next error.

What is the value of $Conf{ClientTimeout}?  If it is 7200
please increase it by 20x.

Craig



--
Best Regards,
Stephen

