Re: [BackupPC-users] avoidable failure?

2007-01-15 Thread Holger Parplies
Hi,

Cristian Tibirna wrote on 14.01.2007 at 13:16:29 [Re: [BackupPC-users] 
avoidable failure?]:
 On 14 January 2007 00:34, Holger Parplies wrote:
 [...]
  Might you simply need to increase your $Conf{ClientTimeout}?
  It would make sense that your backups take longer with busy client machines
  than with idle ones, after all.
 
 Interesting suggestion. I will try to investigate more in this direction. I 
 don't know exactly what should be done as a matter of testing, though, as the 
 errors aren't reproducible, as I mentioned in the beginning.

yes, there are always some things you can't really test :-(.
You could look at the logs of your failing backups, though, and check whether
the time they ran corresponds with your current (or former) setting of
$Conf{ClientTimeout}. If that is set to 7200 s (2 h) and you have failed
backups running 30 min, 43 min and 22 min and good ones running 15 min, 35
min and 65 min, then that's obviously the wrong track. If your good backups
are comparatively short, though, and the others fail after roughly 2 hours,
I'd simply try doubling the value and seeing whether the failures go away. As
I understand it, $Conf{ClientTimeout} is not a value that needs to be
fine-tuned to be only slightly larger than your longest backup, but rather a
measure to eventually detect and kill hung backups. The only really
problematic 'resource' a hung backup seems to consume is that it counts
toward $Conf{MaxBackups}, thus preventing or postponing other backups (well,
yes, a hung rsync might also consume a considerable amount of (swappable)
memory). I've read of people using timeout values of 72000 s (20 h) or more.
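
For the record, raising it is a one-line change in config.pl (or in a
per-host config override); the value below is just an illustration, not a
recommendation:

```perl
# Timeout (in seconds) after which BackupPC considers a backup hung and
# sends it an ALRM signal. 72000 s = 20 h; pick something safely above
# your longest plausible backup rather than a tight fit.
$Conf{ClientTimeout} = 72000;
```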

Regards,
Holger

-
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share your
opinions on IT & business topics through brief surveys - and earn cash
http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] avoidable failure?

2007-01-15 Thread Craig Barratt
Cristian writes:

 First, I'd like to thank Craig and his collaborators, who gave us this great 
 tool that simplifies our lives greatly. I have been using BackupPC for many 
 years now, in many settings, and I couldn't think of a better way of dealing 
 with this thorny requirement.

Thanks.

 So, once in a while, I get errors like this:
 
 -
 Xfer PIDs are now 9356,9357
 [ skipped 6674 lines ]
 finish: removing in-process file 
 ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
 [ skipped 39 lines ]
 Done: 15 files, 106665 bytes
 Got fatal error during xfer (aborted by signal=ALRM)
 Backup aborted by user signal
 ---

It is failing because an ALRM (alarm) signal got delivered to
the process.  You should try increasing $Conf{ClientTimeout}
significantly (eg: 10x).

Craig



Re: [BackupPC-users] avoidable failure?

2007-01-14 Thread Holger Parplies
Hi,

Randy Barlow wrote on 12.01.2007 at 11:58:59 [Re: [BackupPC-users] avoidable 
failure?]:
 Cristian Tibirna wrote:
  The file named in the error is almost always a temporary one. It is thus 
  conceivable that the file was created before rsync built its file list and 
  was destroyed before rsync finished syncing. But this is only my 
  supposition; I don't know exactly what happens.
  [...]
 
 Usually, temporary files are created in the /tmp or /var/tmp directories;
 I would recommend that you add these to your exclude list, since they can
 cause problems as you have noted, and also since there is no advantage to
 backing up temporary files.  Hope this helps!

while I perfectly agree that backing up temporary files has no advantage,
the quoted case was

Cristian Tibirna wrote on 12.01.2007 at 10:13:39 [[BackupPC-users] avoidable 
failure?]:
 [...]
 So, once in a while, I get errors like this:
 
 -
 Xfer PIDs are now 9356,9357
 [ skipped 6674 lines ]
 finish: removing in-process file 
 ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
 [ skipped 39 lines ]
 Done: 15 files, 106665 bytes
 Got fatal error during xfer (aborted by signal=ALRM)
 Backup aborted by user signal
 ---

which is an example of a temporary file that is obviously *not* located in
/tmp or /var/tmp and can't be made to be.

You can still exclude such files, as I understand it, by adding them to
$Conf{RsyncArgs} (at the end of the list). Something like

'--exclude=.makedep',
'--exclude=*.o',

should do the trick (modify to suit your needs). Note that adding '.makedep'
et al. to $Conf{BackupFilesExclude} will *not* work (at least not in version
2.1.1), as the code anchors relative paths to the root of the 'share'.
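
In a per-host config override, that might look like the following sketch
(the patterns are just examples; the point is to append to, not replace,
the stock argument list):

```perl
# Extend the default rsync arguments with extra excludes; BackupPC passes
# these straight through to rsync, so normal rsync pattern syntax applies.
$Conf{RsyncArgs} = [
    @{$Conf{RsyncArgs}},       # keep the stock arguments from config.pl
    '--exclude=.makedep',
    '--exclude=*.o',
];
```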

rsync also has a '--cvs-exclude' option, but that would probably exclude more
from your backups than you would want it to (eg. the CVS directories).


That said, 'aborted by signal=ALRM' does *not* sound like a temporary file
problem to me (though I don't know what a temporary file problem *would*
sound like). Might you simply need to increase your $Conf{ClientTimeout}? It
would make sense that your backups take longer with busy client machines
than with idle ones, after all.

Regards,
Holger



Re: [BackupPC-users] avoidable failure?

2007-01-14 Thread Cristian Tibirna
On 14 January 2007 00:34, Holger Parplies wrote:
 while I perfectly agree that backing up temporary files has no advantage,
 the quoted case was

Yes, indeed, I don't back up the temporary dirs, and I exclude lots of 
temporary (and generated) files.

 Cristian Tibirna wrote on 12.01.2007 at 10:13:39 [[BackupPC-users] avoidable 
failure?]:
  [...]
  So, once in a while, I get errors like this:
 
  -
  Xfer PIDs are now 9356,9357
  [ skipped 6674 lines ]
  finish: removing in-process file
  ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
  [ skipped 39 lines ]
  Done: 15 files, 106665 bytes
  Got fatal error during xfer (aborted by signal=ALRM)
  Backup aborted by user signal
  ---

 which is an example of a temporary file that is obviously *not* located in
 /tmp or /var/tmp and can't be made to be.

And it is only an example. More drastic cases are those of files that are 
indistinguishable from important ones but are nevertheless created 
temporarily (e.g. .c or .cpp files) by test scripts and such, and can appear 
and disappear during the backup's activity. Those I can't exclude. This is 
why I was asking whether there is some way (which I overlooked, or which 
isn't yet programmed into BackupPC) to ignore such errors and gracefully 
finish backups even if files got removed between list generation and actual 
copying.

 You can still exclude such files, as I understand it, by adding them to
 $Conf{RsyncArgs} (at the end of the list). Something like

   '--exclude=.makedep',
   '--exclude=*.o',

 should do the trick (modify to suit your needs). Note that adding
 '.makedep' et al. to $Conf{BackupFilesExclude} will *not* work (at least
 not in version 2.1.1), as the code anchors relative paths to the root of
 the 'share'.

It works with $Conf{BackupFilesExclude} too. We use it extensively (to ignore 
.mp3, .avi and .o files, mostly ;-)
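
For anyone curious, the hash form of that setting maps share names to
pattern lists, roughly like this (the share name and patterns below are
just examples):

```perl
# Per-share exclude patterns; each key is a share name as configured
# for the transfer method (e.g. in $Conf{RsyncShareName}).
$Conf{BackupFilesExclude} = {
    '/home' => ['*.mp3', '*.avi', '*.o'],
};
```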

 That said, 'aborted by signal=ALRM' does *not* sound like a temporary file
 problem to me (though I don't know what a temporary file problem *would*
 sound like). Might you simply need to increase your $Conf{ClientTimeout}?
 It would make sense that your backups take longer with busy client machines
 than with idle ones, after all.

Interesting suggestion. I will try to investigate more in this direction. I 
don't know exactly what should be done as a matter of testing, though, as the 
errors aren't reproducible, as I mentioned in the beginning.

Thanks a lot for the suggestions

-- 
Cristian Tibirna(418) 656-2131 / 4340
  Laval University - Québec, CAN ... http://www.giref.ulaval.ca/~ctibirna
  Research professional - GIREF ... [EMAIL PROTECTED]



[BackupPC-users] avoidable failure?

2007-01-12 Thread Cristian Tibirna

Hello

First, I'd like to thank Craig and his collaborators, who gave us this great 
tool that simplifies our lives greatly. I have been using BackupPC for many 
years now, in many settings, and I couldn't think of a better way of dealing 
with this thorny requirement.

My largest use is in a research network, containing a number of workstations, 
a cluster and two very busy servers. 

My network runs on Linux, the files are shared across the network via NFS, 
and the BackupPC transfer method is rsync.

Everything usually works fine, with the notable exception of occasionally 
failed backups of the servers and of the cluster's storage node.

The particularity of these two machines is that they are in heavy use as file 
servers. Although I do the backups at night, when people are supposed to be 
sleeping, the servers still see activity, due especially to automated 
compilation and testing tasks.

So, once in a while, I get errors like this:

-
Xfer PIDs are now 9356,9357
[ skipped 6674 lines ]
finish: removing in-process file 
ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
[ skipped 39 lines ]
Done: 15 files, 106665 bytes
Got fatal error during xfer (aborted by signal=ALRM)
Backup aborted by user signal
---

The file named in the error is almost always a temporary one. It is thus 
conceivable that the file was created before rsync built its file list and 
was destroyed before rsync finished syncing. But this is only my supposition; 
I don't know exactly what happens.

So, I wonder if this kind of problem (and the consequent generation of a 
partial backup, thus the absence of a legitimate backup from the set) is 
avoidable.

Thanks for your attention.

-- 
Cristian Tibirna(1-418-) 656-2131 / 4340
  Laval University - Quebec, CAN ... http://www.giref.ulaval.ca/~ctibirna
  Research professional at GIREF ... [EMAIL PROTECTED]



Re: [BackupPC-users] avoidable failure?

2007-01-12 Thread Randy Barlow

Cristian Tibirna wrote:
 The file named in the error is almost always a temporary one. It is thus 
 conceivable that the file was created before rsync built its file list and 
 was destroyed before rsync finished syncing. But this is only my 
 supposition; I don't know exactly what happens.
 
 So, I wonder if this kind of problem (and the consequent generation of a 
 partial backup, thus the absence of a legitimate backup from the set) is 
 avoidable.

Usually, temporary files are created in the /tmp or /var/tmp directories;
I would recommend that you add these to your exclude list, since they can
cause problems as you have noted, and also since there is no advantage to
backing up temporary files.  Hope this helps!
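
A minimal sketch of such an exclude, assuming the hash form of the setting
(the '*' key, meaning "all shares", may not be available in very old
versions, and the paths are only matched if the share actually contains
them):

```perl
# Skip the standard temporary-file locations on every share.
$Conf{BackupFilesExclude} = {
    '*' => ['/tmp', '/var/tmp'],
};
```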

R
