Re: [BackupPC-users] avoidable failure?

2007-01-14 Thread Holger Parplies
Hi,

Randy Barlow wrote on 12.01.2007 at 11:58:59 [Re: [BackupPC-users] avoidable 
failure?]:
> Cristian Tibirna wrote:
> > The file named in the error is almost always a temporary one. It is thus
> > conceivable that the file was created before rsync built its index and
> > was destroyed before rsync finished syncing. But this is only my
> > supposition; I don't know exactly what happens.
> > [...]
> 
> Usually, temporary files are created in the /tmp or /var/tmp directories
> - I would recommend that you add these to your exclude list, since they
> can cause problems as you have noted, and also since there is no
> advantage to backing up temporary files.  Hope this helps!

while I perfectly agree that backing up temporary files has no advantage,
the quoted case was

Cristian Tibirna wrote on 12.01.2007 at 10:13:39 [[BackupPC-users] avoidable 
failure?]:
> [...]
> So, once in a while, I get errors like this:
> 
> -
> Xfer PIDs are now 9356,9357
> [ skipped 6674 lines ]
> finish: removing in-process file 
> ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
> [ skipped 39 lines ]
> Done: 15 files, 106665 bytes
> Got fatal error during xfer (aborted by signal=ALRM)
> Backup aborted by user signal
> ---

which is an example of a temporary file that is obviously *not* located in
/tmp or /var/tmp and cannot be relocated there.

You can still exclude such files, as I understand it, by appending
'--exclude' options to the end of $Conf{RsyncArgs}. Something like

'--exclude=.makedep',
'--exclude=*.o',

should do the trick (modify to suit your needs). Note that adding '.makedep'
et al. to $Conf{BackupFilesExclude} will *not* work (at least not in version
2.1.1), as the code anchors relative paths to the root of the 'share'.
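
For illustration, a complete per-host setting might look like the sketch
below. The base option list is only an assumption (copy the real one from
your config.pl); the point is just that the excludes go at the end:

    # Sketch only: the leading options should be copied from your own
    # $Conf{RsyncArgs}; they are assumed here, not prescribed.
    $Conf{RsyncArgs} = [
        '--numeric-ids', '--perms', '--owner', '--group', '-D',
        '--links', '--hard-links', '--times', '--block-size=2048',
        '--recursive',
        # rsync matches bare patterns anywhere in the tree, so these
        # catch .makedep and *.o files at any depth:
        '--exclude=.makedep',
        '--exclude=*.o',
    ];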

rsync also has a '--cvs-exclude' option, but that would probably exclude more
from your backups than you would want it to (e.g. the CVS directories).


That said, 'aborted by signal=ALRM' does *not* sound like a temporary file
problem to me (though I don't know what a temporary file problem *would*
sound like). Might you simply need to increase your $Conf{ClientTimeout}? It
would make sense that your backups take longer with busy client machines
than with idle ones, after all.
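
A minimal sketch, assuming the timeout is indeed what fires the ALRM (the
value is an arbitrary example; size it to your longest expected backup):

    # In config.pl or a per-host config: seconds a backup may run
    # before BackupPC's alarm fires and the transfer is aborted.
    $Conf{ClientTimeout} = 28800;   # e.g. 8 hours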

Regards,
Holger



[BackupPC-users] Proper way to schedule Archive jobs

2007-01-14 Thread Timothy J. Massey
Hello!

I have set up a cron job that automatically archives my servers every 
night.  This works fine, in that I get an archive created every day. 
However, there are a couple of annoyances with it that I was hoping I 
could address.

The first is that it overwrites the previous archive daily.  Is it 
possible to get ArchiveHost/TarCreate to use the backup number in the 
file name even when you use "-1" as the backup job number?
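
The best workaround I can imagine is a small script the cron job runs after
each archive, moving the fixed-name tar to a dated name; sketched below.
Every path and the '-1' filename pattern are just placeholder guesses for
what the archive run writes, but I'd rather BackupPC handled this itself:

    #!/usr/bin/perl
    # Hypothetical post-archive step: rename the fixed-name tar so the
    # next night's run does not overwrite it. Paths are assumptions.
    use strict;
    use warnings;
    use POSIX qw(strftime);
    use File::Copy qw(move);

    my $dir  = '/var/lib/backuppc/archive';   # assumed archive destination
    my $host = shift or die "usage: $0 host\n";
    my $src  = "$dir/$host.-1.tar.gz";         # assumed fixed output name
    my $dst  = "$dir/$host." . strftime('%Y-%m-%d', localtime) . '.tar.gz';
    move($src, $dst) or die "move $src -> $dst failed: $!\n";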

The second issue, though, is more troublesome.  Because the archive runs 
from cron, there is no record of it within BackupPC.  I'd rather have the 
process managed from within BackupPC than generate a stream of cron 
e-mails that will quickly be ignored by the end user.

Is there a way to launch an archive on a regular basis where the jobs 
are recorded and managed within BackupPC?

Thank you very much for any help you might be able to give me.  I 
greatly appreciate it.

Tim Massey



[BackupPC-users] blackout fulls & incrs differently?

2007-01-14 Thread brien dieterle
I'd like to run "full" backups at night (say, 10pm-2am), but run 
incrementals every 2 hours from 6am-6pm.  There doesn't seem to be any 
way to do this.  Unless maybe I could use a pre-dump script to test the 
time and $type and abort fulls that try to run during the day?  It would 
be annoying to see a lot of bogus "errors", though.  Any ideas?
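
Something like this sketch is what I have in mind, assuming a non-zero exit
from $Conf{DumpPreUserCmd} really does abort the dump (I'd have to verify
that on my version; the script name and time window are made up):

    # assumed wiring in the host's config; BackupPC substitutes $type
    $Conf{DumpPreUserCmd} = '/usr/local/bin/full_window_check $type';

    #!/usr/bin/perl
    # /usr/local/bin/full_window_check (hypothetical): exit non-zero
    # when a full backup starts outside the 22:00-02:00 window.
    my $type = shift || '';        # 'full' or 'incr', passed by BackupPC
    my $hour = (localtime)[2];     # current hour, 0..23
    exit 0 if $type ne 'full';             # never block incrementals
    exit 0 if $hour >= 22 || $hour < 2;    # fulls may run at night
    exit 1;                                # block daytime fulls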

Thanks!
brien





Re: [BackupPC-users] avoidable failure?

2007-01-14 Thread Cristian Tibirna
On 14 January 2007 00:34, Holger Parplies wrote:
> while I perfectly agree that backing up temporary files has no advantage,
> the quoted case was

Yes, indeed, I don't back up the temporary dirs, and I exclude lots of 
temporary (and generated) files.

> Cristian Tibirna wrote on 12.01.2007 at 10:13:39 [[BackupPC-users] avoidable 
> failure?]:
> > [...]
> > So, once in a while, I get errors like this:
> >
> > -
> > Xfer PIDs are now 9356,9357
> > [ skipped 6674 lines ]
> > finish: removing in-process file
> > ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
> > [ skipped 39 lines ]
> > Done: 15 files, 106665 bytes
> > Got fatal error during xfer (aborted by signal=ALRM)
> > Backup aborted by user signal
> > ---
>
> which is an example of a temporary file that is obviously *not* located in
> /tmp or /var/tmp and can't be made to be.

And that is only an example. More drastic cases are files that are 
indistinguishable from important ones but are still created temporarily 
(e.g. .c or .cpp files generated by test scripts and such) and can appear 
and disappear while the backup is running. Those I can't exclude. This is 
why I was asking whether there is some way (which I overlooked, or which 
isn't yet programmed into BackupPC) to ignore such errors and finish 
backups gracefully even if files are removed between list generation and 
the actual copying.

> You can still exclude such files, as I understand it, by adding them to
> $Conf{RsyncArgs} (at the end of the list). Something like
>
>   '--exclude=.makedep',
>   '--exclude=*.o',
>
> should do the trick (modify to suit your needs). Note that adding
> '.makedep' et al. to $Conf{BackupFilesExclude} will *not* work (at least
> not in version 2.1.1), as the code anchors relative paths to the root of
> the 'share'.

It works with $Conf{BackupFilesExclude} too. We use it extensively, mostly 
to ignore mp3, avi and .o files ;-)
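
For reference, the style of list we use looks roughly like this (the
patterns are examples only; see Holger's caveat above about how relative
patterns are anchored in 2.1.1):

    # global or per-host config; illustrative patterns only
    $Conf{BackupFilesExclude} = ['*.mp3', '*.avi', '*.o'];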

> That said, 'aborted by signal=ALRM' does *not* sound like a temporary file
> problem to me (though I don't know what a temporary file problem *would*
> sound like). Might you simply need to increase your $Conf{ClientTimeout}?
> It would make sense that your backups take longer with busy client machines
> than with idle ones, after all.

Interesting suggestion. I will try to investigate more in this direction. I 
don't know exactly what should be done as a test, though, since the errors 
aren't reproducible, as I mentioned in the beginning.

Thanks a lot for the suggestions.

-- 
Cristian Tibirna ... (418) 656-2131 / 4340
  Laval University - Québec, CAN ... http://www.giref.ulaval.ca/~ctibirna
  Research professional - GIREF ... [EMAIL PROTECTED]



Re: [BackupPC-users] Problem with File::RsyncP

2007-01-14 Thread Craig Barratt
Mailinglist writes:

> Ok, I understand. But the documentation says that at least perl 5.6.x is 
> required.  I will upgrade to perl 5.8.

I will fix that.

Craig



Re: [BackupPC-users] Stuck getting localhost to work

2007-01-14 Thread Craig Barratt
James writes:

> First, the obligatory and much deserved thank you for all the work  
> put into backuppc.

Thanks.

> $Conf{BackupFilesExclude} = ['/usr/local/var/backups'];
> $Conf{RsyncClientCmd} = 'sudo $rsyncPath $argList+';
> $Conf{RsyncClientRestoreCmd} = 'sudo $rsyncPath $argList+';

You should use a full path for sudo.
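
For example (the path is an assumption; check it with 'which sudo'):

    $Conf{RsyncClientCmd}        = '/usr/bin/sudo $rsyncPath $argList+';
    $Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList+';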

Craig
