Re: [BackupPC-users] Email reports

2007-01-15 Thread Jean-Michel Beuken
Hi,

You can set, in the per-PC config.pl, the parameter

$Conf{DumpPostUserCmd} = '/usr/local/BackupPC/bin/BackupNotify $user $xferOK $host $type $client $hostIP';

The BackupNotify script looks something like this:


#!/usr/bin/perl
#
# BackupNotify - mail a per-backup status report.
# Arguments: $user $xferOK $host $type $client $hostIP
#
use strict;
use warnings;

my $mailprog = '/usr/lib/sendmail';

my ($user, $xferOK, $host, $type, $client, $hostIP) = @ARGV;

my $recipient = $user;
my $subject;

my $msg = "BackupPC backup report for PC \"$client ($host)\":\n\n";
if ( $xferOK ) {
    $msg .= "The ($type) backup completed successfully\n";
    $subject = "Backup of $client: OK!";
} else {
    $msg .= "The ($type) backup ran into a problem!\n";
    $subject = "Backup problem on $client...";
}
# Send the report in both cases (the original only mailed on success).
sendmail($msg);

sub sendmail {
    my ($msg) = @_;
    open(MAIL, "|$mailprog -t") or return;
    print MAIL "To: $recipient\n";
    print MAIL "From: [EMAIL PROTECTED]\n";
    print MAIL "Subject: $subject\n\n";
    print MAIL "$msg\n";
    print MAIL "\nContact support: mailto:[EMAIL PROTECTED]";
    close(MAIL);
}


Francisco Daniel Igual Peña wrote:
> 
> Hi, 
> 
> Is it possible that BackupPC sends weekly (or daily, doesn't matter) reports to a
> specific email address even when there are no errors with the backups? I want
> something like that, but I don't know how to do it.
> 
> Thanks very much.
> 
> 
> -
> Take Surveys. Earn Cash. Influence the Future of IT
> Join SourceForge.net's Techsay panel and you'll get the chance to share your
> opinions on IT & business topics through brief surveys - and earn cash
> http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV
> ___
> BackupPC-users mailing list
> BackupPC-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/backuppc-users
> http://backuppc.sourceforge.net/
> 
> 

-- 

Jean-Michel Beuken
Departmental IT specialist (MAPR/FSA/SISE)

Universite catholique de Louvain-La-Neuve
Lab. PCPM/FSA, Bat. BOLTZMANN
1,Place Croix du Sud
1348 Louvain-La-Neuve, BELGIUM

Tel : (3210) 473570  Fax : (3210) 473452
HTTP://www.mapr.ucl.ac.be/~beuken




Re: [BackupPC-users] Unknown host error but host exists?

2007-01-15 Thread Craig Barratt
Jesse writes:

> 2007-01-12 15:42:12 User bbBackup requested backup of unknown host  
> colo.vipmn.com
> 2007-01-12 15:42:13 Unknown host colo.vipmn.com for status request
> 
> But I can SSH to the machine from the backup server using that  
> address just fine.

"unknown host colo.vipmn.com" means that host isn't in BackupPC's
hosts file, or you haven't reloaded the hosts file since adding it.
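For the record, the usual fix is roughly the following (paths vary by install; the file name below is a stand-in, and real installs keep the hosts file under BackupPC's conf directory, e.g. /etc/backuppc/hosts on Debian):

```shell
# Sketch only: operate on a scratch copy so nothing real is touched.
HOSTS_FILE=hosts.example                      # normally BackupPC's conf/hosts
printf '%s 0 backuppc\n' colo.vipmn.com >> "$HOSTS_FILE"
cat "$HOSTS_FILE"
# Then tell the running server to re-read its config and hosts file, e.g.:
#   BackupPC_serverMesg server reload
```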

Craig



Re: [BackupPC-users] avoidable failure?

2007-01-15 Thread Holger Parplies
Hi,

Cristian Tibirna wrote on 14.01.2007 at 13:16:29 [Re: [BackupPC-users] 
avoidable failure?]:
> On 14 January 2007 00:34, Holger Parplies wrote:
> [...]
> > Might you simply need to increase your $Conf{ClientTimeout}?
> > It would make sense that your backups take longer with busy client machines
> > than with idle ones, after all.
> 
> Interesting suggestion. I will try to investigate more in this direction. I 
> don't know exactly what should be done as a test though, as the 
> errors aren't reproducible, as I mentioned in the beginning.

yes, there are always some things you can't really test :-(.
You could, however, look at the logs of your failing backups and check whether
their run time corresponds with your current (or former) setting of
$Conf{ClientTimeout}. If that is set to 7200 s (2 h) and you have failed
backups running 30 min, 43 min and 22 min and good ones running 15 min,
35 min and 65 min, then that's obviously the wrong track. If your good
backups are comparatively short, though, and the others fail after roughly 2
hours, I'd simply try doubling the value and seeing whether the failures go
away. As I understand it, $Conf{ClientTimeout} is not a value that needs to
be fine-tuned to be only slightly larger than your longest backup, but
rather a measure to eventually detect and kill hung backups. The only really
problematic 'resource' a hung backup consumes is that it counts towards
$Conf{MaxBackups}, thus preventing or postponing other backups (well, yes,
a hung rsync might also consume a considerable amount of (swappable)
memory). I've read of people using timeout values of 72000 (20 h) or more.
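In config.pl terms, that suggestion looks like this (the value is hedged; pick whatever comfortably exceeds your longest backup):

```perl
# Watchdog for hung backups, in seconds; generous values are safe because
# this only reaps jobs that have stopped making progress.
$Conf{ClientTimeout} = 72000;   # 20 hours
```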

Regards,
Holger



Re: [BackupPC-users] BackupPC_dump Segmentation Fault

2007-01-15 Thread Craig Barratt
Art writes:

> I have been running BackupPC for over two years on a Mepis Linux box. 
> All was well until early in the month when one of the WinXP boxes 
> mysteriously stopped backing up. I have upgraded BackupPC to 2.1.2pl1 
> (using the Mepis Kpackage utility). I have used CPAN to re-install 
> File::RsyncP. I have replaced the rsyncd on the PC with 
> cygwin-rsyncd-2.6.8_0.zip from SourceForge.
> 
> I ran BackupPC_dump manually (with -v) and all went well until...
>   same 644   400/401 5304636 old 
> pictures/2006/2006-07-08/2006-07-08 15-29-04.JPG
>   same 644   400/401 5124612 old 
> pictures/2006/2006-07-08/2006-07-08 15-29-05.JPG
>   same 644   400/401 5098832 old 
> pictures/2006/2006-07-08/2006-07-08 15-30-07.JPG
>   same 644   400/401 5422218 old 
> pictures/2006/2006-07-08/2006-07-08 15-30-09.JPG
> Segmentation fault
> 
> The "old pictures" directory has 35,377 files encompassing 93.6 GB. Have 
> I hit a wall or is there a workaround???

Have you installed the latest File::RsyncP (0.68)?  Sounds like you
have, but I wanted to confirm.  What are your perl and Compress::Zlib
versions?

Any chance you are out of memory?  Also, the next step is to get a
stack trace out of perl.  It's important to know which process it
is and where it is failing.

Craig



Re: [BackupPC-users] reinstalling backuppc

2007-01-15 Thread Craig Barratt
David writes:

> I would like to reinstall Linux on new hard disks and copy the existing
> backuppc archives from the old disk to the new. What are the potential
> pitfalls? Can I just copy the whole directory of backuppc recursively?
> How can I copy and preserve hard links accross hard disks?

There is a lot of discussion of these issues on the list.
It is quite time consuming to copy the pool because of all
the hardlinks.

The easiest approach is to start fresh with a new BackupPC
install and keep the old disks around for a while in case
you need one of the old backups.
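If you do decide to copy anyway, the commonly suggested options look something like this (paths and devices below are assumptions, and both approaches are slow on big pools):

```shell
# File-level copy, preserving hardlinks (-H); can need a lot of RAM and time:
#   rsync -aH /old-disk/backuppc/ /new-disk/backuppc/
# Block-level copy of the whole filesystem (destination must be >= source):
#   dd if=/dev/sdb1 of=/dev/sdc1 bs=1M
```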

Craig



Re: [BackupPC-users] avoidable failure?

2007-01-15 Thread Craig Barratt
Cristian writes:

> First, I'd like to thank Craig and his collaborators, who gave us this great 
> tool that simplifies our lives greatly. I use BackupPC for many years 
> already, in many settings, and I couldn't think of a better way of dealing 
> with this thorny requirement.

Thanks.

> So, once in a while, I get errors like this:
> 
> -
> Xfer PIDs are now 9356,9357
> [ skipped 6674 lines ]
> finish: removing in-process file 
> ctibirna-work/MEF/CVS-HEAD/GIREF/src/commun/Adaptation/.makedep
> [ skipped 39 lines ]
> Done: 15 files, 106665 bytes
> Got fatal error during xfer (aborted by signal=ALRM)
> Backup aborted by user signal
> ---

It is failing because an ALRM (alarm) signal got delivered to
the process.  You should try increasing $Conf{ClientTimeout}
significantly (eg: 10x).

Craig



[BackupPC-users] New version of BackupPC 3?

2007-01-15 Thread Timothy J. Massey
Hello!

I was wondering if there will be a new release of the BackupPC 3 code? 
I know that there was at least one change (the permissions on web-edited 
files), and at least one bug (the GUI brings up the oldest log file when 
you click on the LOG link) that I am specifically interested in.

Are we near another beta release, or will there be an RC release (or 
even 3.0.0) in the near future?

I must say that I have been *extremely* happy with BackupPC 3 so far.  I
am excited to see it in an official release version!  I do not want to 
start updating my production servers with beta code...  :)

Tim Massey



Re: [BackupPC-users] [Semi-OT] Encrypting backup partitions?

2007-01-15 Thread Carl Wilhelm Soderstrom
On 01/11 08:20 , Bradley Alexander wrote:
> I (currently) have a 200GB partition for backups, and I was considering
> using Truecrypt on-the-fly encryption. I'm still on the fence regarding
> whether to set it up as individual containers or one large one. 

I don't know anything about this encryption tool; but I would suspect that
you'll need to set it all up as one encrypted store. Otherwise the hardlinks
in the pool won't work correctly.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Email reports

2007-01-15 Thread Travis Fraser
On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:
> 
> Hi, 
> 
> Is it possible that BackupPC sends weekly (or daily, doesn't matter) reports to a
> specific email address even when there are no errors with the backups? I want
> something like that, but I don't know how to do it.
> 
I have a daily status email and RSS feed. You can edit out the RSS stuff
if you want just email. The status email address is configurable. I
wrote up a little howto for myself (I am running backuppc-2.1.1, so
adjustments may be necessary if using a newer version):

BackupPC RSS feed and email status HOWTO
---

1. I created a script [see step 5] called BackupPC_statusUpdate modeled
on BackupPC_sendEmail. The script parses the backup status of each host,
creates an RSS feed and also sends the information by email.
BackupPC_statusUpdate resides in $BinDir (/usr/lib/backuppc/bin/ in my
case) and runs once each night.


2. Added $Conf{EMailStatusUserName} to the main config
file /var/lib/backuppc/conf/config.pl for email address(es) to receive
nightly status emails:

 #
 $Conf{EMailFromUserName} = 'backuppc';

+#
+# Destination address for daily positive status email.
+#
+$Conf{EMailStatusUserName} = '[EMAIL PROTECTED]';

 #
 # Destination address to an administrative user who will receive a
 


3. Added a call to BackupPC_statusUpdate in BackupPC_nightly (note the
addition of the semicolon on the first system command below):

 if ( $opts{m} ) {
 print("log BackupPC_nightly now running BackupPC_sendEmail\n");
!system("$BinDir/BackupPC_sendEmail");

+# RSS and positive status email
+#
+print("log BackupPC_nightly now running BackupPC_statusUpdate\n");
+system("$BinDir/BackupPC_statusUpdate");
 }


4. Added header (to advertise feed to RSS readers e.g. Firefox) on my
backup server documentation webpage (this can be any spot viewable from
your intranet) at  /var/www/localhost/htdocs/index.html. This is an
optional step. The link path is the place in the webroot that the main
script writes the xml file.

 

+

 


5. BackupPC_statusUpdate

#!/usr/bin/perl
#=
-*-perl-*-
#
# BackupPC_statusUpdate
#
# DESCRIPTION
#
#   This script implements a positive status email and an RSS feed.
#
#   The script is called from BackupPC_nightly.
#
# AUTHOR
#   Travis Fraser [EMAIL PROTECTED]
#
# Credit to Rich Duzenbury for the original idea.
#
#
# Requires XML::RSS
#
# Edit the variable $serverName to suit depending on DNS status on your
# network.
# Edit the "use lib ..." in the 3rd line of code below.
# Edit the $base_url in the RSS section to reflect the correct path to
# the cgi page.
# Edit the "$rss->save ..." line near the end of the script to suit.
#
#

use strict;
no  utf8;
#
# The lib path needs to match that in the stock backuppc files.
#
use lib "/usr/lib/backuppc/lib";
use BackupPC::Lib;
use XML::RSS;

use Data::Dumper;
use Getopt::Std;
use DirHandle ();
use vars qw($Lang $TopDir $BinDir %Conf);

#
# Variables
#
my($fullTot, $fullSizeTot, $incrTot, $incrSizeTot, $str, $mesg,
   $strNone, $strGood, $hostCntGood, $hostCntNone);
$hostCntGood = $hostCntNone = 0;

my $serverName = '192.168.1.3';

#
# Initialize
#
die("BackupPC::Lib->new failed\n") if ( !(my $bpc =
BackupPC::Lib->new) );
$TopDir = $bpc->TopDir();
$BinDir = $bpc->BinDir();
%Conf   = $bpc->Conf();
$Lang   = $bpc->Lang();

$bpc->ChildInit();

my $err = $bpc->ServerConnect($Conf{ServerHost}, $Conf{ServerPort});
if ( $err ) {
print("Can't connect to server ($err)\n");
exit(1);
}
#
# Retrieve status of hosts
#
my $reply = $bpc->ServerMesg("status hosts");
$reply = $1 if ( $reply =~ /(.*)/s );
my(%Status, %Info, %Jobs, @BgQueue, @UserQueue, @CmdQueue);
eval($reply);
#
# Ignore status related to admin and trash jobs
foreach my $host ( grep(/admin/, keys(%Status)) ) {
delete($Status{$host}) if ( $bpc->isAdminJob($host) );
}
delete($Status{$bpc->trashJob});

#
# Set up RSS feed
#
my $now = $bpc->timeStamp(time);

#
# The cgi page in this case is over HTTPS
#
my $base_url = 'https://' . $serverName . '/cgi-bin/BackupPC_Admin';

my $rss = new XML::RSS (version => '2.0', encoding => 'ISO-8859-1');

$rss->channel( title => 'BackupPC Server',
   link => $base_url

Re: [BackupPC-users] Stuck getting localhost to work

2007-01-15 Thread Holger Parplies
Hi,

James Kyle wrote on 13.01.2007 at 10:56:43 [[BackupPC-users] Stuck getting 
localhost to work]:
> [...]
> I have backuppc installed and working on localhost. I've set my pc  
> specific config.pl:
> 
> $Conf{RsyncShareName} = ['/Network/Servers/mydomain/Users/','/usr/ 
> local','/opt/', '/Volumes/'];
> $Conf{BackupFilesExclude} = ['/usr/local/var/backups'];
> $Conf{RsyncClientCmd} = 'sudo $rsyncPath $argList+';
> $Conf{RsyncClientRestoreCmd} = 'sudo $rsyncPath $argList+';

unless I'm completely mistaken, it should be '$argList' without the '+' in
both RsyncClientCmd and RsyncClientRestoreCmd as sudo preserves its argument
boundaries (i.e. does not split arguments) and does not remove quoting. If
'$rsyncPath $argList+' works, that is probably not your current problem though
(but it might cause problems in the future, if at some point argList gets to
contain something that is actually quoted).
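Under that reading, the per-PC settings would become (hedged suggestion; test a restore after changing them):

```perl
$Conf{RsyncClientCmd}        = 'sudo $rsyncPath $argList';
$Conf{RsyncClientRestoreCmd} = 'sudo $rsyncPath $argList';
```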

> [...]
> Now, from the command line I can execute sudo rsync without a  
> password and the event is logged in my /var/log/system.log. However,  
> if I attempt to do a full backup with the above $Conf settings, it  
> fails with a BackupPC error log entry of:
> 
> Backup failed on localhost (fileListReceive failed)

That sounds a bit as if sudo were adding some output, though in that case I'd
expect a 'Fatal error (bad version)'. Maybe you could give some more details
from your error log, such as the exact command being run? Actually, it's
almost always a good idea to quote not only error messages but also some
context, i.e. what happens immediately preceding the error.

Hope that helps.

Regards,
Holger



Re: [BackupPC-users] schedule problem since using latest beta

2007-01-15 Thread Craig Barratt
Brad writes:

> On all of the servers we have installed the latest beta I have seen a
> problem with the blackout periods.  They appear to be ignored or
> interpreted incorrectly.
> 
> I am using the default blackout periods and schedules in all my configs
> but all the incremental backups are starting in the middle of the day,
> 11am mostly.

By default, Blackouts only apply after a host has been successfully
pinged 7 times in a row, which could be ~1 week of elapsed time.
A sequence of 3 failures (eg: host down) resets the count.

Look in the host summary page and you should see text like:

 - Pings to HOST have succeeded 77 consecutive times.
 - Because HOST has been on the network at least 7 consecutive times,
   it will not be backed up from 1:00 to 8:00 on Sun, Mon, Tue, Wed,
   Thu, Fri, Sat.

What does yours say?

Finally, please check your settings of

$Conf{BlackoutPeriods}
$Conf{BlackoutBadPingLimit}
$Conf{BlackoutGoodCnt}
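For reference, settings matching the behaviour described above would look roughly like this in config.pl (the 1:00-8:00 window mirrors the host-summary text quoted earlier; your installed defaults may differ):

```perl
$Conf{BlackoutBadPingLimit} = 3;   # 3 consecutive ping failures reset the count
$Conf{BlackoutGoodCnt}      = 7;   # blackouts apply only after 7 good pings
$Conf{BlackoutPeriods}      = [
    { hourBegin => 1.0, hourEnd => 8.0,
      weekDays  => [0, 1, 2, 3, 4, 5, 6] },   # 1:00-8:00, Sun-Sat
];
```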

Craig



Re: [BackupPC-users] Proper way to schedule Archive jobs

2007-01-15 Thread Craig Barratt
Tim writes:

> The first is that it overwrites the previous archive daily.  Is it 
> possible to get ArchiveHost/TarCreate to use the backup number in the 
> file name even when you use "-1" as the backup job number?

You could modify the code to use the date in the file name.

> Is there a way to launch an archive on a regular basis where the jobs 
> are recorded and managed within BackupPC?

The proper way to do it is to create an archive request file (see the
CGI code to see how to do that) and to use BackupPC_serverMesg to send
an archive request (again, see the CGI code to see how to do that) to
BackupPC.  BackupPC will then run the archive and keep the log information
about what it did.
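A rough sketch of the BackupPC_serverMesg side, borrowing the connection boilerplate from the HOWTO earlier in this digest. The exact "archive ..." message format and the request-file layout must be taken from the CGI code; the message string and request-file name below are illustrative only, not authoritative:

```perl
use lib "/usr/lib/backuppc/lib";   # adjust to your install
use BackupPC::Lib;

my $bpc = BackupPC::Lib->new or die "BackupPC::Lib->new failed\n";
my %Conf = $bpc->Conf();
my $err  = $bpc->ServerConnect($Conf{ServerHost}, $Conf{ServerPort});
die "Can't connect to server ($err)\n" if $err;

# Ask the server to run a previously written archive request file
# (hypothetical request name; see the CGI archive code for the real format).
$bpc->ServerMesg("archive backuppc archivehost archiveReq.0");
```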

Craig



Re: [BackupPC-users] Email reports

2007-01-15 Thread Les Stott

Travis Fraser wrote:

On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:
  
Hi, 


Is it possible that BackupPC sends weekly (or daily, doesn't matter) reports to a
specific email address even when there are no errors with the backups? I want
something like that, but I don't know how to do it.



I have a daily status email and RSS feed. You can edit out the RSS stuff
if you want just email. The status email address is configurable. I
wrote up a little howto for myself (I am running backuppc-2.1.1, so
adjustments may be necessary if using a newer version):
  

Travis,

that looks like some fantastic work!! Certainly something that I think 
is worthwhile. I only wish I knew Perl so I could knock out stuff like 
this ;)


I haven't tested it myself, but if you have a sample email/RSS output could 
you post it so we can see what it looks like?


Has this been added to any feature requests? Implemented in version 3? 
If not I would like to see it become a feature. A daily email to an admin 
about the status of the system and good/bad backups etc. is worthwhile. 
In my experience, the end user in charge of monitoring the state of hosts 
being backed up is sometimes a little slack.


Regards,

Les



[Travis's HOWTO and script, quoted in full, snipped]

Re: [BackupPC-users] Email reports

2007-01-15 Thread Craig Barratt
Francisco writes:

> Is it possible that BackupPC sends weekly (or daily, doesn't matter) reports to a
> specific email address even when there are no errors with the backups? I want
> something like that, but I don't know how to do it.

I really don't like programs that annoy you with emails.  That said,
I agree there hasn't been a way to be sure BackupPC is running ok.

In BackupPC 3.0.0beta there is a new option to BackupPC_sendEmail:

-c  check if BackupPC is alive and send an email if not

That allows you to run BackupPC_sendEmail -c from cron and
it will send an email to $Conf{EMailAdminUserName} if it
can't contact the BackupPC server.
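A cron entry for the backuppc user might look like this (the binary path is an assumption and varies by distribution; hourly shown):

```
# m h dom mon dow  command
0 * * * *  /usr/share/backuppc/bin/BackupPC_sendEmail -c
```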

The existing email options should handle the cases where BackupPC
is running but backups are failing.

Craig



Re: [BackupPC-users] blackout fulls & incrs differently?

2007-01-15 Thread Craig Barratt
brien writes:

> I'd like to run "full" backups at night (say, 10pm-2am), but run 
> incrementals every 2 hours from 6am-6pm.  There doesn't seem to be any 
> way to do this.  Unless, maybe I can use a predump script to test the 
> time and $type and abort fulls that try to run during the day?  It would 
> be annoying to see a lot of bogus "errors", though.  Any ideas? 

You could disable automatic backups and schedule everything from cron
using BackupPC_serverMesg.

Or you could

  - set $Conf{FullPeriod} to, say, 0.9,
  - set $Conf{IncrPeriod} to 2/24 = 0.08,
  - use blackouts on the midnight - 6am and 6pm-10pm windows
  - kick things off by starting the first full backup at 10pm.

The drawback with this approach is that there's no guarantee the
full backup won't shift to during the day (eg: if a server is
down).
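Spelled out as config.pl settings, one possible rendering of the recipe above (all values hedged):

```perl
$Conf{FullPeriod} = 0.9;      # a full roughly once a day
$Conf{IncrPeriod} = 2/24;     # incrementals every ~2 hours (0.083 days)
$Conf{BlackoutPeriods} = [
    { hourBegin =>  0.0, hourEnd =>  6.0, weekDays => [0..6] },  # midnight-6am
    { hourBegin => 18.0, hourEnd => 22.0, weekDays => [0..6] },  # 6pm-10pm
];
```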

Craig



[BackupPC-users] searching backups

2007-01-15 Thread Krsnendu dasa
I am looking for a file that I think is in the backups somewhere. Is
there a way to search for files saved in backups?
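There is no built-in search, but because BackupPC stores each backed-up file under $TopDir/pc/<host>/<backupNo>/ with an "f" prefix ("mangled" names, with special characters %-encoded), a plain find can locate a file by name. A self-contained demonstration on a scratch tree (a real install would point TOPDIR at e.g. /var/lib/backuppc; that path is an assumption):

```shell
# Build a miniature pc/ tree the way BackupPC mangles names, then search it.
TOPDIR=$(mktemp -d)
mkdir -p "$TOPDIR/pc/pc1/0/f%2fhome"          # share "/home" -> f%2fhome
touch "$TOPDIR/pc/pc1/0/f%2fhome/freport.doc" # file "report.doc" -> freport.doc
find "$TOPDIR/pc" -name 'freport*'            # prints the mangled path
```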



Re: [BackupPC-users] Email reports

2007-01-15 Thread Les Stott

Craig Barratt wrote:

Francisco writes:

  

Is it possible that BackupPC sends weekly (or daily, doesn't matter) reports to a
specific email address even when there are no errors with the backups? I want
something like that, but I don't know how to do it.



I really don't like programs that annoy you with emails.  That said,
I agree there hasn't been a way to be sure BackupPC is running ok.

In BackupPC 3.0.0beta there is a new option to BackupPC_sendEmail:

-c  check if BackupPC is alive and send an email if not

That allows you to run BackupPC_sendEmail -c from cron and
it will send an email to $Conf{EMailAdminUserName} if it
can't contact the BackupPC server.

The existing email options should handle the cases where BackupPC
is running but backups are failing.

  

True.

Is there a way to generate an email summary of hosts and backup sizes? 
Say, for instance, send an email with summary results of good backups from 
hosts in the last x days, plus server status output (disk space etc.)?


Regards,

Les


[BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-15 Thread Timothy J. Massey
Hello!

Here are the scripts I use to manage the removable hard drive media used 
to store daily archives of my backup servers via the GUI, instead of 
from the command line.  Briefly, the system is set up like this:

BackupPC's pool is stored on a large internal hard drive.  Every day at 
a little after 7:00 A.M., the backup server starts an archive of each 
host, which is stored on a second hard drive that is mounted in a 
removable tray.  Once this is complete, the user can shut down the 
server, remove the hard drive, replace it with a different one, and turn 
the server back on.  Once the new drive is in place, it is 
repartitioned, reformatted and remounted in place, ready for the next 
archive.

There are two scripts that make this happen.  The first one simply shuts 
the server down.  The second one handles the repartitioning, 
reformatting and remounting.

There is absolutely no reason why this couldn't be handled by simply 
ssh'ing into the server.  Except that these servers are destined for 
network administrators for whom the command line is a tremendously evil 
thing, and if you try to sell them a solution that contains instructions 
like "Use putty, log into the server and type this command", they will 
say no.  Hence, the CGI scripts...

Because the scripts will be run by the webserver, they will be run with 
its permissions, which likely do not include the ability to shut down 
the server, or other such commands.  The way I have done this is to use 
sudo, with the proper lines in sudoers.  I've tried to make the commands 
as specific as possible, to avoid possible security issues.
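For reference, the sudoers entries for this setup might look like the lines
below. The "apache" user name and the exact binary paths are assumptions
(they vary by distribution), so treat this as a sketch and edit it with
visudo:

```
# /etc/sudoers fragment (illustrative; user may be www-data, paths may differ)
apache ALL=(root) NOPASSWD: /sbin/shutdown -h now
apache ALL=(root) NOPASSWD: /bin/umount /var/lib/BackupPC/removable
apache ALL=(root) NOPASSWD: /bin/mount /var/lib/BackupPC/removable
apache ALL=(root) NOPASSWD: /bin/df /var/lib/BackupPC/removable
apache ALL=(root) NOPASSWD: /sbin/fdisk /dev/hdc
apache ALL=(root) NOPASSWD: /sbin/mke2fs -j -m 1 -LRemovableData /dev/hdc1
```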

To partition the drive, I am echoing responses to the fdisk command.  I 
looked into parted, but I could not find a clean way of getting it to 
create a single large partition without knowing how big the partition 
was.  Seeing as this will be used with drives of different sizes, I 
decided to stick with fdisk.
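As an aside, if a non-interactive tool is ever preferred, sfdisk (and parted's
percentage syntax) can create a single full-disk partition without knowing the
drive's size. A sketch, with the device name purely illustrative and the demo
run against a scratch image file instead of a real disk:

```shell
# Create one full-size Linux partition non-interactively.
# /dev/hdc is illustrative; demonstrated here on a scratch image file.
img=$(mktemp)
truncate -s 16M "$img"                   # stand-in for the removable drive
echo ',,83' | sfdisk "$img" >/dev/null 2>&1  # one type-83 partition, whole disk
sfdisk -l "$img"
rm -f "$img"
# parted equivalent (also size-agnostic):
#   parted -s /dev/hdc mklabel msdos mkpart primary ext3 0% 100%
```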

Also, the HTML files that are cat to the user are created simply by 
saving any old BackupPC HTML page to a file, and chopping the part 
before the main body DIV into the top file, and the part after the main 
body DIV into the bottom file.

If you have any suggestions as to how to make these scripts better, I 
would be happy to hear them.  Otherwise, I hope they are useful to 
someone else.

Tim Massey




#!/bin/sh
# shutdown.cgi - Shut down server
echo "Content-type: text/html"
echo ""
cat bpc_top.html
# (heading/paragraph markup below is illustrative; the list archive
#  stripped the original HTML tags)
echo "
<h2>Shut Down Server</h2>
<p>The system is being shut down!
This will take approximately 60 seconds. Do not remove the drive before
the system has powered itself off.</p>"
cat bpc_bottom.html
sudo /sbin/shutdown -h now >/dev/null 2>&1
exit



#!/bin/sh
# instmedia.cgi - Install new media for BackupPC Archive
echo "Content-type: text/html"
echo ""
cat bpc_top.html
# (heading/paragraph markup below is illustrative; the list archive
#  stripped the original HTML tags)
echo "
<h2>Initialize Removable Media</h2>
<p><b>Initializing Removable Media</b><br>
This will take approximately 10 minutes to complete, depending upon the
size of the removable drive.  Do not navigate away from this page.</p>"

echo "Unmounting removable drive."
sudo /bin/umount /var/lib/BackupPC/removable 2>&1
if [ `sudo /bin/df /var/lib/BackupPC/removable | grep "/var/lib/BackupPC/removable" | wc -l` = "1" ]; then
   echo "Error:  drive did not unmount."
   cat bpc_bottom.html
   exit
fi

echo "Creating proper partition on drive.
This will take approximately 45 seconds.  Please wait."
# Pipe responses for the fdisk command via echo.
# This does the following:
#  The first series of lines will delete all partitions on a drive with
#   up to 9 partitions.  It does this by having pairs of delete commands:
#   d9, d8, etc. until it gets to the end.  When there's just one
#   partition, fdisk doesn't ask for a number, so the last one is just a d.
#   This will actually generate lots of errors in practice:  when there
#   are no more partitions left, the d's will generate an error saying
#   that there are no partitions to delete, and the numbers are
#   interpreted as nonsense commands.  However, this is harmless.
#  It then goes through the sequence to create a new partition:
#   n (New partition)
#   p (Primary partition)
#   1 (First partition)
#     (Default starting cylinder is the first one)
#     (Default ending cylinder is the last one)
#   w (Write the changes to disk and exit)
#  Several newlines are added at the end in case something goes wrong:
#  three newlines in a row cause fdisk to exit immediately.
echo "

d
9
d
8
d
7
d
6
d
5
d
4
d
3
d
2
d
n
p
1


w




" | sudo /sbin/fdisk /dev/hdc >/dev/null

echo "Formatting partition for use.
This can take up to 10 minutes.  Please wait."
sudo /sbin/mke2fs -j -m 1 -LRemovableData /dev/hdc1 >/dev/null

echo "Mounting drive."
sudo /bin/mount /var/lib/BackupPC/removable 2>&1
if [ `sudo /bin/df /var/lib/BackupPC/removable | grep "/var/lib/BackupPC/removable" | wc -l` = "0" ]; then
   echo "Error:  Drive did not mount."
   cat bpc_bottom.html
   exit
fi

echo "Setting permissions."
sudo /bin/chmod -

Re: [BackupPC-users] searching backups

2007-01-15 Thread Carl Wilhelm Soderstrom
On 01/15 02:09 , Krsnendu dasa wrote:
> I am looking for a file that I think is in the backups somewhere. Is
> there a way to search for files saved in backups?

find | grep 
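(Worth noting when grepping the pool: BackupPC stores each backed-up file
under $TopDir/pc/<host>/<n>/ with an "f" prepended to every path component,
so search for the mangled name. A sketch -- the TopDir path is illustrative,
and the demo below just fakes the directory layout:)

```shell
# BackupPC mangles stored names, e.g. /home/me/report.doc becomes
# .../fhome/fme/freport.doc, so search for the "f"-prefixed form.
# Real search (path illustrative):
#   find /var/lib/backuppc/pc/myhost -name 'freport.doc'
# Demo of the idea on a throwaway tree:
top=$(mktemp -d)
mkdir -p "$top/pc/myhost/0/fhome/fme"
touch "$top/pc/myhost/0/fhome/fme/freport.doc"
find "$top" -name 'freport.doc'
rm -rf "$top"
```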

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



[BackupPC-users] What happens, if client rsync process crashes?

2007-01-15 Thread Clemens von Musil
Hi all,

I am very new to BackupPC - and I like it more and more every day. ;-)

I made a directory with four 1MB files and some smaller text files for
test purposes.
Today, I tried some error cases and killed the rsync process on my
client machine while syncing with xfermethod=rsync. After killing the
process, the CGI interface showed this backup run as OK, but most of my
test files were not backed up. The files transferred before my interrupt
seem to be in the archive; the other files are missing.

Is this the expected behaviour?

If yes: how can I distinguish between a correctly backed-up archive and
an archive that is only half backed up due to a process crash on the
client machine?

Thanks a lot,
Clemente

Btw: I use backuppc as packaged by debian maintainers (sarge). It is v2.1.1
-- 


Clemens von Musil
[EMAIL PROTECTED]

The less people know about
how sausages and laws are made,
the better they sleep.
(Bismarck)




Re: [BackupPC-users] Email reports

2007-01-15 Thread Travis Fraser
On Tue, 2007-01-16 at 08:24 +1100, Les Stott wrote:
> Travis Fraser wrote: 
> > On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:
> >   
> > > Hi, 
> > > 
> > > Is it possible that Backuppc sends weekly (or daily, dont mind) reports 
> > > to a
> > > specific email even though there are no errors with the backups?. I want
> > > something like that, but I dont know how to do it.
> > > 
> > > 
> > I have a daily status email and RSS feed. You can edit out the RSS stuff
> > if you want just email. The status email address is configurable. I
> > wrote up a little howto for myself (I am running backuppc-2.1.1, so
> > adjustments may be necessary if using a newer version):
> >   
> Travis, 
> 
> that looks like some fantastic work!! Certainly something that i think
> is worthwhile. I only wish i knew perl so i could knock out stuff like
> this ;)
> 
> I haven't tested myself, but if you have a sample email/rss output
> could you post so we can see what it looks like?
A typical email looks like this (for an RSS screenshot, I can email that
later):

To: [EMAIL PROTECTED]
Subject: BackupPC status: 4 hosts with good backups
Date: Mon, 15 Jan 2007 01:01:01 -0500

Host: crescent
Full Count: 2             Full age/days: 435.6
Full Size/GB: 0.15        Speed MB/sec: 3.03
Incremental Count: 0      Incremental Age/Days:
State: idle               Last Attempt: nothing to do

Host: marmolata
Full Count: 9             Full age/days: 3.2
Full Size/GB: 0.59        Speed MB/sec: 3.66
Incremental Count: 6      Incremental Age/Days: 0.2
State: idle               Last Attempt: nothing to do

Host: pigwin
Full Count: 5             Full age/days: 53.4
Full Size/GB: 1.28        Speed MB/sec: 6.30
Incremental Count: 1      Incremental Age/Days: 212.3
State: backup starting    Last Attempt: no ping (host not found)

Host: sweetpea
Full Count: 9             Full age/days: 3.2
Full Size/GB: 0.94        Speed MB/sec: 1.91
Incremental Count: 6      Incremental Age/Days: 0.2
State: backup starting    Last Attempt: nothing to do
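The body is just fixed-width text, so the formatting step is easy to
reproduce. A minimal sketch below uses hard-coded sample fields; the real
script pulls these values from the BackupPC server (e.g. via
BackupPC_serverMesg), which this sketch does not attempt:

```shell
# Format per-host backup stats into the two-column status-mail layout.
# All field values here are hard-coded sample data.
format_host() {
    # $1=host $2=full-count $3=full-age $4=state $5=last-attempt
    printf 'Host: %s\n' "$1"
    printf 'Full Count: %-13s Full age/days: %s\n' "$2" "$3"
    printf 'State: %-18s Last Attempt: %s\n\n' "$4" "$5"
}
format_host crescent 2 435.6 idle "nothing to do"
format_host marmolata 9 3.2 idle "nothing to do"
```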

-- 
Travis Fraser <[EMAIL PROTECTED]>




Re: [BackupPC-users] Email reports

2007-01-15 Thread Michael Mansour
Hi,

> Travis Fraser wrote:
> > On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:
> >   
> >> Hi, 
> >>
> >> Is it possible that Backuppc sends weekly (or daily, dont mind) reports to 
> >> a
> >> specific email even though there are no errors with the backups?. I want
> >> something like that, but I dont know how to do it.
> >>
> >> 
> > I have a daily status email and RSS feed. You can edit out the RSS stuff
> > if you want just email. The status email address is configurable. I
> > wrote up a little howto for myself (I am running backuppc-2.1.1, so
> > adjustments may be necessary if using a newer version):
> >   
> Travis,
> 
> that looks like some fantastic work!! Certainly something that i 
> think is worthwhile. I only wish i knew perl so i could knock out 
> stuff like this ;)
> 
> I haven't tested myself, but if you have a sample email/rss output 
> could you post so we can see what it looks like?
> 
> Has this been added to any feature requests? implemented in version 
> 3? If not i would like to see it become a feature. A daily email to 
> send to an admin about the status of the system and good/ bad 
> backups etc is worthwhile. In my experience sometimes the end user 
> in charge of monitoring the state of hosts being backed up is a 
> little slack.

Maybe a project like this can be integrated into BackupPC:

http://sourceforge.net/projects/backupmon

?

Michael.



[BackupPC-users] Switching cipher

2007-01-15 Thread Philip Gleghorn
Hi,
Just wanted to share this interesting test result I found about 
switching the ssh cipher. I am running the 3.0.0 beta3 in production on 
a fairly small CentOS 4 server (basic spec, but with a 1.5TB RAID5 setup), 
backing up a variety of Windows, Linux and Solaris machines across a 
100Mbit LAN.

I wanted to see whether I could speed up the data transfer without 
resorting to rsh or applying the no-encryption patch for ssh, so I did 
some tests and found that arcfour was the fastest cipher for me (seems 
to be a common conclusion), by a factor of about 2.
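For anyone wanting to try the same switch: the cipher can be forced either in
the backuppc user's ssh client config or on the ssh command line inside the
transfer command. Both snippets below are sketches -- the
$Conf{RsyncClientCmd} value is the stock default with only "-c arcfour"
added, but check it against your own config.pl before using it:

```
# ~backuppc/.ssh/config
Host *
    Ciphers arcfour

# or in config.pl / the per-pc config:
$Conf{RsyncClientCmd} = '$sshPath -c arcfour -q -x -l root $host $rsyncPath $argList+';
```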

Interestingly, when I started using this in the BackupPC ssh config it 
improved the backup completion time by a factor of up to 10. Here are 
stats showing the difference; the cipher was switched after backup 21:

Backup#  Type  Filled  Level  Start Date   Duration/mins  Age/days  Server Backup Path
15       full  yes     0      12/28 20:00  811.7          18.5      /home/backuppc/data/pc/oddball/15
18       incr  no      1      1/1 01:00    124.9          15.3      /home/backuppc/data/pc/oddball/18
19       incr  no      1      1/2 01:00    137.2          14.3      /home/backuppc/data/pc/oddball/19
20       incr  no      1      1/3 01:00    124.2          13.3      /home/backuppc/data/pc/oddball/20
21       incr  no      1      1/4 01:00    145.2          12.3      /home/backuppc/data/pc/oddball/21
22       full  yes     0      1/13 12:43   212.4          2.8       /home/backuppc/data/pc/oddball/22
23       incr  no      1      1/14 12:14   7.7            1.8       /home/backuppc/data/pc/oddball/23
24       incr  no      1      1/15 12:14   7.5            0.8       /home/backuppc/data/pc/oddball/24

               Totals                   Existing Files    New Files
Backup#  Type  #Files  Size/MB  MB/sec  #Files  Size/MB   #Files  Size/MB
15       full  269365  5971.7   0.12    99242   863.5     180563  5115.6
18       incr  2151    256.2    0.03    1335    21.2      855     235.1
19       incr  2161    258.2    0.03    1509    25.1      686     233.2
20       incr  2167    260.2    0.03    1518    26.2      681     234.0
21       incr  2173    262.1    0.03    1244    26.9      962     235.3
22       full  268940  5993.7   0.47    268640  5722.1    11914   279.1
23       incr  2136    257.2    0.56    1598    18.1      2965    239.3
24       incr  2142    259.1    0.57    1493    22.1      682     237.1

I'd be interested to know if others have seen similar results.

Phil

