Re: [BackupPC-users] Backuppc Does Not Like Mixed-Case Host Config File?
Norbert writes:

I created a new share on an existing web server that I am backing up. The host configuration file was called '1and1-MW-common.pl', consistent with the case of the directory I was backing up, and I added '1and1-MW-common 0 user' to the BackupPC 'hosts' file. The backups were not starting, supposedly because of slow ping times, even though I had set '$Conf{PingMaxMsec} = 1000;' in the configuration file. To make a long story short, it appears that BackupPC converted the host name in the 'hosts' file to lower case and was looking for (but did not find) '1and1-mw-common.pl'. I renamed the host configuration file and everything is sunny once again. I am running BackupPC 2.1.2 on an Ubuntu 6.10 server, waiting for a package to be available before upgrading.

BackupPC forces the host name to lower case. I removed that in 3.0.0beta0, but since it broke some installations, I kept the 2.x behavior in 3.0.0. The original reason for forcing lower case is that some tools that look up the host name (like nmblookup) return upper case.

Craig

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
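For anyone hitting the same symptom, the fix is simply to keep the per-host config file name all lower case, since BackupPC lower-cases the name it reads from the 'hosts' file. A minimal sketch of the rename (the config directory path is an assumption; adjust for your install):

```shell
# BackupPC lower-cases host names from the 'hosts' file, so the
# per-host config file must have an all-lower-case name to be found.
host="1and1-MW-common"
lc_host=$(printf '%s' "$host" | tr 'A-Z' 'a-z')
echo "$lc_host"
# Rename the config file to match (path is an assumption):
# mv "/etc/backuppc/${host}.pl" "/etc/backuppc/${lc_host}.pl"
```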
Re: [BackupPC-users] Backing Up Symbolic Links Using Rsync
Norbert writes:

I am backing up a directory structure on a Linux server running rsync version 2.5.6cvs, protocol version 26. Most of the files and subdirectories are symbolic links to a common 'source' directory structure. It appears that BackupPC is backing up the symbolic links to files, but the XferLOG shows no indication that any of the symbolic links to directories are being processed. So far I have done a full and an incremental backup. Is this something I should worry about if I need to do a restore? I am backing up the 'source' directory separately, so I can always re-establish the links manually. I am running BackupPC 2.1.2 on an Ubuntu 6.10 server, waiting for a package to be available before upgrading.

What are your rsync options? I'd be curious to see the XferLOG file with $Conf{XferLogLevel} set to 4.

Craig
[BackupPC-users] Fwd: BackupPC administrative attention needed - Email alerts
Hello,

I've been using BackupPC for about two months now, and I configured exim to send emails using a smarthost. Since then I've been getting emails from anacron, logwatch and logcheck, but never from BackupPC. Last night I shut my PC down while it was being backed up, and I found the email below in my inbox this morning, so BackupPC clearly can send emails. How can I get the summary report email sent to the users responsible for the hosts, as declared in the hosts file?

Regards,
Richard Bailey

-- Forwarded message --
From: BackupPC
Date: Jun 27, 2007 1:02 AM
Subject: BackupPC administrative attention needed
To: [EMAIL PROTECTED]

The following hosts had an error that is probably caused by a misconfiguration. Please fix these hosts:
- .. (aborted by signal=PIPE)

Regards,
PC Backup Genie
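For per-user mail, BackupPC's nightly BackupPC_sendEmail job writes to the user listed for each host in the hosts file, at that user name plus $Conf{EMailUserDestDomain}; note it only mails users when there is something to report (e.g. backups are getting old), not as a routine summary. A sketch of the relevant config.pl settings (all values here are examples, not your site's):

```perl
# Sender, and recipient of the "administrative attention" mail:
$Conf{EMailFromUserName}  = 'backuppc';
$Conf{EMailAdminUserName} = 'backuppc-admin';

# Domain appended to each host's user (from the hosts file) for
# per-user reminder mail, e.g. richard@example.com:
$Conf{EMailUserDestDomain} = '@example.com';

# Don't mail a user about the same host more often than this (days):
$Conf{EMailNotifyMinDays} = 2.5;

# Mail the user when the host has had no backup for this many days:
$Conf{EMailNotifyOldBackupDays} = 7.0;
```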
Re: [BackupPC-users] Backing Up Symbolic Links Using Rsync
> What are your rsync options? I'd be curious to see the XferLOG file
> with $Conf{XferLogLevel} set to 4.

Craig, the backup command is:

$Conf{RsyncClientCmd} = '$sshPath -q -x -l userid $host $rsyncPath $argList+';

Clearly, I am blind (:-). The symlinks are being backed up, although not flagged as directories, and BackupPC does not recurse into the target directory. Here is an extract from the XferLOG for a normal directory:

  attribSet(dir=f.%2fbioeducation, file=images)
  create d 755 10102/600 4096 images
  attribSet(dir=f.%2fbioeducation, file=images)

and for a symbolically linked directory:

  attribSet(dir=f.%2fbioeducation, file=maintenance)
  pool l 777 10102/600 77 maintenance
  attribSet(dir=f.%2fbioeducation, file=maintenance)

I will do a restore of the common files and directories that are the targets of the symlinks and then restore the symlinks themselves; that will be the true test. Thanks!

Norbert

PS. Thanks for the fast response on the mixed-case host configuration files. It threw me for a loop for a bit, until I noticed that the host-specific section of the GUI did not show any configuration file.
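For reference, what Norbert is seeing is normal rsync '--links' behavior: symlinks to directories arrive as symlinks ('pool l 777 ...') and are never recursed into. If you instead wanted the link targets' contents stored, a dereferencing option would have to go into $Conf{RsyncArgs}. A hedged sketch (whether a given option is honored depends on the rsync and File::RsyncP versions in play, so test before relying on it):

```perl
# Argument list for the rsync transfer method; '--links' is what makes
# symlinks arrive as symlinks rather than being followed.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group',
    '--devices', '--links', '--times', '--recursive',
    # Possible addition (assumption: supported by your rsync/protocol):
    # '--copy-unsafe-links',  # follow links pointing outside the tree
];
```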
[BackupPC-users] Tar errors all of a sudden for localhost only
Hello all, I started getting localhost backup errors. The log file shows:

2007-06-27 13:10:29 incr backup started back to 2007-06-21 19:00:01 for directory /etc
2007-06-27 13:11:45 incr backup started back to 2007-06-21 19:00:01 for directory /var/www
2007-06-27 13:12:06 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/mysql
2007-06-27 13:12:06 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/amavis
2007-06-27 13:12:07 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/spamassassin
2007-06-27 13:12:07 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/clamav
2007-06-27 13:12:08 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/mdadm
2007-06-27 13:12:08 incr backup started back to 2007-06-21 19:00:01 for directory /var/lib/postgrey
2007-06-27 13:12:10 Got fatal error during xfer (Tar exited with error 512 () status)
2007-06-27 13:12:15 Backup aborted (Tar exited with error 512 () status)

It started on the 22nd of June; all other hosts are fine. My localhost .pl is as follows. I am on a Debian etch system, BackupPC version 2.1.2pl1. Thanks to all in advance.

#
# Local server backup of /etc as user backuppc
#
$Conf{XferMethod} = 'tar';
$Conf{TarShareName} = ['/etc', '/var/www', '/var/lib/mysql', '/var/lib/amavis', '/var/lib/spamassassin', '/var/lib/clamav', '/var/lib/mdadm', '/var/lib/postgrey', '/var/mail'];
$Conf{TarClientCmd} = '/usr/bin/env LC_ALL=C $tarPath -c -v -f - -C $shareName'
                    . ' --totals';
# remove extra shell escapes ($fileList+ etc.) that are
# needed for remote backups but may break local ones
$Conf{TarFullArgs} = '$fileList';
$Conf{TarIncrArgs} = '--newer=$incrDate $fileList';

--
Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-
[BackupPC-users] Incremental Backup every two hours
Hi there,

I am trying to do an incremental backup of departmental data on a data server (Word, Excel, Access, pictures, PDF files) every two hours. The departmental data folder is about 80 GB. I have installed BackupPC and done the following:

1) Changed config.pl using the web interface so that backups start every 2 hours: [7, 9, 11, 13, 15, 17, 19]
2) Changed the blackout period from 20 to 6
3) Changed the number of incremental backups to 140
4) Changed the minimum days before the next full backup to 30.97
5) Changed the minimum number of full backups to keep to 1

What else do I need to configure to achieve an incremental backup every 2 hours? Also, if a file has been created on my data server after the full backup, will that file be backed up by every incremental backup?

Thank you in advance for all your help.

Regards,
Chir
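The steps above map onto config.pl roughly as follows. One setting not in the list, $Conf{IncrPeriod}, defaults to 0.97 days and must drop below 2 hours (about 0.083 days), or BackupPC will refuse to start an incremental so soon after the previous backup. The values here mirror the poster's numbers; treat this as a sketch to verify against your version's documentation:

```perl
# 1) Hours at which BackupPC wakes up and may start backups:
$Conf{WakeupSchedule} = [7, 9, 11, 13, 15, 17, 19];

# Minimum days between incrementals; must be < ~0.083 (2 hours) for a
# two-hourly schedule.  The default of 0.97 would block it.
$Conf{IncrPeriod} = 0.07;

# 3) Number of incrementals to keep:
$Conf{IncrKeepCnt} = 140;

# 4) Minimum days between full backups:
$Conf{FullPeriod} = 30.97;

# 5) Minimum number of full backups to keep:
$Conf{FullKeepCnt} = 1;
```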
Re: [BackupPC-users] Tar errors all of a sudden for localhost only
Never mind, got it: I had added /var/lib/postgrey to the backup. Here is an ls of that directory:

joe:/var/lib/postgrey# ls -al
total 10364
drwx------  2 postgrey postgrey     4096 2007-06-22 13:21 .
drwxr-xr-x 41 root     root         4096 2007-06-27 09:31 ..
-rw-------  1 postgrey postgrey    24576 2007-06-22 13:21 __db.001
-rw-------  1 postgrey postgrey   155648 2007-06-22 13:21 __db.002
-rw-------  1 postgrey postgrey   270336 2007-06-22 13:21 __db.003
-rw-------  1 postgrey postgrey    98304 2007-06-22 13:21 __db.004
-rw-------  1 postgrey postgrey    16384 2007-06-22 13:21 __db.005
-rw-------  1 postgrey postgrey 10485760 2007-06-27 13:44 log.01
-rw-------  1 postgrey postgrey    69632 2007-06-27 12:56 postgrey_clients.db
-rw-------  1 postgrey postgrey   942080 2007-06-27 13:44 postgrey.db
-rw-------  1 postgrey postgrey        0 2007-05-30 16:23 postgrey.lock

What should be done to get it to work, in a nice way?

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-
Re: [BackupPC-users] Incremental Backup every two hours
Chir patel [EMAIL PROTECTED] writes:

Hi Chir,

> What else do I need to configure to achieve an incremental backup
> every 2 hours?

That looks right. What error message(s) are you getting?

> Also, if a file has been created on my data server after the full
> backup, will that file be backed up by every incremental backup?

Yes, all files will be backed up; it doesn't matter whether the backup is full or incremental. Keep in mind that BackupPC stores each identical file only once.

Regards,
Rodrigo
Re: [BackupPC-users] Tar errors all of a sudden for localhost only
You need to find a way to give the backuppc user access to those directories (read and execute) and files (read). Look for "How can client access as root be avoided?" in the FAQ.

On Wednesday 27 June 2007 18:43, Rob Morin wrote:
> What should be done to get it to work, in a nice way?

--
Ali
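Concretely, since /var/lib/postgrey is mode 700 and owned by postgrey, the usual resolution is to let the backuppc user run the local tar as root via sudo. A sketch under that assumption (binary paths vary by distro; check yours):

```perl
# Run the local tar through sudo so root can read postgrey's files:
$Conf{TarClientCmd} = '/usr/bin/env LC_ALL=C /usr/bin/sudo $tarPath'
                    . ' -c -v -f - -C $shareName --totals';

# ...and in /etc/sudoers (edit with visudo), allow it passwordless:
#   backuppc ALL=NOPASSWD: /bin/tar
```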