Re: [BackupPC-users] Problems rsyncing WinXP
On 06/09/2011 04:41 PM, Michael Stowe wrote:

>> Hello,
>>
>> I've been trying to back up a WinXP machine with BackupPC and rsyncd
>> for a long time now, but the rsync transfer is always aborted, and I
>> don't know why. In the error log on the web interface I found:
>>
>> full backup started for directory cDrive
>> Connected to b-cdcopy:873, remote version 29
>> Negotiated protocol version 28
>> Connected to module cDrive
>> Sending args: --server --sender --numeric-ids --perms --owner --group
>>   -D --links --hard-links --times --block-size=2048 --recursive
>>   --ignore-times . .
>> Sent exclude: /System Volume Information
>> Sent exclude: /pagefile.sys
>> Sent exclude: /hiberfil.sys
>> Sent exclude: /RECYCLER
>> Sent exclude: /rsyncd
>> Sent exclude: /Dokumente und Einstellungen/*/Lokale Einstellungen/Temporary Internet Files
>> Sent exclude: /Dokumente und Einstellungen/*/Lokale Einstellungen/Temp
>> Sent exclude: /Dokumente und Einstellungen/*/Anwendungsdaten/Microsoft/Search
>> Sent exclude: /Dokumente und Einstellungen/NetworkService/*
>> Sent exclude: /Dokumente und Einstellungen/LocalService/*
>> Sent exclude: /Dokumente und Einstellungen/*/NTUSER.DAT
>> Sent exclude: /Dokumente und Einstellungen/*/ntuser.dat.LOG
>> Sent exclude: /WINDOWS/Temp
>> Sent exclude: /WINDOWS/Registration/*.crmlog
>> Sent exclude: /Temp
>> Sent exclude: *.dat
>> Sent exclude: *.LOG
>> Sent exclude: /WINDOWS/*
>> Sent exclude: *\x{c2}\x{ae}*
>> Xfer PIDs are now 26205
>> [ skipping 5636 lines ]
>> Done: 4580 files, 3227159801 bytes
>> Backup aborted ()
>> Saving this as a partial backup, replacing the prior one (got 4580 and
>>   4580 files versus 0)
>>
>> I use BackupPC 3.2.0 on an OpenSuSE machine; on the Windows XP host I
>> use the cygwin-rsyncd from http://sourceforge.net/projects/backuppc
>>
>> Any ideas why the backup is aborted?
>> Regards
>> Daniel
>
> I can tell you what I encountered, and speculate that you may be
> experiencing something similar: some antivirus programs are protective
> of attempts to read their files or directories, and respond by either
> killing the process or suspending its ability to read files (neither of
> which rsync reacts well to). There are a few other things that can
> confuse and abort rsync:
>
> - Not using vshadow / locked files -- in most cases this is just an
>   error, but in some corner cases rsync seems to abort. I'm not sure
>   what the pattern is.
> - Antivirus and related programs either protect the filesystem or the
>   port from rsync, interpreting it as an attack (Norton and McAfee in
>   particular, but the behavior is by no means restricted to these two).
>   In many cases, the port and binary can be placed into an exception
>   list.
> - rsync may encounter a filename or directory structure it is not able
>   to deal with: for example, a link loop, a particularly long filename,
>   or one containing characters it isn't equipped to handle. Some of
>   these are solvable by using different options when compiling rsync.

Hm, I've tested it for 5 minutes with the normal rsync of my
distribution. That rsync did not stop; I only got the message

  sent 646598 bytes  received 15201260820 bytes  8223915.29 bytes/sec
  total size is 19531542623  speedup is 1.28
  rsync error: some files could not be transferred (code 23) at main.c(1310) [generator=2.6.8]

at the end of the process. So I think the rsyncd on the Windows host is
not the problem.

Regards
Daniel

--
EditLive Enterprise is the world's most technically advanced content
authoring tool. Experience the power of Track Changes, Inline Image
Editing and ensure content is compliant with Accessibility Checking.
http://p.sf.net/sfu/ephox-dev2dev
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/

--
Daniel Spannbauer                          Software Entwicklung
marco Systemanalyse und Entwicklung GmbH   Tel +49 8333 9233-27 Fax -11
Rechbergstr. 4 - 6, D 87727 Babenhausen    Mobil +49 171 4033220
http://www.marco.de/                       Email d...@marco.de
Geschäftsführer Martin Reuter              HRB 171775 Amtsgericht München
Re: [BackupPC-users] Problems rsyncing WinXP
> On 06/09/2011 04:41 PM, Michael Stowe wrote:
>>> [original problem report and XferLOG excerpt trimmed]
>>>
>>> Any ideas why the backup is aborted?
>>
>> [list of likely causes trimmed: antivirus interference, missing
>> vshadow / locked files, problematic filenames]
>
> Hm, I've tested it for 5 minutes with the normal rsync of my
> distribution. That rsync did not stop; I only got the message
>
>   sent 646598 bytes  received 15201260820 bytes  8223915.29 bytes/sec
>   total size is 19531542623  speedup is 1.28
>   rsync error: some files could not be transferred (code 23) at main.c(1310) [generator=2.6.8]
>
> at the end of the process. So I think the rsyncd on the Windows host is
> not the problem.
>
> Regards
> Daniel

If the problem is something I listed, 5 minutes is unlikely to be
sufficient to reproduce. Also, it's worth looking at the detailed log to
see what file(s) it's attempting when it aborts.
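For anyone digging into an abort like this, rsync's exit status narrows
things down considerably. Here is a small lookup sketch for the codes most
relevant to this thread, taken from the EXIT VALUES section of rsync(1)
(code 23 is the one from Daniel's 5-minute test):

```shell
#!/bin/bash
# Map the rsync exit codes most often seen with aborted BackupPC
# transfers to their documented meanings (rsync(1), EXIT VALUES).
explain_rsync_exit() {
  case "$1" in
    0)  echo "success" ;;
    12) echo "error in rsync protocol data stream (remote side may have died)" ;;
    23) echo "partial transfer due to error (some files could not be read)" ;;
    24) echo "partial transfer due to vanished source files" ;;
    30) echo "timeout in data send/receive" ;;
    *)  echo "other error -- see the EXIT VALUES section of rsync(1)" ;;
  esac
}

explain_rsync_exit 23   # the code from Daniel's test
```

Code 23 just means some files were skipped, which is consistent with
antivirus or locked files blocking reads; a hard mid-transfer abort more
often shows up as 12 or 30.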
[BackupPC-users] archive to ftp?
I have BackupPC running and tested, and it works great, but our company
requires offsite storage of backups. Someone used to take a tape home
each night when we used Amanda. I've read the docs on the archive
function, and they say BackupPC supports archiving to removable media:
for users that require offsite backups, BackupPC can create archives
that stream to tape devices, or create files of specified sizes to fit
onto CD or DVD media.

I would like to send these archives to an FTP site. Are there any docs
on how to do something like this? Also, are there docs on how to do a
restore from an archive?
Re: [BackupPC-users] Backup of VM images
Jeffrey J. Kosowsky <backuppc at kosowsky.org> writes:

> Just as an FYI, BackupPC uses a more limited blocksize range that does
> not get that huge. In fact, the block size ranges from 2048 to 16384,
> with the values within the range set by int(file_size/1).

Thanks Jeffrey. I did a little more research, and newer versions of
rsync have a dynamically-sized hash table, so my comment about it going
CPU-bound because of hash-table collisions depends on the version of
rsync being used.

Jim
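The clamped-range behavior Jeffrey describes can be sketched as below.
Note this is purely illustrative: only the 2048..16384 range comes from
his message; the divisor of 10000 is an assumption (the quoted formula
appears truncated), so check BackupPC's source for the real one.

```shell
#!/bin/bash
# Illustrative clamp of a file-size-derived block size into BackupPC's
# 2048..16384 range. NOTE: the divisor (10000) is an assumption; the
# message above gives only the range, not the exact formula.
block_size() {
  local size=$1
  local bs=$(( size / 10000 ))
  (( bs < 2048 ))  && bs=2048
  (( bs > 16384 )) && bs=16384
  echo "$bs"
}

block_size 1000000      # small file: floor of 2048 applies
block_size 50000000     # mid-size file: scales with size
block_size 900000000    # huge file: capped at 16384
```

The point of the cap is that the per-file checksum table stays bounded
no matter how large the file is, which is why the hash-collision blowup
Jim mentions for stock rsync on huge files doesn't apply here.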
Re: [BackupPC-users] 18 hour incremental backups?
Hi,

Boris HUISGEN wrote on 2011-03-02 15:53:10 +0100 [Re: [BackupPC-users]
18 hour incremental backups?]:

> The compression is disabled (level 0 = no compression)

just for the archives: that's complete nonsense. What *should* be
disabled is top-posting. Level 0 means full backup, not compression
disabled. And disabled compression would usually not make the backups
slower, not by orders of magnitude in any case. In fact, if there are a
lot of files not yet to be found in the pool, disabling compression
should make the backup run *faster*.

> On 28/02/11 18:09, Rob Morin wrote:
>> Hello all! I can't help but wonder why incremental backups would take
>> 18 hours.
>>
>> 2011-02-26 23:00:10 incr backup started back to 2011-02-21 20:00:01
>>   (backup #0) for directory /etc
>> 2011-02-26 23:02:15 incr backup started back to 2011-02-21 20:00:01
>>   (backup #0) for directory /home
>> 2011-02-27 16:34:08 incr backup started back to 2011-02-21 20:00:01
>>   (backup #0) for directory /usr/local/src
>> 2011-02-27 16:35:41 incr backup started back to 2011-02-21 20:00:01
>>   (backup #0) for directory /var/lib/mysql/mysql_backup
>> 2011-02-27 18:01:46 incr backup 5 complete, 12825 files, 28192345844
>>   bytes, 0 xferErrs (0 bad files, 0 bad shares, 0 other)
>>
>> The home dir is under 50 gigs, and it's incremental!

[I hope you've solved the problem in the meantime, but I thought I'd add
something for the benefit of anyone finding this in the archives. If you
found out what the problem was, you could share your results.]

Well, you see that it *is* the home dir that is taking 17.5 of 19 hours,
and you seem to be transferring 28 GB in total, so I'd guess there are a
lot of changes. In fact, the next incremental backup, based on the same
full backup, ran significantly faster. You could look into the XferLOG
for that backup to get an idea of what was actually happening (lots of
small temporary files?).
>> Backup#  Type  Filled  Level  Start Date  Duration/mins  Age/days  Server Backup Path
>> 0        full  yes     0      2/21 20:00  202.1          6.7       /var/lib/backuppc/pc/d9.interhub.local/0
>> [...]
>> 5        incr  no      1      2/26 23:00  1141.6         1.5       /var/lib/backuppc/pc/d9.interhub.local/5
>> 6        incr  no      1      2/27 23:00  566.9          0.5       /var/lib/backuppc/pc/d9.interhub.local/6

Still, the fact that your full backup is considerably faster does seem
strange. From the information you've given, I can't even begin to guess
what might be wrong.

Regards,
Holger
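Holger's suggestion to dig through the XferLOG can be rough-sketched
with awk. The line format below (an action word, attributes, then a
relative path in the last field) is only an assumption loosely modeled
on BackupPC 3.x XferLOG output, and the sample data is made up purely
for illustration; real logs would come from BackupPC_zcat.

```shell
#!/bin/bash
# Count file activity per top-level directory in a decompressed XferLOG
# dump, to spot where "lots of small temporary files" might come from.
# The sample lines are fabricated; the format is an assumption.
cat > /tmp/xferlog.txt <<'EOF'
create d 755 0/0 4096 home/alice
pool 644 0/0 1024 home/alice/.bashrc
pool 644 0/0 2048 home/alice/notes.txt
create d 755 0/0 4096 etc
pool 644 0/0 512 etc/hosts
EOF

# Tally the first path component of each create/pool line.
awk '$1 ~ /^(create|pool)$/ { split($NF, p, "/"); n[p[1]]++ }
     END { for (d in n) print n[d], d }' /tmp/xferlog.txt | sort -rn
```

A directory with tens of thousands of entries at the top of this list
would be the first place to look for churn.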
Re: [BackupPC-users] archive to ftp?
On 6/10/2011 9:06 AM, Joe Konecny wrote:
> I have backuppc running and tested and it works great but our company
> requires offsite storage of backups. [...] I would like to send these
> archives to an ftp site. Is there any docs on how to do something like
> this? Also is there docs on how to do a restore from an archive?

First, be sure you understand what the archiving feature does. It will
create a tar image of the latest backup of a host (with incrementals
merged with the previous full as necessary), optionally compressed and
split into chunks. You can restore these without needing BackupPC (just
zcat the chunks and pipe to tar), but they don't provide earlier history
and don't have any pooling of duplicate data.

If this is really what you want offsite, you could have BackupPC write
to a directory (possibly NFS-mounted from a different machine, but
something local), then rsync or ftp the resulting files to the offsite
location. If you want finer control of the archive generation, you can
script your own with BackupPC_tarCreate piped through gzip and split.

--
Les Mikesell
lesmikes...@gmail.com
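The create-and-restore round trip Les describes can be sketched like
this. A plain tar of a scratch directory stands in for the output of
BackupPC_tarCreate (whose actual host/share/backup-number arguments
depend on your installation), so treat all paths and names here as
placeholders:

```shell
#!/bin/bash
set -e
# Create: tar | gzip | split into fixed-size chunks, as a stand-in for
# "BackupPC_tarCreate ... | gzip | split".
mkdir -p /tmp/demo/src /tmp/demo/out /tmp/demo/restore
echo "payload" > /tmp/demo/src/file.txt

tar -C /tmp/demo/src -cf - . | gzip | split -b 1M - /tmp/demo/out/host.tar.gz.

# Restore without BackupPC: concatenate the chunks in order, decompress,
# and untar -- split's default aa, ab, ... suffixes sort correctly.
cat /tmp/demo/out/host.tar.gz.* | zcat | tar -C /tmp/demo/restore -xf -
cat /tmp/demo/restore/file.txt
```

The resulting host.tar.gz.* chunks are what you would push to the FTP
site; the restore half is exactly the "zcat the chunks and pipe to tar"
step, and needs nothing from the BackupPC server.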
Re: [BackupPC-users] archive to ftp?
On 6/10/2011 10:55 AM, Les Mikesell wrote:
> [explanation of the archive feature trimmed]
>
> If this is really what you want offsite, you could have backuppc write
> to a directory (possibly nfs mounted from a different machine, but
> something local), then rsync or ftp the resulting files to the offsite
> location. If you want finer control of the archive generation, you can
> script your own with BackupPC_tarCreate piped through gzip and split.

What you describe would be great. Thanks!