[BackupPC-users] Linux backups with rsync vs tar
Hello list. I have tried to relate to the various threads I have found regarding issues with the BackupPC/rsync combination, especially the 2007 thread with Les Mikesell and Timothy J. Massey, and more recently the question by Pavel Hofman and the response from Holger Parplies. But none of these have explained or resolved the problems I am facing.

I'm using BackupPC to take daily backups of a maildir totaling 250 GB, with average file sizes of 500 MB (text mailboxes; these files change every day). Currently, my setup takes full backups once a week and incremental backups every day between the fulls. The servers are directly connected with a crossover cable, allowing 100 Mbps. However, these backups take about 8 hours to complete, averaging 8 MB/s, and the BackupPC server is CPU-bound throughout the entire process. Thus I have reason to suspect rsync overhead as the culprit. Note that I have disabled hard links, implemented checksum caching, increased the block size to 512 KB and enabled --whole-file, all to no avail.

With this background, I would appreciate answers to the following questions:

1. Since over 90% of the files change every day and incremental backups involve transferring the whole file to the BackupPC server, wouldn't it make better sense to just run a full backup every day?

2. From Pavel's questions, he observed that BackupPC is unable to recover from an interrupted tar transfer. Such interruptions simply cannot happen in my case. Should I switch to tar? And in the unlikely event that the transfer does get interrupted, what mechanisms do I need to implement to resume or recover from the failure?

3. What is the recommended process for switching from rsync to tar, since the formats/attributes are reportedly incompatible? I would like to preserve the existing compressed backups as much as possible.

Regards,
Charles 'Boyo

+--
|This was sent by charlesb...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
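A back-of-the-envelope check of the figures quoted above (250 GB in about an 8-hour window), using plain shell arithmetic:

```shell
# Throughput implied by the reported backup window: 250 GB in 8 hours.
bytes=$((250 * 1000 * 1000 * 1000))   # 250 GB (decimal units)
secs=$((8 * 3600))                    # 8-hour window
echo "$((bytes / secs / 1000000)) MB/s"   # prints: 8 MB/s
```

Roughly 8 MB/s is about 70 Mbps, i.e. a large fraction of the 100 Mbps crossover link, so both the link itself and the observed CPU load are plausible bottlenecks worth measuring separately before blaming rsync alone.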
+-- -- Special Offer -- Download ArcSight Logger for FREE! Finally, a world-class log management solution at an even better price-free! And you'll get a free Love Thy Logs t-shirt when you download Logger. Secure your free ArcSight Logger TODAY! http://p.sf.net/sfu/arcsisghtdev2dev ___ BackupPC-users mailing list BackupPC-users@lists.sourceforge.net List:https://lists.sourceforge.net/lists/listinfo/backuppc-users Wiki:http://backuppc.wiki.sourceforge.net Project: http://backuppc.sourceforge.net/
[BackupPC-users] Large file backup problem (12GB)
I apologise if my question has been answered before (I didn't find it). I have a problem when BackupPC (3.2.1) tries to back up a large file, about 12 GB in size. I use a CentOS BackupPC server backing up a CentOS client with rsync over ssh. I first saw this using CentOS 5 and rsync 2.x; I therefore upgraded to CentOS 6, which has rsync 3.x, as a Google search indicated there might be an issue with rsync that was fixed in 3.0.x. But the problem still exists...

More details. The source file (ls -l) on the client:

-rwxr- 1 user1 server 12773621760 Feb 4 2011 2010-12-19 Huge HD recording 2010.m2ts

The transfer of data does not stop, and I had to stop it manually when the backup file reached ~1/2 TB on the BackupPC server (in the TopDir/pc structure):

-rw-r- 1 backuppc backuppc 499482475314 Aug 31 11:59 f2010-12-19 Huge HD recording 2010.m2ts

I don't know the processes involved, but it seemed to me that /usr/share/BackupPC/bin/BackupPC_dump was the process writing to the 1/2 TB file (just guessing here). I have other files of sizes up to 6 GB where the backup succeeds. Is there some limitation above that, and where does it sit: in BackupPC, in rsync, or somewhere else? What can I do?

+--
|This was sent by d...@hegreberg.com via Backup Central.
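One way to read the numbers in this report (my interpretation, not something confirmed in the thread): the aborted dump had grown to roughly 39 times the source file's size, which looks more like data being re-sent in a loop than a hard size cutoff. The ratio follows directly from the two sizes quoted above:

```shell
# Sizes quoted in the report: source file vs. the partial dump at abort time.
src_bytes=12773621760      # 12 GB source file on the client
dump_bytes=499482475314    # ~1/2 TB partial file on the BackupPC server
echo "dump is $((dump_bytes / src_bytes))x the source"   # prints: dump is 39x the source
```

If the transfer were merely hitting a size limit, the partial file would stall at or below the source size rather than growing far past it.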
Re: [BackupPC-users] My problems with restore.
On Wed, Aug 31, 2011 at 3:34 AM, Joe Skop joe.s...@gmail.com wrote:
> I had a lot of problems. The solutions for recovering were, in my case:
> - Create an archive with the whole files, transfer it to the new instance,
>   decompress it, and put it in the right place;
> - Download single files or folders, no bigger than 70-100 MB (or the
>   download process hangs), transfer to the new instance, and proceed as
>   before.

Hangs? Is this a network problem?

> If I send the recovery via shell:
>
> /usr/bin/ssh -x -l USER localhost env LC_ALL=C /bin/tar -x -p --numeric-owner --same-owner -v -f - -C /tmp /usr/share/backuppc/bin/BackupPC_tarCreate -h SERVER_ORIG -n 134 -s /mnt/dump -t -r /mysql -p /mysql/ /mysql/backup.log
>
> /bin/tar: You may not specify more than one `-Acdtrux' option
> Try `/bin/tar --help' or `/bin/tar --usage' for more information.

I don't understand what you are trying to do there. Can you explain where you are when you issue that command and where you expect the parts to execute? I assume you want the output of BackupPC_tarCreate on the backuppc server piped into a tar extract (which probably needs to run as root) on the host you are restoring, but that command won't do it.

--
Les Mikesell
lesmikes...@gmail.com
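The shape Les is describing, a tar-stream producer piped into a tar extract, can be sketched with plain tar standing in for BackupPC_tarCreate and the ssh hop left out (both simplifications for illustration; in a real restore the left side runs on the BackupPC server and the right side runs, via ssh, on the restore target):

```shell
# Producer | consumer: the structure a manual restore pipeline must have.
src=$(mktemp -d)   # stand-in for the backup contents
dst=$(mktemp -d)   # stand-in for the restore target

echo "hello" > "$src/backup.log"

# Left of the pipe: emit a tar stream (BackupPC_tarCreate in real life).
# Right of the pipe: extract it (normally wrapped in "ssh root@target ...").
tar -C "$src" -cf - . | tar -C "$dst" -xpf -

cat "$dst/backup.log"   # prints: hello
```

The essential part is the `|` between the two commands; without it, the extracting tar sees the producer's invocation as extra options of its own, which is exactly what the quoted `-Acdtrux` error complains about.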
Re: [BackupPC-users] Linux backups with rsync vs tar
tar is faster since it doesn't spend hours building a file list when there are thousands or millions of files involved.
[BackupPC-users] excluding files from backup
After further investigation, I believe my file exclusions are not working in BackupPC. I checked some random machine transfer logs and I see lots of entries for /WINDOWS/... In my config.pl this (and other) directories should be excluded. I am using SMB to do backups (XP machines mostly). Do I need anything besides this?

$Conf{BackupFilesExclude} = {
    '*' => [
        '/Documents and Settings/*/Local Settings/Temporary Internet Files/',
        '/Documents and Settings/*/Local Settings/Temp/',
        '/Documents and Settings/*/NTUSER.DAT',
        '/Documents and Settings/*/ntuser.dat.LOG',
        '/Documents and Settings/*/Local Settings/Application Data/Microsoft/Windows/UsrClass.dat',
        '/Documents and Settings/*/Local Settings/Application Data/Microsoft/Windows/UsrClass.dat.LOG',
        '/Documents and Settings/*/Local Settings/Application Data/Mozilla/Firefox/Profiles/*/Cache/',
        '/Documents and Settings/*/Local Settings/Application Data/Mozilla/Firefox/Profiles/*/OfflineCache/',
        '/Documents and Settings/*/Recent/',
        '*.lock',
        'Thumbs.db',
        'IconCache.db',
        'Cache',
        'cache',
        '/WINDOWS/',
        '/RECYCLER/',
        '/MSOCache/',
        '/System Volume Information/',
        '/AUTOEXEC.BAT',
        '/BOOTSECT.BAK',
        '/CONFIG.SYS',
        '/hiberfil.sys',
        '/pagefile.sys',
        '/WINNT/'
    ]
};
Re: [BackupPC-users] excluding files from backup
Hi,

SSzretter wrote on 2011-08-31 13:37:13 -0400 [[BackupPC-users] excluding files from backup]:
> After further investigation, I believe my file exclusions are not working
> in backuppc. I checked some random machine transfer logs and I see lots
> of entries for /WINDOWS/... In my config.pl this (and other) directories
> should be excluded. I am using SMB to do backups (xp machines mostly):
> Do I need anything besides this: ?

Err, beside the question mark? You need to not have set BackupFilesOnly. What does your log file say?

Regards,
Holger
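A quick way to act on Holger's hint; the paths here are assumptions (config locations vary by distro and BackupPC version), so adjust them to your install:

```shell
# Check whether BackupFilesOnly is set in the main config or in any
# per-host override. Holger's point is that it must not be set for the
# BackupFilesExclude list to take effect.
conf_dir=/etc/BackupPC    # assumed location; some installs use /etc/backuppc
grep -n 'BackupFilesOnly' "$conf_dir"/config.pl "$conf_dir"/pc/*.pl 2>/dev/null
```

Any non-commented hit here (in config.pl or a per-host .pl file) would explain why the excludes appear to be ignored.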
Re: [BackupPC-users] My problems with restore.
Hi, thanks Les for the answer.

On Wed, Aug 31, 2011 at 3:34 AM, Joe Skop joe.skop@... wrote:
>> [...] - Download single files or folders, no bigger than 70-100 MB (or
>> the download process hangs), transfer to the new instance, and proceed
>> as before.
>
> Hangs? Is this a network problem?

Yes, maybe some timeouts on the network... I never investigated this unconventional problem much.

>> If I send the recovery via shell:
>>
>> /usr/bin/ssh -x -l USER localhost env LC_ALL=C /bin/tar -x -p --numeric-owner --same-owner -v -f - -C /tmp /usr/share/backuppc/bin/BackupPC_tarCreate -h SERVER_ORIG -n 134 -s /mnt/dump -t -r /mysql -p /mysql/ /mysql/backup.log
>>
>> /bin/tar: You may not specify more than one `-Acdtrux' option
>> Try `/bin/tar --help' or `/bin/tar --usage' for more information.
>
> I don't understand what you are trying to do there. Can you explain where
> you are when you issue that command and where you expect the parts to
> execute? I assume you want the output of BackupPC_tarCreate on the
> backuppc server piped into a tar extract (which probably needs to run as
> root) on the host you are restoring, but that command won't do it.

From the web interface, I select a host, "Backup browse for HOST", I browse the folders, I select an example file, I press "Restore selected files", I go to "Restore Options for HOST", and I choose "Option 1: Direct Restore". I select localhost (that is the server that hosts BackupPC), and I choose the share and the dir. On the next screen, "Are you sure?", the positions are:

Original file/dir: HOST:/mnt/dump/mysql/backup.log
Will be restored to: localhost:/tmp/mysql/backup.log

If I press Restore, after a few seconds I get the message: Error: restore failed: Tar exited with error 65280 () status.

If I try manually, by shell, as the correct user etc., to run the same command line, I get the error: /bin/tar: You may not specify more than one `-Acdtrux' option.

I hope the path I follow is clearer now.

Thanks, Regards
JS
Re: [BackupPC-users] My problems with restore.
On Wed, Aug 31, 2011 at 3:13 PM, Joe Skop joe.s...@gmail.com wrote:
>>> - Download single files or folders, no bigger than 70-100 MB (or the
>>> download process hangs), transfer to the new instance, and proceed as
>>> before.
>>
>> Hangs? Is this a network problem?
>
> Yes, maybe some timeouts on the network... I never investigated this
> unconventional problem much.

The quick fix, regardless of the real problem, is probably to run the web browser on the backuppc server itself or some nearby machine with a good network connection and download a tar image of what you need through the browser. Then use whatever transfer method you think will be reliable to get the file where you want it, and do the tar extract.

>>> If I send the recovery via shell:
>>>
>>> /usr/bin/ssh -x -l USER localhost env LC_ALL=C /bin/tar -x -p --numeric-owner --same-owner -v -f - -C /tmp /usr/share/backuppc/bin/BackupPC_tarCreate -h SERVER_ORIG -n 134 -s /mnt/dump -t -r /mysql -p /mysql/ /mysql/backup.log
>>>
>>> /bin/tar: You may not specify more than one `-Acdtrux' option
>>> Try `/bin/tar --help' or `/bin/tar --usage' for more information.
>>
>> I don't understand what you are trying to do there. Can you explain where
>> you are when you issue that command and where you expect the parts to
>> execute? I assume you want the output of BackupPC_tarCreate on the
>> backuppc server piped into a tar extract (which probably needs to run as
>> root) on the host you are restoring, but that command won't do it.
>
> From the web interface, I select a host, "Backup browse for HOST", I
> browse the folders, I select an example file, I press "Restore selected
> files", I go to "Restore Options for HOST", and I choose "Option 1:
> Direct Restore". I select localhost (that is the server that hosts
> BackupPC), and I choose the share and the dir. On the next screen, "Are
> you sure?", the positions are:
>
> Original file/dir: HOST:/mnt/dump/mysql/backup.log
> Will be restored to: localhost:/tmp/mysql/backup.log
>
> If I press Restore, after a few seconds I get the message: Error: restore
> failed: Tar exited with error 65280 () status.

Did you set up ssh access to localhost in the same way as for the other hosts (i.e. so the backuppc user can run commands as root over ssh)? And if the keys are set up, have you done a test run to answer the prompt you get on the first ssh connection attempt?

> If I try manually, by shell, as the correct user etc., to run the same
> command line, I get the error: /bin/tar: You may not specify more than
> one `-Acdtrux' option.
>
> I hope the path I follow is clearer now.

No, the shell side is still not clear. The tar extract command and the BackupPC_tarCreate command are two separate things. You can pipe the output of BackupPC_tarCreate to the input of a tar extract, and you can use ssh to make it happen on a different host, but I don't see the pipe setup in your shell command.

--
Les Mikesell
lesmikes...@gmail.com
Re: [BackupPC-users] My problems with restore.
Hi,

Les Mikesell wrote on 2011-08-31 17:01:24 -0500 [Re: [BackupPC-users] My problems with restore.]:
> On Wed, Aug 31, 2011 at 3:13 PM, Joe Skop joe.s...@gmail.com wrote:
>>> - Download single files or folders, no bigger than 70-100 MB (or the
>>> download process hangs), transfer to the new instance, and proceed as
>>> before.
>>
>> Hangs? Is this a network problem?

It could be a full FS wherever the browser decides to temporarily store the download (no, that's not where you tell it to save it to).

>> Yes, maybe some timeouts on the network... I never investigated this
>> unconventional problem much.
>
> The quick fix, regardless of the real problem, is probably to run the web
> browser on the backuppc server itself or some nearby machine with a good
> network connection and download a tar image of what you need through the
> browser.

Yes, maybe there's more tmp space there ;-).

> Then use whatever transfer method you think will be reliable to get the
> file where you want it, and do the tar extract.

That's what I like about the 'quick and easy solutions' instead of configuring your system correctly: they're often neither quick nor easy, but you usually don't notice until you've spent more time trying them than you would have setting things up correctly in the first place ;-).

>> If I send the recovery via shell:
>>
>> /usr/bin/ssh -x -l USER localhost env LC_ALL=C /bin/tar -x -p --numeric-owner --same-owner -v -f - -C /tmp /usr/share/backuppc/bin/BackupPC_tarCreate -h SERVER_ORIG -n 134 -s /mnt/dump -t -r /mysql -p /mysql/ /mysql/backup.log
>>
>> /bin/tar: You may not specify more than one `-Acdtrux' option
>> Try `/bin/tar --help' or `/bin/tar --usage' for more information.

Yes. tar is absolutely right.

> I don't understand what you are trying to do there.

I do, but it's not correct.

>> From the web interface, I select a host, "Backup browse for HOST", I
>> browse the folders, I select an example file, I press "Restore selected
>> files", I go to "Restore Options for HOST", and I choose "Option 1:
>> Direct Restore". I select localhost (that is the server that hosts
>> BackupPC), and I choose the share and the dir. On the next screen, "Are
>> you sure?", the positions are:
>>
>> Original file/dir: HOST:/mnt/dump/mysql/backup.log
>> Will be restored to: localhost:/tmp/mysql/backup.log
>>
>> If I press Restore, after a few seconds I get the message: Error:
>> restore failed: Tar exited with error 65280 () status.

As Les said, for that to work, you need to set up ssh access to localhost (or whatever replacement host you want to use) in *exactly* the same way as you did for the target host. This is not something you'd normally do (unless, maybe, you are backing up your BackupPC server), so you probably haven't. If you had, it would probably work.

>> If I try manually, by shell, as the correct user etc., to run the same
>> command line, I get the error: /bin/tar: You may not specify more than
>> one `-Acdtrux' option.

Well, what you quoted above is not even *nearly* the same command line. Look again. (I'll admit I haven't seen a log file entry for a direct restore, because I've never done one and don't intend to. If it should really look like what you quoted above, then that's a bug. However, I'd be surprised if that's the case.)

As Les has stated twice, you need to have something along the lines of

  BackupPC_tarCreate ... | ssh ... tar -x ...

(with the appropriate arguments and possibly paths to the commands). Simply giving the BackupPC_tarCreate invocation as extraneous parameters to the remote tar command is guaranteed to never do *anything* meaningful (let alone what you want), because the '-t' option of BackupPC_tarCreate will confuse tar (which has already seen the '-x' option). You can leave out the '-t' option if you want to see a different error message (or possibly tar will just patiently wait for a tar stream to extract). Or you could put together the correct command. Chances are, it will be

  /usr/share/backuppc/bin/BackupPC_tarCreate -h SERVER_ORIG -n 134 -s /mnt/dump -t /mysql/backup.log | /usr/bin/ssh -x -l USER localhost /bin/tar -x -p --numeric-owner --same-owner -v -f - -C /tmp

but only if your quote was correct apart from the order.

If you're wondering why I left out the '-r' and '-p' arguments, it's because replacing '/mysql' by '/mysql/' is pointless. Likewise, you'll probably understand tar's output without LC_ALL=C (BackupPC might not). I could have changed 'ssh' into 'sudo'. The point is, if you understand what the parts do, it's simple to rearrange or modify them to do what you need. If you don't, you'll end up copying something incorrectly and wondering why it doesn't work.

It's not really complicated. BackupPC_tarCreate creates a tar from the backup from host SERVER_ORIG, number 134, share /mnt/dump, file(s) /mysql/backup.log. '-t' makes it print summary totals. You don't need them, so you can remove that if you want to. If you wanted more files/directories