Re: [BackupPC-users] Can't Fork Crash on Nexenta (Solaris)
I am running BackupPC 3.2.0. The line where it fails is:

    if ( !defined($pid = open(CHILD, "-|")) ) {

So it looks like it is attempting to fork...

Stephen Gelman
Systems Administrator

On Apr 28, 2011, at 11:55 PM, Holger Parplies wrote:

> Hi,
>
> Stephen Gelman wrote on 2011-04-20 22:57:38 -0500 [[BackupPC-users] Can't Fork Crash on Nexenta (Solaris)]:
>> On Nexenta (which is essentially an OpenSolaris derivative), I seem to
>> have issues where BackupPC crashes every once in a while. When it
>> crashes, the log says:
>>
>>     Can't fork at /usr/share/backuppc/lib/BackupPC/Lib.pm line 1340.
>>
>> Any ideas how to prevent this?
>>
>> Stephen Gelman
>> Systems Administrator
>
> errm, have fewer processes running on your machine? What errno is that?
>
> Line 1340 contains a Perl 'return' statement, so that's strange (since
> you didn't mention it, you must be using BackupPC 3.2.0beta0, because
> that's the version I happened to check). Which log file? How come
> BackupPC writes something to the log if it crashes? BackupPC doesn't
> even *try* to fork via Lib.pm (only BackupPC_{dump,restore,archive}
> appear to use that), and failures to fork in the daemon are certainly
> not fatal (except for daemonizing on startup). Very strange.
>
> Oh, and we know what Nexenta is. That's the part you wouldn't have
> needed to explain.
>
> Regards,
> Holger

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
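Following up on Holger's "what errno is that?" question: on Solaris-like systems, fork() most often fails with EAGAIN (per-user process limit reached) or ENOMEM (reservable swap exhausted, since Solaris reserves swap for the child at fork time). A minimal sketch of the first things worth checking, assuming a POSIX shell is available on the box:

```shell
# Hypothetical first checks when fork() starts failing intermittently.
# fork() returns EAGAIN when the per-user process limit is hit, and
# ENOMEM when there is not enough reservable swap for the child.
ulimit -u          # per-user process limit ("unlimited" or a number)
ps -e | wc -l      # rough count of processes currently running
```

On Solaris proper, `swap -s` and `prstat` would give the memory side of the picture; if the process count sits near the limit when backups kick off, raising the limit or lowering $Conf{MaxBackups} are the obvious levers.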
[BackupPC-users] Can't Fork Crash on Nexenta (Solaris)
On Nexenta (which is essentially an OpenSolaris derivative), I seem to have issues where BackupPC crashes every once in a while. When it crashes, the log says:

    Can't fork at /usr/share/backuppc/lib/BackupPC/Lib.pm line 1340.

Any ideas how to prevent this?

Stephen Gelman
Systems Administrator
BlueStar Energy Services
[BackupPC-users] BackupPC_tarCreate Problem
Hi,

I am running BackupPC 3.1.0 on Nexenta. It seems to be working for the most part. I am having a problem with BackupPC_tarCreate. I am trying to create a tar of a 30 GB backup. The tar I create ends up being 30 GB, but when extracted it only takes up 5 GB and is missing a lot of files. I can restore the missing files using the web interface, so I know that they are being backed up and that BackupPC has permission to access them. Does anyone have any idea what's going on? The only clue I have is that I repeatedly get

    tar: Skipping to next header

when untarring the file.

Thanks!

Stephen
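A quick way to catch this kind of corruption before extracting anything is to list the archive with `-t`, which forces tar to walk every header without writing files. A sketch, using a throwaway archive to stand in for the BackupPC_tarCreate output (the BackupPC host/share arguments shown in the comment are assumptions, not taken from this thread):

```shell
# With BackupPC the pipeline would look something like (paths assumed):
#   BackupPC_tarCreate -h myhost -n -1 -s /share . > test.tar
# Here a throwaway archive stands in for that output.
mkdir -p /tmp/tarcheck/data
echo "hello" > /tmp/tarcheck/data/file.txt
tar -cf /tmp/tarcheck/test.tar -C /tmp/tarcheck data

# -t reads every header without extracting, so "Skipping to next
# header" style damage surfaces immediately and cheaply.
tar -tf /tmp/tarcheck/test.tar > /dev/null && echo "archive OK"
```

If `-t` already reports skipped headers, the archive was bad as written and the extraction step can be ruled out.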
Re: [BackupPC-users] BackupPC_tarCreate Problem
Using gnu tar. This happens both if I pipe the output of BackupPC_tarCreate directly to tar and if I untar from the file. More specifically, the tar command I am using is:

    tar -xf - -C MYDIRECTORY

No compression, and the archive is staying on the same server.

Stephen

On 08/11/2010 12:33 PM, John Rouillard wrote:

> On Wed, Aug 11, 2010 at 10:23:59AM -0500, Stephen Gelman wrote:
>> I am running BackupPC 3.1.0 on Nexenta. [...] I am trying to create a
>> tar of a 30 GB backup. The tar I create ends up being 30 GB, but when
>> extracted it only takes up 5 GB and is missing a lot of files. [...]
>> The only clue I have is that I repeatedly get "tar: Skipping to next
>> header" when untarring the file.
>
> Which tar are you using to do the restore: native solaris /usr/bin/tar
> (or /usr/sbin/static/tar), gnu tar, pax?
>
> How are you supplying the 30 GB file to the restoring tar: stdin, as a
> file on the command line ...? Do you have any compression in the
> picture? Also, are you moving between architectures, or little- to
> big-endian machines?
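John's "which tar" question matters because Solaris puts its native tar first on the PATH, and it chokes on some GNU extensions. One way to check which implementation `tar` actually resolves to (the fallback message is just an illustration, not from the thread):

```shell
# GNU tar answers --version; Solaris native tar rejects the flag,
# so the fallback message fires instead.
tar --version 2>/dev/null | head -n 1 || echo "not GNU tar (native tar?)"
```

If the restore shell resolves `tar` to /usr/bin/tar while the tests were done with GNU tar (often installed as `gtar` on Solaris), that alone could explain differing results.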
Re: [BackupPC-users] BackupPC_tarCreate Problem
When I try to extract the tar file using pax, I get a lot of errors similar to the following:

    pax: checksum error on header record : 4m^[.^UUWD/1
    pax: test.tar : This doesn't look like a tar archive
    pax: test.tar : Skipping to next file...
    pax: checksum error on header record : CE/9j^G:ly4O
    pax: test.tar : Skipping to next file...

On 08/11/2010 03:35 PM, John Rouillard wrote:

> On Wed, Aug 11, 2010 at 01:44:19PM -0500, Stephen Gelman wrote:
>> On 08/11/2010 12:33 PM, John Rouillard wrote:
>>> [...] Which tar are you using to do the restore: native solaris
>>> /usr/bin/tar (or /usr/sbin/static/tar), gnu tar, pax? How are you
>>> supplying the 30 GB file to the restoring tar: stdin, as a file on
>>> the command line ...? Do you have any compression in the picture?
>>
>> Using gnu tar. This happens both if I pipe the output of
>> BackupPC_tarCreate directly to tar and if I untar from the file.
>> More specifically, the tar command I am using is tar -xf - -C
>> MYDIRECTORY. No compression, and the archive is staying on the same
>> server.
>
> I assume you are doing this locally on the nexenta box w/o ssh etc.
> That should rule out blocking issues and network corruption issues.
> But have you tried setting blocking to 20 explicitly in gnu tar (I
> think that's what BackupPC_tarCreate uses)?
>
> Does:
>
>     tar -xvf - -C MYDIRECTORY
>
> change anything? What are your arguments to BackupPC_tarCreate? You
> aren't redirecting errors from BackupPC_tarCreate onto stdout using
> 2>&1, are you?
>
> Can you restore a subset of the 30 GB of data using:
>
>     BackupPC_tarCreate ... -s share directory/path
>
> where directory/path is a directory that is not being restored in the
> 30 GB backup but is present in the backuppc web interface?
>
> If you have pax installed, does using it in place of tar produce
> better diagnostics (e.g. why it doesn't look like a file header)?
>
> If you can get a smaller subset of restored files to produce the
> error, then you could look at the tar file and try to figure out what
> is confusing tar. If you see a lot of nulls where there shouldn't be
> any, maybe a bad block factor despite what I said above?
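To "look at the tar file and figure out what is confusing tar", one concrete check is the header magic: every 512-byte tar header carries the string "ustar" at byte offset 257 (in both POSIX ustar and GNU formats). A sketch, again with a throwaway archive standing in for the suspect test.tar:

```shell
# Build a known-good archive to demonstrate the check; for the real
# problem, point dd at the suspect test.tar instead.
mkdir -p /tmp/tarmagic
echo "data" > /tmp/tarmagic/file.txt
tar -cf /tmp/tarmagic/test.tar -C /tmp/tarmagic file.txt

# Sample 5 bytes at offset 257 of the first header block.
dd if=/tmp/tarmagic/test.tar bs=1 skip=257 count=5 2>/dev/null   # prints "ustar"
```

If the suspect archive's first header lacks the magic, the stream was garbage from the start (e.g. stderr mixed into stdout); if the magic is present early but pax starts reporting checksum errors partway through, the corruption begins mid-stream, which points at a specific file or a size-field mismatch rather than the whole pipeline.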