Hey friends,

I built Hadoop 2.5.2 on my PC and I am able to run MapReduce jobs locally after setting HADOOP_HOME. I am trying to set this up on another machine using the same tar file that I built on mine, but on that machine I am getting the error below. Can you please help?
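In case it helps with diagnosis, here is a small check I can run on the other machine to see whether a native Hadoop library (libhadoop) gets picked up at all, and from where. This is just a rough sketch using org.apache.hadoop.util.NativeCodeLoader from the Hadoop jars; the class name NativeLibCheck is only a placeholder of mine:

    import org.apache.hadoop.util.NativeCodeLoader;

    public class NativeLibCheck {
        public static void main(String[] args) {
            // True only if a libhadoop native library was found and loaded by this JVM.
            System.out.println("native hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
            // The directories the JVM searches for native libraries.
            System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        }
    }

I believe "hadoop checknative -a" reports roughly the same information. The actual stack trace from the job submission: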
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
    at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
    at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.close(ChecksumFileSystem.java:400)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
    at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
    at org.apache.hadoop.mapreduce.split.JobSplitWriter.createSplitFiles(JobSplitWriter.java:80)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:603)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)

-Nishanth