Something to try:

1. Run 'hadoop checknative' to see if anything is wrong;

2. Find the Hadoop native DLL and inspect its symbols with some tool, to see
whether it exports the required crc32-related function.

This is a rather old function; I'm not sure whether it changed between those versions.
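The two steps above might look like the following sketch. The command names are assumptions about the client box: `hadoop` on PATH, and `dumpbin` (which ships with Visual Studio; on a POSIX box the fallback messages fire instead).

```shell
# Step 1: ask the Hadoop client which native libraries it could actually load.
report=$(hadoop checknative -a 2>&1 || echo "hadoop is not on PATH on this machine")
echo "$report"

# Step 2: dump the exported symbols of hadoop.dll and look for the crc32
# JNI entry point (on Windows: dumpbin /exports ...\hadoop.dll | findstr crc32).
symbols=$(dumpbin /exports "$HADOOP_HOME/bin/hadoop.dll" 2>/dev/null | grep -i crc32 \
  || echo "dumpbin not available here; run this step on the Windows client")
echo "$symbols"
```

If step 2 shows no crc32 symbol, the DLL was built from a different Hadoop version than the jars calling into it.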

Regards,
Kai

From: karthi keyan [mailto:karthi93.san...@gmail.com]
Sent: Tuesday, March 29, 2016 3:16 PM
To: Brahma Reddy Battula <brahmareddy.batt...@huawei.com>
Cc: user@hadoop.apache.org
Subject: Re: UnsatisfiedLinkError - Windows Environment

Yes, built with the right libraries.
In my case I have to connect to a remote cluster running Hadoop
(built for 64-bit Windows, Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula 
<brahmareddy.batt...@huawei.com> wrote:
Are you using the right libraries (built for 64-bit Windows and Hadoop 2.6.2)?

From: karthi keyan [mailto:karthi93.san...@gmail.com]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org
Subject: Re: UnsatisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time I communicate 
with another Hadoop cluster I face this issue.
Is there a backward-compatibility problem, or something else?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula 
<brahmareddy.batt...@huawei.com> wrote:
Is the Hadoop cluster installed on Windows, or is only the client on Windows?

Does the Hadoop distribution contain the Windows library files, and is
<HADOOP_HOME>/bin added to PATH?
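For reference, a typical Windows client setup might look like the fragment below (cmd syntax; the C:\hadoop path is hypothetical, adjust to your install):

```
:: Hypothetical Windows (cmd) setup; point HADOOP_HOME at your install.
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
:: %HADOOP_HOME%\bin must contain hadoop.dll and winutils.exe built for the
:: same Hadoop version and architecture (64-bit here) as the client jars.
```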


From: karthi keyan [mailto:karthi93.san...@gmail.com]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org
Subject: UnsatisfiedLinkError - Windows Environment

Hi,

I am frequently facing this issue while reading data from HDFS; every time I 
have replaced (rebuilt) the jars. Can anyone suggest the right way to 
resolve this issue, or tell me the root cause of this error?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
    at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]
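For what it's worth, this failure mode is easy to reproduce in isolation. The sketch below is a hypothetical standalone class (not Hadoop code): a native method declaration compiles and the class loads fine, but the call itself throws UnsatisfiedLinkError when no library on java.library.path has registered the symbol, which is exactly what happens above when hadoop.dll is missing or lacks the crc32 entry point.

```java
// Hypothetical probe class; NativeProbe and nativeCrc32 are made-up names.
public class NativeProbe {
    // Declared native method with no backing library loaded via
    // System.loadLibrary(), mirroring a hadoop.dll that is absent or
    // missing the expected export.
    private static native int nativeCrc32(byte[] data);

    static String probe() {
        try {
            nativeCrc32(new byte[] {1, 2, 3});
            return "native call succeeded";
        } catch (UnsatisfiedLinkError e) {
            // The JVM fails at call time (not class-load time) because no
            // loaded library provides the symbol.
            return "UnsatisfiedLinkError: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
        // The directories the JVM searched for native libraries:
        System.out.println("java.library.path=" + System.getProperty("java.library.path"));
    }
}
```

So the question to answer is why the hadoop.dll visible to your client JVM does not export nativeComputeChunkedSums, e.g. a version mismatch between the DLL and the 2.6.2 jars.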

Regards,
Karthikeyan S

