To add to Jeremy's last point, even after the library is present, the files
must be larger than the HDFS block size (the default is 64 MB, I think?) or
Hadoop-snappy will also not compress them.
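
If you want to double-check what block size your client actually sees,
something like this rough sketch works (assumes core-site.xml /
hdfs-site.xml are on the classpath; the class name is just illustrative):

    // Minimal sketch: print the default HDFS block size the client
    // sees for the root path. FileSystem.getDefaultBlockSize(Path)
    // returns bytes.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class BlockSizeCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            long bs = fs.getDefaultBlockSize(new Path("/"));
            System.out.println("Default block size: " + bs + " bytes ("
                    + (bs / (1024 * 1024)) + " MB)");
        }
    }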

> On Feb 6, 2016, at 5:41 PM, Jeremy Dyer <jdy...@gmail.com> wrote:
> 
> Shweta,
> 
> Looks like you're missing the snappy native library. I have seen this several
> times before. Assuming you're on a Linux machine, you have two options: you can
> copy the libsnappy.so native library into your JAVA_HOME/jre/lib native
> directory, or you can set LD_LIBRARY_PATH to point to wherever the
> libsnappy.so native library is located on the machine.
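> 
> A quick sanity check (rough sketch, the class name is just illustrative):
> run something like this from the same environment NiFi starts in, so it
> sees the same java.library.path / LD_LIBRARY_PATH:
> 
>     // Minimal sketch: verify libsnappy.so is resolvable from this JVM.
>     public class SnappyLinkCheck {
>         public static void main(String[] args) {
>             System.out.println("java.library.path = "
>                     + System.getProperty("java.library.path"));
>             try {
>                 // On Linux this looks for libsnappy.so on java.library.path
>                 System.loadLibrary("snappy");
>                 System.out.println("libsnappy loaded OK");
>             } catch (UnsatisfiedLinkError e) {
>                 System.out.println("libsnappy NOT found: " + e.getMessage());
>             }
>         }
>     }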
> 
> I believe if you closely examine the files that are being written to HDFS
> with a .snappy extension, you will see that they are in fact not actually
> snappy compressed.
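> 
> One way to test that (rough sketch, assuming a Hadoop client with your
> cluster config on the classpath; note it also needs the native snappy
> library available, which is the whole point here): run one of the .snappy
> files back through Hadoop's SnappyCodec. If the bytes are really
> uncompressed, the read should fail:
> 
>     // Rough sketch: attempt to decompress an HDFS .snappy file with
>     // Hadoop's SnappyCodec. A clean read suggests real snappy data;
>     // an exception suggests the file was written uncompressed.
>     import java.io.InputStream;
>     import org.apache.hadoop.conf.Configuration;
>     import org.apache.hadoop.fs.FileSystem;
>     import org.apache.hadoop.fs.Path;
>     import org.apache.hadoop.io.compress.SnappyCodec;
> 
>     public class SnappyFileCheck {
>         public static void main(String[] args) throws Exception {
>             Configuration conf = new Configuration();
>             FileSystem fs = FileSystem.get(conf);
>             SnappyCodec codec = new SnappyCodec();
>             codec.setConf(conf);
>             Path p = new Path(args[0]); // e.g. a PutHDFS output file
>             try (InputStream in = codec.createInputStream(fs.open(p))) {
>                 byte[] buf = new byte[8192];
>                 long total = 0;
>                 int n;
>                 while ((n = in.read(buf)) != -1) {
>                     total += n;
>                 }
>                 System.out.println("Decompressed " + total + " bytes OK");
>             }
>         }
>     }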
> 
> Jeremy Dyer
> 
>> On Sat, Feb 6, 2016 at 1:04 PM, Joe Witt <joe.w...@gmail.com> wrote:
>> 
>> Can you show what is in your core-site.xml and the processor properties?
>> Also, can you show the full log output?
>> 
>> Thanks
>> Joe
>> 
>>> On Sat, Feb 6, 2016 at 9:11 AM, shweta <shweta.agg1...@gmail.com> wrote:
>>> Hi All,
>>> 
>>> I'm getting a java.lang.UnsatisfiedLinkError while adding data into the
>>> PutHDFS processor with the compression codec set to snappy. The error
>>> message says "Failed to write to HDFS due to
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
>>> 
>>> In spite of this error, .snappy files are still being written to my HDFS.
>>> 
>>> Has anyone faced a similar issue before, or can anyone provide pointers?
>>> 
>>> Thanks,
>>> Shweta
