Hi Vinay,

maybe this
http://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
might help you.
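
That error usually means the snappy-java native library could not be loaded on the cluster nodes. As a quick workaround (just a sketch against Avro's public DataFileWriter/CodecFactory API; the schema and field names below are made up for illustration), you can switch to the pure-Java deflate codec, which needs no native code:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.CodecFactory;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

// Sketch: avoid the snappy JNI dependency by compressing with Avro's
// built-in deflate codec instead. Schema/field names are hypothetical.
public class DeflateAvroExample {
    public static void main(String[] args) throws IOException {
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\","
            + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "event-1");

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (DataFileWriter<GenericRecord> writer =
                 new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
            // deflate is implemented in pure Java, so no native library is loaded
            writer.setCodec(CodecFactory.deflateCodec(6));
            writer.create(schema, out);
            writer.append(record);
        }
        System.out.println("wrote " + out.size() + " bytes");
    }
}
```

If you do need snappy, the usual cause is a snappy-java version on the cluster classpath that differs from the one you compiled against, so aligning those versions is worth checking too.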

Cheers,
Till

On Thu, Jun 23, 2016 at 5:15 AM, Vinay Patil <vinay18.pa...@gmail.com>
wrote:

> Hi All,
>
> Just an update on this:
>
> I tried setting the codec explicitly using the DataFileWriter setCodec
> method:
>
> writer.setCodec(CodecFactory.snappyCodec());
>
> but the error still occurs. Please help me with this issue.
>
> Regards,
>
> Vinay
> On Jun 22, 2016 10:47 PM, "Vinay Patil" <vinay18.pa...@gmail.com> wrote:
>
> > Hi ,
> >
> > I am writing data to S3 as an Avro data file, but when I deploy the job
> > on the cluster, I run into the following issue:
> >
> > *java.lang.UnsatisfiedLinkError:
> > org.xerial.snappy.SnappyNative.maxCompressedLength(I)*
> >
> > This occurs when I try to append the record (GenericRecord):
> > writer.append(record);
> >
> > The error still occurs even after removing the snappy jar dependency
> > from the pom.
> >
> > Do we have to add any other dependency ?
> >
> > Regards,
> > Vinay Patil
> >
>