Whether a given Hadoop release was built with Snappy support (or any other
optional native feature) depends entirely on the build options the release
manager used.

You will either have to build Hadoop yourself (have fun with that 😁) or
use a different version.  I ran into this issue when we added Zstandard.  I
put some notes on the issue here if you want to try building Hadoop:
https://github.com/apache/accumulo/issues/438
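
As a quick sanity check before rebuilding anything, Hadoop can report which
native libraries its own build actually includes. Something like the
following should tell you whether Snappy support is missing from the Hadoop
build or just missing from the OS (the exact output depends on your build):

```shell
# Ask Hadoop which native libraries it was compiled against.
# Each line reports true/false, plus the library path when found.
hadoop checknative -a

# If snappy shows false, check whether the system library itself
# is visible to the dynamic linker:
ldconfig -p | grep libsnappy
```

If checknative reports snappy as false even though libsnappy is installed
on the system, the Hadoop native build itself lacks Snappy support, and
rebuilding Hadoop (or switching to a different release) is the only fix.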

On Thu, May 30, 2019 at 2:03 PM Jeffrey Manno <[email protected]>
wrote:

> I did some more digging on this. It does appear that Snappy is not being
> properly built, while the rest of the native libraries are. I am not
> completely sure why this is, whether it's because of installing through
> fluo-uno or because of this version of Hadoop (3.2.0). I did get past this
> by adding the libsnappy.so.1 file manually.
>
> On Tue, May 28, 2019 at 5:36 PM Jeffrey Manno <[email protected]>
> wrote:
>
> > I have run into this issue before. I believe it has something to do with
> > the second part Christopher mentioned.
> > I know installing snappy into the OS only causes more headaches. I can
> > take a deeper look into it tomorrow.
> >
> > On Tue, May 28, 2019 at 5:31 PM Christopher <[email protected]> wrote:
> >
> >> You might need to install snappy into your OS.
> >> On Fedora: 'sudo dnf install snappy'
> >> On RHEL/CentOS: 'sudo yum install snappy'
> >>
> >> It's also possible that the version of Hadoop you're using doesn't
> >> have its native libraries built, but it's hard to know more without
> >> seeing more of the error message (like an associated stack trace or
> >> logger class name). If it is Hadoop not having its native libraries
> >> built, I'm not sure how to rebuild Hadoop from source.
> >>
> >> On Tue, May 28, 2019 at 3:48 PM Jeffrey Zeiberg <[email protected]>
> >> wrote:
> >> >
> >> > I am seeing this in my error logs on my Uno instance when I run the
> >> > "./bin/cingest ingest" in accumulo-testing.  Anyone else seeing this?
> >> >
> >>
> >
>
