Thanks for the response. They are not: the Hadoop 3.2.2 cluster is running the
3.2.2 libs, while HBase is shipping the Hadoop 2.10 libs. Do these need to be
in sync?
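For reference, this is roughly how I compared the two (a sketch assuming a standard layout, with `hadoop` on the PATH and $HBASE_HOME pointing at the HBase install):

```shell
# Version the running Hadoop cluster reports
hadoop version | head -n 1

# Hadoop jars that HBase ships on its classpath
ls "$HBASE_HOME"/lib/hadoop-*.jar
```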

I did try syncing them up: I copied the 3.2.2 libs into HBase, replacing all
of the hadoop-* jars, and got this error:

Unhandled: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus,
but class was expected

While searching for an answer to this, I read that with Hadoop 3.1 or higher
you need to compile HBase with a special flag:
https://issues.apache.org/jira/browse/HBASE-22394
https://issues.apache.org/jira/browse/HBASE-24154

Do I have to compile HBase myself in order to get it working with Hadoop 3.1
or higher?
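If a rebuild does turn out to be necessary, my understanding from HBASE-22394 is that it means activating the Hadoop 3 Maven profile when building from source, roughly along these lines (the exact property names should be verified against the pom.xml of the HBase source tree you check out; this is a sketch, not a tested recipe):

```shell
# Build HBase against Hadoop 3 by activating the hadoop-3.0 profile
# mentioned in HBASE-22394. The hadoop-three.version property pins the
# target Hadoop release; confirm both names against your pom.xml.
mvn clean install -DskipTests -Dhadoop.profile=3.0 -Dhadoop-three.version=3.2.2
```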

On Mon, Oct 18, 2021 at 7:13 AM Josh Elser <els...@apache.org> wrote:

> Are the Hadoop JARs which you're using inside HBase the same as the
> Hadoop version you're running? (e.g. in $HBASE_HOME/lib)
>
> On 10/15/21 6:18 PM, Damillious Jones wrote:
> > Hi all, I am seeing a similar issue which is noted in HBASE-26007 where
> > HBase will not start if dfs.encrypt.data.transfer in HDFS is set to true.
> > When I start HBase I see the following error message on the master node:
> >
> > java.io.IOException: Invalid token in javax.security.sasl.qop:
> >          at
> >
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.readSaslMessage(DataTransferSaslUtil.java:220)
> >
> > I am using Hadoop 3.2.2 and HBase 2.4.5 with Java 1.8. If I use Hadoop
> > 3.1.1 it works fine. Has anyone else encountered this issue?
> >
> > Any help would be appreciated, thanks.
> >
>
