Hello,

I have been trying to install HBase for about four days, but without success.
I am getting a strange error:
Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but
interface was expected

What does it mean, and how can I fix it?
I am using Hadoop 3.0.0-alpha4, HBase 2.0.0-alpha2, and ZooKeeper 3.4.10.

Is it possible to get them to work together?

I am trying to execute "bin/hbase shell" to open the HBase shell, but it still
doesn't work. What am I doing wrong?
It seems that HBase is able to create its files in HDFS, so why is the shell
not working?
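
As far as I understand, this error can appear when two different jline versions end up on the classpath: an old jline where jline.Terminal is a class (the one bundled with ZooKeeper 3.4.x) and a newer jline where it is an interface (the one the HBase shell expects). To see which jline jars are actually picked up, I can run something like this (the install paths below are only my guess at the layout):

# Show the classpath computed by the HBase launcher and look for jline jars
bin/hbase classpath | tr ':' '\n' | grep -i jline

# Look for jline jars shipped with each component (paths are assumptions)
find /usr/local/hadoop /usr/local/hbase /usr/local/zookeeper -name 'jline*.jar' 2>/dev/null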

Here is hbase-site.xml:

<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://127.0.0.1:8020/hbase</value>
    </property>
    <property>
        <name>hbase.zookeeper.quorum</name>
        <value>127.0.0.1</value>
    </property>
</configuration>
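
To double-check that HBase is really writing under hbase.rootdir, I can verify it from the HDFS side with something like this (assuming the hdfs command from my Hadoop install is on the PATH):

# List what HBase has created under hbase.rootdir
hdfs dfs -ls hdfs://127.0.0.1:8020/hbase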

Here is hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:///srv/hadoop/hdfs/nn</value>
        <final>true</final>
    </property>

    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:///srv/hadoop/hdfs/dn</value>
        <final>true</final>
    </property>
    <property>
        <name>dfs.namenode.http-address</name>
        <value>127.0.0.1:50070</value>
        <final>true</final>
    </property>

    <property>
        <name>dfs.secondary.namenode.http-address</name>
        <value>127.0.0.1:50090</value>
        <final>true</final>
    </property>

    <property>
        <name>dfs.hosts</name>
        <value>/etc/hadoop/conf/dfs.hosts</value>
    </property>

    <property>
        <name>dfs.hosts.exclude</name>
        <value>/etc/hadoop/conf/dfs.hosts.exclude</value>
    </property>

    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>


Here is core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://127.0.0.1:8020</value>
    </property>
    <property>
        <name>io.native.lib.available</name>
        <value>true</value>
    </property>
    <property>
        <name>fs.trash.interval</name>
        <value>60</value>
    </property>
    <property>
        <name>io.file.buffer.size</name>
        <value>65536</value>
    </property>
</configuration>

Here is the output when I try to execute "bin/hbase shell":
http://paste.openstack.org/show/619575/

In /etc/environment I have the following setting:
export HADOOP_USER_CLASSPATH_FIRST=true
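
I am not sure the export keyword is even honored in /etc/environment, since that file is not a shell script, so as an alternative I might set the variable in conf/hbase-env.sh instead (whether the HBase launcher then forwards it to the underlying Hadoop scripts is my assumption):

# conf/hbase-env.sh (assumption: the hbase script sources this file and the variable takes effect there)
export HADOOP_USER_CLASSPATH_FIRST=true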

I don't know what else can be done to fix this issue.
Please help with any ideas.

Best regards
