I updated the Dockerfile to remove the Hadoop jars bundled with HBase and
replace them with the Hadoop 2.8.5 jars from Maven. Maven does not host
the test jars that were present in the HBase distribution, but I did not
notice any adverse effects from leaving them out.
The image now works correctly on all my machines; however, I am still
baffled as to why the image using the Hadoop 2.7.4 jars worked correctly
on some machines but failed on others.
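For reference, the jar swap can be sketched like this. This is a hypothetical sketch: the artifact list below is illustrative and should be matched against the hadoop-*.jar files actually shipped in hbase/lib; it only prints the Maven Central URLs.

```shell
# Print Maven Central URLs for the Hadoop 2.8.5 jars that replace the
# bundled 2.7.4 ones. Artifact list is illustrative, not exhaustive.
HADOOP_VERSION=2.8.5
for artifact in hadoop-annotations hadoop-auth hadoop-client hadoop-common \
                hadoop-distcp hadoop-hdfs hadoop-mapreduce-client-core; do
  echo "https://repo1.maven.org/maven2/org/apache/hadoop/${artifact}/${HADOOP_VERSION}/${artifact}-${HADOOP_VERSION}.jar"
done
```

Piping the output into `xargs -n1 curl -fLO` from hbase/lib, after deleting the bundled 2.7.4 jars, reproduces the update described above; as noted, the -tests jars are not published to Maven.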
On 28/09/2018 8:24 AM, Francis Chuang wrote:
I tried updating my hbase-phoenix-all-in-one image to use HBase built
with Hadoop 3. Unfortunately, it didn't work. I think this might be
because Hadoop 3.0.0 is too new for tephra (which uses 2.2.0):
starting Query Server, logging to /tmp/phoenix/phoenix-root-queryserver.log
Thu Sep 27 02:50:56 UTC 2018 Starting tephra service on m401b01-phoenix.m401b01
Running class org.apache.tephra.TransactionServiceMain
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.stax.WstxInputFactory.createSR(Lcom/ctc/wstx/api/ReaderConfig;Lcom/ctc/wstx/io/SystemId;Lcom/ctc/wstx/io/InputBootstrapper;ZZ)Lorg/codehaus/stax2/XMLStreamReader2;
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2787)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2838)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2812)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1160)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1214)
    at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1620)
    at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:66)
    at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:80)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:94)
    at org.apache.hadoop.hbase.util.HBaseConfTool.main(HBaseConfTool.java:39)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.stax.WstxInputFactory.createSR(Lcom/ctc/wstx/api/ReaderConfig;Lcom/ctc/wstx/io/SystemId;Lcom/ctc/wstx/io/InputBootstrapper;ZZ)Lorg/codehaus/stax2/XMLStreamReader2;
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2787)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2838)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2812)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1160)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1214)
    at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1620)
    at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:66)
    at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:80)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:94)
    at org.apache.hadoop.hbase.zookeeper.ZKServerTool.main(ZKServerTool.java:63)
running master, logging to /opt/hbase/logs/hbase--master-m401b01-phoenix.m401b01.out
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.stax.WstxInputFactory.createSR(Lcom/ctc/wstx/api/ReaderConfig;Lcom/ctc/wstx/io/SystemId;Lcom/ctc/wstx/io/InputBootstrapper;ZZ)Lorg/codehaus/stax2/XMLStreamReader2;
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2787)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2838)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2812)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
: running regionserver, logging to /opt/hbase/logs/hbase--regionserver-m401b01-phoenix.m401b01.out
: SLF4J: Class path contains multiple SLF4J bindings.
: SLF4J: Found binding in [jar:file:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
: SLF4J: Found binding in [jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
: Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.stax.WstxInputFactory.createSR(Lcom/ctc/wstx/api/ReaderConfig;Lcom/ctc/wstx/io/SystemId;Lcom/ctc/wstx/io/InputBootstrapper;ZZ)Lorg/codehaus/stax2/XMLStreamReader2;
:     at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
:     at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2787)
:     at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2838)
:     at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2812)
:     at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
I built HBase against Hadoop 2.8.5 and that resolved the
hasKerberosKeyTab NoSuchMethodError. The only problem is that the
automated build on Docker Cloud took 4 hours to compile HBase and
eventually failed. I think I am going to download the Hadoop jars
from Maven, rather than build HBase myself.
On 27/09/2018 12:56 AM, Josh Elser wrote:
If you're using HBase with Hadoop3, HBase should have Hadoop3 jars.
Re-build HBase using the -Dhadoop.profile=3.0 (I think it is) CLI
option.
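Assuming that flag is right, the rebuild would look roughly like this. A sketch only, to be run from an HBase source checkout; the profile id should be verified against HBase's pom.xml:

```shell
# Build HBase against the Hadoop 3 profile, then produce the binary tarball.
# The profile id (hadoop.profile=3.0) is assumed from the suggestion above.
mvn clean install -DskipTests -Dhadoop.profile=3.0
mvn package assembly:single -DskipTests -Dhadoop.profile=3.0
```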
On 9/26/18 7:21 AM, Francis Chuang wrote:
Upon further investigation, it appears that this is because
org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab
is only available in Hadoop 2.8+. HBase ships with Hadoop 2.7.4 jars.
I noticed that Hadoop was bumped from 2.7.4 to 3.0.0 a few months
ago to fix PQS/Avatica issues:
https://github.com/apache/phoenix/blame/master/pom.xml#L70
I think this causes Phoenix to expect some things that are available
in Hadoop 3.0.0, but are not present in HBase's Hadoop 2.7.4 jars.
I think I can try replacing the hadoop-*.jar files in hbase/lib
with the equivalent 2.8.5 versions; however, I am not familiar with
Java and the Hadoop project, so I am not sure whether this will
introduce other issues.
On 26/09/2018 4:44 PM, Francis Chuang wrote:
I wonder if this is because:
- HBase's binary distribution ships with Hadoop 2.7.4 jars.
- Phoenix 5.0.0 has Hadoop 3.0.0 declared in its pom.xml:
https://github.com/apache/phoenix/blob/8a819c6c3b4befce190c6ac759f744df511de61d/pom.xml#L70
- Tephra has Hadoop 2.2.0 declared in its pom.xml:
https://github.com/apache/incubator-tephra/blob/master/pom.xml#L211
On 26/09/2018 4:03 PM, Francis Chuang wrote:
Hi all,
I am using Phoenix 5.0.0 with HBase 2.0.0. I am seeing errors
while trying to create transactional tables using Phoenix.
I am using my Phoenix + HBase all in one docker image available
here: https://github.com/Boostport/hbase-phoenix-all-in-one
This is the error:
org.apache.phoenix.shaded.org.apache.thrift.TException: Unable to discover transaction service. -> TException: Unable to discover transaction service.
I checked the tephra logs and got the following:
Exception in thread "HDFSTransactionStateStorage STARTING"
Exception in thread "ThriftRPCServer" com.google.common.util.concurrent.ExecutionError: java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
    at com.google.common.util.concurrent.Futures.wrapAndThrowUnchecked(Futures.java:1008)
    at com.google.common.util.concurrent.Futures.getUnchecked(Futures.java:1001)
    at com.google.common.util.concurrent.AbstractService.startAndWait(AbstractService.java:220)
    at com.google.common.util.concurrent.AbstractIdleService.startAndWait(AbstractIdleService.java:106)
    at org.apache.tephra.TransactionManager.doStart(TransactionManager.java:245)
    at com.google.common.util.concurrent.AbstractService.start(AbstractService.java:170)
    at com.google.common.util.concurrent.AbstractService.startAndWait(AbstractService.java:220)
    at org.apache.tephra.distributed.TransactionServiceThriftHandler.init(TransactionServiceThriftHandler.java:249)
    at org.apache.tephra.rpc.ThriftRPCServer.startUp(ThriftRPCServer.java:177)
    at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
    at org.apache.hadoop.security.UserGroupInformation.<init>(UserGroupInformation.java:715)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:925)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:873)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:740)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3472)
    at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:3310)
    at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:529)
    at org.apache.tephra.persist.HDFSTransactionStateStorage.startUp(HDFSTransactionStateStorage.java:104)
    at com.google.common.util.concurrent.AbstractIdleService$1$1.run(AbstractIdleService.java:43)
    ... 1 more
2018-09-26 04:31:11,290 INFO [leader-election-tx.service-leader] distributed.TransactionService (TransactionService.java:leader(115)) - Transaction Thrift Service didn't start on /0.0.0.0:15165
java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
    at org.apache.hadoop.security.UserGroupInformation.<init>(UserGroupInformation.java:715)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:925)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:873)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:740)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3472)
    at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:3310)
    at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:529)
    at org.apache.tephra.persist.HDFSTransactionStateStorage.startUp(HDFSTransactionStateStorage.java:104)
    at com.google.common.util.concurrent.AbstractIdleService$1$1.run(AbstractIdleService.java:43)
    at java.lang.Thread.run(Thread.java:748)
I know that HBase ships with the Hadoop 2.7.4 jars, and I was not
able to find "hasKerberosKeyTab" when grepping through the Hadoop
2.7.4 source code. However, when I checked the Hadoop 2.7.4 source
files against the stack trace above, the line numbers did not match
up either.
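One way to check the shipped jars directly, rather than the source tree, is to probe them with javap. This is a sketch that assumes the /opt/hbase/lib layout from this image and that `unzip` and a JDK are on the PATH:

```shell
# Find which bundled Hadoop jar defines KerberosUtil, then check whether it
# exposes hasKerberosKeyTab (added in Hadoop 2.8, absent from 2.7.4).
for jar in /opt/hbase/lib/hadoop-*.jar; do
  if unzip -l "$jar" 2>/dev/null | grep -q 'KerberosUtil.class'; then
    echo "KerberosUtil found in: $jar"
    javap -cp "$jar" org.apache.hadoop.security.authentication.util.KerberosUtil \
      | grep hasKerberosKeyTab || echo "  hasKerberosKeyTab: not present"
  fi
done
```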
Interestingly, I only see this issue on my older machine (a Core i7
920 with 12GB of RAM) and in GitLab's CI environment (a Google Cloud
n1-standard-1 instance with 1 vCPU and 3.75GB of RAM). I know
Michael also encountered this problem while running the Phoenix
tests for calcite-avatica-go on an older i5 machine from 2011.
It does seem pretty weird that we are only seeing this on machines
with less powerful CPUs.
I also printed the classpath for tephra by doing:
$ export HBASE_CONF_DIR=/opt/hbase/conf
$ export HBASE_CP=/opt/hbase/lib
$ export HBASE_HOME=/opt/hbase
$ /opt/hbase/bin/tephra classpath
/opt/hbase/bin/../lib/*:/opt/hbase/bin/../conf/:/opt/hbase/phoenix-client/target/*:/opt/hbase/conf:/usr/lib/jvm/java-1.8-openjdk/jre/lib/tools.jar:/opt/hbase:/opt/hbase/lib/aopalliance-1.0.jar:/opt/hbase/lib/aopalliance-repackaged-2.5.0-b32.jar:/opt/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hbase/lib/api-asn1-api-1.0.0-M20.jar:/opt/hbase/lib/api-util-1.0.0-M20.jar:/opt/hbase/lib/asm-3.1.jar:/opt/hbase/lib/audience-annotations-0.5.0.jar:/opt/hbase/lib/avro-1.7.7.jar:/opt/hbase/lib/commons-beanutils-core-1.8.0.jar:/opt/hbase/lib/commons-cli-1.2.jar:/opt/hbase/lib/commons-codec-1.10.jar:/opt/hbase/lib/commons-collections-3.2.2.jar:/opt/hbase/lib/commons-compress-1.4.1.jar:/opt/hbase/lib/commons-configuration-1.6.jar:/opt/hbase/lib/commons-crypto-1.0.0.jar:/opt/hbase/lib/commons-daemon-1.0.13.jar:/opt/hbase/lib/commons-digester-1.8.jar:/opt/hbase/lib/commons-httpclient-3.1.jar:/opt/hbase/lib/commons-io-2.5.jar:/opt/hbase/lib/commons-lang-2.6.jar:/opt/hbase/lib/commons-lang3-3.6.jar:/opt/hbase/lib/commons-logging-1.2.jar:/opt/hbase/lib/commons-math3-3.6.1.jar:/opt/hbase/lib/commons-net-3.1.jar:/opt/hbase/lib/curator-client-4.0.0.jar:/opt/hbase/lib/curator-framework-4.0.0.jar:/opt/hbase/lib/curator-recipes-4.0.0.jar:/opt/hbase/lib/disruptor-3.3.6.jar:/opt/hbase/lib/findbugs-annotations-1.3.9-1.jar:/opt/hbase/lib/gson-2.2.4.jar:/opt/hbase/lib/guice-3.0.jar:/opt/hbase/lib/guice-servlet-3.0.jar:/opt/hbase/lib/hadoop-annotations-2.7.4.jar:/opt/hbase/lib/hadoop-auth-2.7.4.jar:/opt/hbase/lib/hadoop-client-2.7.4.jar:/opt/hbase/lib/hadoop-common-2.7.4-tests.jar:/opt/hbase/lib/hadoop-common-2.7.4.jar:/opt/hbase/lib/hadoop-distcp-2.7.4.jar:/opt/hbase/lib/hadoop-hdfs-2.7.4-tests.jar:/opt/hbase/lib/hadoop-hdfs-2.7.4.jar:/opt/hbase/lib/hadoop-mapreduce-client-app-2.7.4.jar:/opt/hbase/lib/hadoop-mapreduce-client-common-2.7.4.jar:/opt/hbase/lib/hadoop-mapreduce-client-core-2.7.4.jar:/opt/hbase/lib/hadoop-mapreduce-client-hs-2.7.4.jar
:/opt/hbase/lib/hadoop-mapreduce-client-jobclient-2.7.4.jar:/opt/hbase/lib/hadoop-mapreduce-client-shuffle-2.7.4.jar:/opt/hbase/lib/hadoop-minicluster-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-api-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-client-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-common-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-server-applicationhistoryservice-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-server-common-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-server-nodemanager-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-server-resourcemanager-2.7.4.jar:/opt/hbase/lib/hadoop-yarn-server-tests-2.7.4-tests.jar:/opt/hbase/lib/hadoop-yarn-server-web-proxy-2.7.4.jar:/opt/hbase/lib/hbase-annotations-2.0.0-tests.jar:/opt/hbase/lib/hbase-annotations-2.0.0.jar:/opt/hbase/lib/hbase-client-2.0.0.jar:/opt/hbase/lib/hbase-common-2.0.0-tests.jar:/opt/hbase/lib/hbase-common-2.0.0.jar:/opt/hbase/lib/hbase-endpoint-2.0.0.jar:/opt/hbase/lib/hbase-examples-2.0.0.jar:/opt/hbase/lib/hbase-external-blockcache-2.0.0.jar:/opt/hbase/lib/hbase-hadoop-compat-2.0.0-tests.jar:/opt/hbase/lib/hbase-hadoop-compat-2.0.0.jar:/opt/hbase/lib/hbase-hadoop2-compat-2.0.0-tests.jar:/opt/hbase/lib/hbase-hadoop2-compat-2.0.0.jar:/opt/hbase/lib/hbase-http-2.0.0.jar:/opt/hbase/lib/hbase-it-2.0.0-tests.jar:/opt/hbase/lib/hbase-it-2.0.0.jar:/opt/hbase/lib/hbase-mapreduce-2.0.0-tests.jar:/opt/hbase/lib/hbase-mapreduce-2.0.0.jar:/opt/hbase/lib/hbase-metrics-2.0.0.jar:/opt/hbase/lib/hbase-metrics-api-2.0.0.jar:/opt/hbase/lib/hbase-procedure-2.0.0.jar:/opt/hbase/lib/hbase-protocol-2.0.0.jar:/opt/hbase/lib/hbase-protocol-shaded-2.0.0.jar:/opt/hbase/lib/hbase-replication-2.0.0.jar:/opt/hbase/lib/hbase-resource-bundle-2.0.0.jar:/opt/hbase/lib/hbase-rest-2.0.0.jar:/opt/hbase/lib/hbase-rsgroup-2.0.0-tests.jar:/opt/hbase/lib/hbase-rsgroup-2.0.0.jar:/opt/hbase/lib/hbase-server-2.0.0-tests.jar:/opt/hbase/lib/hbase-server-2.0.0.jar:/opt/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/opt/hbase/lib/hbase-shaded-netty-2.1.0.jar:/opt/hbase/lib/hbase-shaded-pr
otobuf-2.1.0.jar:/opt/hbase/lib/hbase-shell-2.0.0.jar:/opt/hbase/lib/hbase-testing-util-2.0.0.jar:/opt/hbase/lib/hbase-thrift-2.0.0.jar:/opt/hbase/lib/hbase-zookeeper-2.0.0-tests.jar:/opt/hbase/lib/hbase-zookeeper-2.0.0.jar:/opt/hbase/lib/hk2-api-2.5.0-b32.jar:/opt/hbase/lib/hk2-locator-2.5.0-b32.jar:/opt/hbase/lib/hk2-utils-2.5.0-b32.jar:/opt/hbase/lib/htrace-core-3.2.0-incubating.jar:/opt/hbase/lib/htrace-core4-4.2.0-incubating.jar:/opt/hbase/lib/httpclient-4.5.3.jar:/opt/hbase/lib/httpcore-4.4.6.jar:/opt/hbase/lib/jackson-annotations-2.9.0.jar:/opt/hbase/lib/jackson-core-2.9.2.jar:/opt/hbase/lib/jackson-core-asl-1.9.13.jar:/opt/hbase/lib/jackson-databind-2.9.2.jar:/opt/hbase/lib/jackson-jaxrs-1.8.3.jar:/opt/hbase/lib/jackson-jaxrs-base-2.9.2.jar:/opt/hbase/lib/jackson-jaxrs-json-provider-2.9.2.jar:/opt/hbase/lib/jackson-mapper-asl-1.9.13.jar:/opt/hbase/lib/jackson-module-jaxb-annotations-2.9.2.jar:/opt/hbase/lib/jackson-xc-1.8.3.jar:/opt/hbase/lib/jamon-runtime-2.4.1.jar:/opt/hbase/lib/java-xmlbuilder-0.4.jar:/opt/hbase/lib/javassist-3.20.0-GA.jar:/opt/hbase/lib/javax.annotation-api-1.2.jar:/opt/hbase/lib/javax.el-3.0.1-b08.jar:/opt/hbase/lib/javax.inject-2.5.0-b32.jar:/opt/hbase/lib/javax.servlet-api-3.1.0.jar:/opt/hbase/lib/javax.servlet.jsp-2.3.2.jar:/opt/hbase/lib/javax.servlet.jsp-api-2.3.1.jar:/opt/hbase/lib/javax.servlet.jsp.jstl-1.2.0.v201105211821.jar:/opt/hbase/lib/javax.servlet.jsp.jstl-1.2.2.jar:/opt/hbase/lib/javax.ws.rs-api-2.0.1.jar:/opt/hbase/lib/jaxb-api-2.2.12.jar:/opt/hbase/lib/jaxb-impl-2.2.3-1.jar:/opt/hbase/lib/jcodings-1.0.18.jar:/opt/hbase/lib/jersey-client-2.25.1.jar:/opt/hbase/lib/jersey-common-2.25.1.jar:/opt/hbase/lib/jersey-container-servlet-core-2.25.1.jar:/opt/hbase/lib/jersey-guava-2.25.1.jar:/opt/hbase/lib/jersey-media-jaxb-2.25.1.jar:/opt/hbase/lib/jersey-server-2.25.1.jar:/opt/hbase/lib/jets3t-0.9.0.jar:/opt/hbase/lib/jettison-1.3.8.jar:/opt/hbase/lib/jetty-6.1.26.jar:/opt/hbase/lib/jetty-http-9.3.19.v20170502.jar:/opt/hbase/lib
/jetty-io-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-jmx-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-jsp-9.2.19.v20160908.jar:/opt/hbase/lib/jetty-schemas-3.1.M0.jar:/opt/hbase/lib/jetty-security-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-server-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-servlet-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-sslengine-6.1.26.jar:/opt/hbase/lib/jetty-util-6.1.26.jar:/opt/hbase/lib/jetty-util-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-util-ajax-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-webapp-9.3.19.v20170502.jar:/opt/hbase/lib/jetty-xml-9.3.19.v20170502.jar:/opt/hbase/lib/joni-2.1.11.jar:/opt/hbase/lib/jsch-0.1.54.jar:/opt/hbase/lib/junit-4.12.jar:/opt/hbase/lib/leveldbjni-all-1.8.jar:/opt/hbase/lib/libthrift-0.9.3.jar:/opt/hbase/lib/log4j-1.2.17.jar:/opt/hbase/lib/metrics-core-3.2.1.jar:/opt/hbase/lib/netty-all-4.0.23.Final.jar:/opt/hbase/lib/org.eclipse.jdt.core-3.8.2.v20130121.jar:/opt/hbase/lib/osgi-resource-locator-1.0.1.jar:/opt/hbase/lib/paranamer-2.3.jar:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-server.jar:/opt/hbase/lib/protobuf-java-2.5.0.jar:/opt/hbase/lib/remotecontent?filepath=com%2Fgoogle%2Fguava%2Fguava%2F13.0.1%2Fguava-13.0.1.jar:/opt/hbase/lib/slf4j-api-1.7.25.jar:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar:/opt/hbase/lib/snappy-java-1.0.5.jar:/opt/hbase/lib/spymemcached-2.12.2.jar:/opt/hbase/lib/validation-api-1.1.0.Final.jar:/opt/hbase/lib/xmlenc-0.52.jar:/opt/hbase/lib/xz-1.0.jar:/opt/hbase/lib/zookeeper-3.4.10.jar::
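The dump above is easier to audit once split on ':'. This sketch hard-codes a three-entry excerpt of that classpath so the filtering is visible; in the image itself you would pipe the output of `/opt/hbase/bin/tephra classpath` in instead:

```shell
# List the unique versioned Hadoop jars on a colon-separated classpath.
# CP is a short excerpt of the classpath printed above.
CP="/opt/hbase/lib/hadoop-auth-2.7.4.jar:/opt/hbase/lib/hadoop-common-2.7.4.jar:/opt/hbase/lib/hbase-common-2.0.0.jar"
echo "$CP" | tr ':' '\n' | sed -n 's#.*/\(hadoop-[a-z-]*-[0-9.]*\.jar\)$#\1#p' | sort -u
# → hadoop-auth-2.7.4.jar
# → hadoop-common-2.7.4.jar
```

Running the same pipeline over the full dump would show that every versioned Hadoop jar visible to tephra is 2.7.4.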
Does anyone know what might be causing this, or have any tips for
troubleshooting?
Francis