After installing Phoenix 5.0 on HBase 2.2.4, I am unable to connect using sqlline.py, SQuirreL, or a plain JDBC client.
Every attempt fails with:

```
Caused by: java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
```

What I have verified so far:

- All HBase services are running (HMaster, RegionServer, etc.)
- ZooKeeper is running on localhost:2181
- The Phoenix server jar is in hbase/lib
- The HBase startup log shows the Phoenix coprocessors being loaded

Not sure what is missing. How do I debug this further, and is there an HBase config I can override? Is this combination supported at all, and if so, how do I fix it? The full stack trace and a command-line session are attached below.

Java code tried:

```java
package conn;

import java.sql.Connection;
import java.sql.DriverManager;

public class PhoenixClient {
    public static void main(String[] args) throws Exception {
        // Register the Phoenix driver (optional on JDBC 4+, kept for clarity)
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        // URL segments: jdbc:phoenix:<zk quorum>:<port>:<parent znode>
        try (Connection conn = DriverManager.getConnection(
                "jdbc:phoenix:localhost:2181:/hbase-unsecure")) {
            System.out.println("Test Connection Successful: " + conn);
        }
    }
}
```
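For what it's worth, this throwaway helper (my own sketch, not a Phoenix API) documents how I understand the thick-driver URL: the last segment has to match `zookeeper.znode.parent` on the server side, and a stock HBase install defaults that to `/hbase`, while `/hbase-unsecure` is the Ambari/HDP convention. Could that mismatch be my problem?

```java
// Throwaway helper (my own, NOT part of Phoenix) to spell out the URL shape:
//   jdbc:phoenix:<zookeeper quorum>:<port>:<parent znode>
public class PhoenixUrl {
    static String build(String quorum, int port, String znodeParent) {
        // znodeParent must match zookeeper.znode.parent in hbase-site.xml;
        // a stock HBase defaults to /hbase, not /hbase-unsecure.
        return "jdbc:phoenix:" + quorum + ":" + port + ":" + znodeParent;
    }

    public static void main(String[] args) {
        System.out.println(build("localhost", 2181, "/hbase"));
        // prints jdbc:phoenix:localhost:2181:/hbase
    }
}
```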
```
jak@jakhost:/usr/local/Hbase/lib$ jps
32306 HMaster
9479 SqlLine
11018 SqlLine
11515 DataNode
9243 SqlLine
11131 Jps
11356 NameNode
12076 ResourceManager
13917 Main
11758 SecondaryNameNode
8942 Launcher
12239 NodeManager
jak@jakhost:/usr/local/Hbase/lib$ ls -l phoenix*
-rw-r--r-- 1 jak jak 41800313 May 22 00:52 phoenix-5.0.0-HBase-2.0-server.jar
jak@jakhost:/usr/local/Hbase/lib$ netstat -lpten | grep 2181
(Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.)
tcp6       0      0 :::2181                 :::*                    LISTEN      121        26081      -
jak@jakhost:/usr/local/Hbase/bin$ hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
```
```
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.2.4, r67779d1a325a4f78a468af3339e73bf075888bac, Wed Mar 11 12:57:39 CST 2020
Took 0.0037 seconds
hbase(main):001:0> exit
jak@jakhost:/usr/local/Hbase/bin$ cd ../logs/
jak@jakhost:/usr/local/Hbase/logs$ cat hbase-jak-master-01276-JARUN.log | grep phoenix
2020-06-17 00:12:41,761 INFO [RpcServer.priority.RWQ.Fifo.read.handler=2,queue=1,port=16000] master.HMaster: cessor$3 => '|org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver|805306366|', coprocessor$4 => '|org.apache.phoenix.coprocessor.ServerCachingEndpointImpl|805306366|', coprocessor$5 => '|org.apache.hadoop.hbase.coprocessor.MultiRowMutationEndpoint|805306366|', coprocessor$6 => '|org.apache.phoenix.coprocessor.MetaDataEndpointImpl|805306366|', coprocessor$7 => '|org.apache.phoenix.coprocessor.MetaDataRegionObserver|805306367|', METADATA => {'SPLIT_POLICY' => 'org.apache.phoenix.schema.MetaDataSplitPolicy'}}}, {NAME => '0', VERSIONS => '1', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'false', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'NONE', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
2020-06-17 00:12:42,294 INFO [RegionOpenAndInitThread-SYSTEM.CATALOG-pool9-t1] regionserver.HRegion: .GroupedAggregateRegionObserver|805306366|', coprocessor$4 => '|org.apache.phoenix.coprocessor.ServerCachingEndpointImpl|805306366|', coprocessor$5 => '|org.apache.hadoop.hbase.coprocessor.MultiRowMutationEndpoint|805306366|', coprocessor$6 => '|org.apache.phoenix.coprocessor.MetaDataEndpointImpl|805306366|', coprocessor$7 => '|org.apache.phoenix.coprocessor.MetaDataRegionObserver|805306367|', METADATA => {'SPLIT_POLICY' => 'org.apache.phoenix.schema.MetaDataSplitPolicy'}}}, {NAME => '0', VERSIONS => '1', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'false', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'NONE', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, regionDir=hdfs://localhost:9000/hbase/.tmp
jak@jakhost:/usr/local/Hbase/logs$ cd ../../phoenix/bin/
jak@jakhost:/usr/local/phoenix/bin$ ./sqlline.py localhost
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/phoenix/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
20/06/17 15:56:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
```
Full stack trace from the Java client:

```
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jak/ea/sandbox/users/jak/hbase/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jak/ea/sandbox/users/jak/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.677Z, RpcRetryingCaller{globalStartTime=1592389786803, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.065Z, RpcRetryingCaller{globalStartTime=1592389787007, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.199Z, RpcRetryingCaller{globalStartTime=1592389787007, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.885Z, RpcRetryingCaller{globalStartTime=1592389786803, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.782Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.884Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:138)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1204)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
	at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2721)
	at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
	at org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
	at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
	at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
	at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
	at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
	at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:270)
	at conn.PhoenixClient.main(PhoenixClient.java:12)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.677Z, RpcRetryingCaller{globalStartTime=1592389786803, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.065Z, RpcRetryingCaller{globalStartTime=1592389787007, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.199Z, RpcRetryingCaller{globalStartTime=1592389787007, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.885Z, RpcRetryingCaller{globalStartTime=1592389786803, pause=100, maxAttempts=2}, org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.782Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.884Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:145)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3133)
	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3125)
	at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:466)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1105)
	... 20 more
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=2, exceptions:
2020-06-17T10:29:47.782Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
2020-06-17T10:29:47.884Z, RpcRetryingCaller{globalStartTime=1592389787778, pause=100, maxAttempts=2}, java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:145)
	at org.apache.hadoop.hbase.client.HTable.get(HTable.java:384)
	at org.apache.hadoop.hbase.client.HTable.get(HTable.java:358)
	at org.apache.hadoop.hbase.MetaTableAccessor.getTableState(MetaTableAccessor.java:1118)
	at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:440)
	at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:469)
	at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:466)
	at org.apache.hadoop.hbase.client.RpcRetryingCallable.call(RpcRetryingCallable.java:58)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
	... 24 more
Caused by: java.io.IOException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
	at org.apache.hadoop.hbase.client.ConnectionImplementation.get(ConnectionImplementation.java:2081)
	at org.apache.hadoop.hbase.client.ConnectionImplementation.locateMeta(ConnectionImplementation.java:814)
	at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegion(ConnectionImplementation.java:781)
	at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:64)
	at org.apache.hadoop.hbase.client.RegionLocator.getRegionLocation(RegionLocator.java:58)
	at org.apache.hadoop.hbase.client.RegionLocator.getRegionLocation(RegionLocator.java:47)
	at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:223)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:105)
	... 32 more
Caused by: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /hbase-unsecure
	at org.apache.zookeeper.KeeperException.create(KeeperException.java:111)
	at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
	at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$ZKTask$1.exec(ReadOnlyZKClient.java:177)
	at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient.run(ReadOnlyZKClient.java:342)
	at java.lang.Thread.run(Thread.java:748)
```
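One thing I am considering: the parent znode HBase registers under is controlled by `zookeeper.znode.parent` in `hbase-site.xml` (stock default `/hbase`; `/hbase-unsecure` is the Ambari/HDP convention). Would pinning it explicitly like this on the HBase side, and restarting HBase, be the right fix? This is only a sketch of what I would add:

```xml
<!-- hbase-site.xml (sketch): pin the parent znode HBase uses in ZooKeeper.
     Stock default is /hbase; /hbase-unsecure is the Ambari/HDP convention. -->
<property>
  <name>zookeeper.znode.parent</name>
  <value>/hbase-unsecure</value>
</property>
```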