Hi David:

The image doesn't seem to have loaded successfully, so I can't see its contents.

Is the Kylin you downloaded a Docker image or a binary package?

If you installed the Kylin binary package in your own Hadoop environment, this 
error usually indicates a problem in your HBase environment. You can run 
commands in the HBase shell to check the HBase status.
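For example, a rough check, assuming a default single-node setup where HBase 
runs on the same host as Kylin (adjust hostnames and paths for your 
environment):

    # First, confirm the HBase daemons are actually running
    jps | grep -E 'HMaster|HRegionServer'

    # Then, inside the HBase shell (started with "hbase shell"), check the cluster:
    hbase> status
    hbase> list

If the HMaster/HRegionServer processes are missing or "status" fails, HBase 
itself is not up. That would match your log: the ZooKeeper session is 
established, but the HBase RPC connection is refused, so the HBase server logs 
should tell you why it did not start.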


> On 14 Apr 2020, at 23:57, Rubio Piqueras, David <david.ru...@gft.com> wrote:
> 
> Hi guys,
>  
> Just started using Kylin too, which looks very interesting.
> We downloaded your Kylin 3.0 image version:
>  
> 
> However, I can't open the Kylin UI whenever I try to access it. Checking the logs, this is what we are seeing:
>  
> 2020-04-14 13:58:23,853 INFO  [main-SendThread(localhost:2181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x17178f6fbd6000a, negotiated timeout = 40000
> Exception in thread "main" java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
>         at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:101)
>         at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:113)
>         at org.apache.kylin.rest.service.AclTableMigrationTool.checkIfNeedMigrate(AclTableMigrationTool.java:99)
>         at org.apache.kylin.tool.AclTableMigrationCLI.main(AclTableMigrationCLI.java:43)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:94)
>         ... 3 more
> Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
> Tue Apr 14 13:58:24 UTC 2020, RpcRetryingCaller{globalStartTime=1586872703980, pause=100, retries=1}, java.net.ConnectException: Connection refused
> 
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
>         at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: java.net.ConnectException: Connection refused
>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:424)
>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:748)
>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
>         at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
>         at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
>         at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
>         at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>         ... 4 more
> 2020-04-14 13:58:24,142 INFO  [close-hbase-conn] hbase.HBaseConnection:137 : Closing HBase connections...
> 2020-04-14 13:58:24,143 INFO  [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:1676 : Closing zookeeper sessionid=0x17178f6fbd6000a
> 2020-04-14 13:58:24,150 INFO  [close-hbase-conn] zookeeper.ZooKeeper:684 : Session: 0x17178f6fbd6000a closed
> 2020-04-14 13:58:24,150 INFO  [main-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down
> ERROR: Unknown error. Please check full log
>  
> I'm not sure, but my feeling is that there is an error in the HBase, Kafka, 
> or ZooKeeper default configuration that needs some changes on my end, but 
> I'm not sure what.
>  
> Appreciate your help on this,
>  
> Thanks.
> David Rubio
> _______________________________________
> 
> GFT IT Consulting, S.L.U.
> Avinguda Oest 48
> 46001 VALENCIA
> 
> T +34963012486
> david.ru...@gft.com
> www.gft.com/es
> https://blog.gft.com/es
> www.twitter.com/gft_es
> _______________________________________
> 
