[ https://issues.apache.org/jira/browse/NIFI-1712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15227808#comment-15227808 ]
Guillaume Pool commented on NIFI-1712:
--------------------------------------

Hi,

Hope this is not beating the same drum: I tested that the principal can connect to HBase on one of the HDP nodes:

{noformat}
SV-HTNMN1 ~ # kinit -kt nifi.keytab n...@hdp.supergrp.net
SV-HTNMN1 ~ # hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.1.2.2.3.2.0-2950, r58355eb3c88bded74f382d81cdd36174d68ad0fd, Wed Sep 30 18:56:38 UTC 2015

hbase(main):001:0> list
TABLE
CTRACKINT
DRMVEHICLE
RT.ORDERS
SGCONV.POSTSTTRACKING
SYSLOG
SYSTEM.CATALOG
SYSTEM.FUNCTION
SYSTEM.SEQUENCE
SYSTEM.STATS
ambarismoketest
drmint
pastelint
websvclogs
13 row(s) in 0.2600 seconds

=> ["CTRACKINT", "DRMVEHICLE", "RT.ORDERS", "SGCONV.POSTSTTRACKING", "SYSLOG", "SYSTEM.CATALOG", "SYSTEM.FUNCTION", "SYSTEM.SEQUENCE", "SYSTEM.STATS", "ambarismoketest", "drmint", "pastelint", "websvclogs"]

hbase(main):002:0> exit
SV-HTNMN1 ~ # klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: n...@hdp.supergrp.net

Valid starting     Expires            Service principal
04/06/16 08:03:09  04/07/16 08:03:09  krbtgt/hdp.supergrp....@hdp.supergrp.net
        renew until 04/06/16 08:03:09
{noformat}

Then I ran the same kinit on the NiFi hosts and again obtained a valid Kerberos ticket. I can't test with the HBase shell on those nodes, as I am not 100% sure how to install it without the assistance of Ambari.
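For reference, the NiFi (HDF) hosts resolve principals through a krb5.conf carrying two realms; a minimal sketch of such a file, with KDC hostnames and domain mappings invented for illustration:

{noformat}
# Sketch only -- KDC hostnames and domain_realm mappings are assumptions,
# not taken from the actual environment described in this issue.
[libdefaults]
  default_realm = SUPERGRP.NET

[realms]
  HDP.SUPERGRP.NET = {
    kdc = kdc.hdp.supergrp.net        # assumed KDC host for the HDP realm
    admin_server = kdc.hdp.supergrp.net
  }
  SUPERGRP.NET = {
    kdc = ad.supergrp.net             # assumed Active Directory DC
    admin_server = ad.supergrp.net
  }

[domain_realm]
  .hdp.supergrp.net = HDP.SUPERGRP.NET
  hdp.supergrp.net = HDP.SUPERGRP.NET
  .supergrp.net = SUPERGRP.NET
{noformat}

With both realms mapped, a kinit against either realm succeeds from the same host, which matches the successful klist output shown above.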
I have two realms configured in the krb5.conf: HDP.SUPERGRP.NET, where all our HDP service principals are defined, and SUPERGRP.NET for our Active Directory domain. The HDF servers are joined to the Active Directory domain and use sssd to provide login for domain users; the HDP servers have only one realm configured, HDP.SUPERGRP.NET.

> HBaseClientService unable to connect when Phoenix is installed
> --------------------------------------------------------------
>
> Key: NIFI-1712
> URL: https://issues.apache.org/jira/browse/NIFI-1712
> Project: Apache NiFi
> Issue Type: Bug
> Reporter: Bryan Bende
> Priority: Minor
>
> A user reported running HDP 2.3.2 with Phoenix installed, and NiFi 0.6.0, with the following error:
>
> 2016-03-31 13:24:23,916 INFO [StandardProcessScheduler Thread-5] o.a.nifi.hbase.HBase_1_1_2_ClientService HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] HBase Security Enabled, logging in as principal n...@hdp.supergrp.net with keytab /app/env/nifi.keytab
> 2016-03-31 13:24:23,984 WARN [StandardProcessScheduler Thread-5] org.apache.hadoop.util.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2016-03-31 13:24:24,101 INFO [StandardProcessScheduler Thread-5] o.a.nifi.hbase.HBase_1_1_2_ClientService HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] Successfully logged in as principal n...@hdp.supergrp.net with keytab /app/env/nifi.keytab
> 2016-03-31 13:24:24,177 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] Failed to invoke @OnEnabled method due to java.io.IOException: java.lang.reflect.InvocationTargetException
> 2016-03-31 13:24:24,182 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode
> java.io.IOException: java.lang.reflect.InvocationTargetException
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:215) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:212) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_71]
>         at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_71]
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.2.jar:na]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:212) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:161) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_71]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_71]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_71]
>         at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47) ~[na:na]
>         at org.apache.nifi.controller.service.StandardControllerServiceNode$1.run(StandardControllerServiceNode.java:285) ~[na:na]
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_71]
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_71]
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_71]
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_71]
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_71]
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_71]
>         at java.lang.Thread.run(Thread.java:745) [na:1.8.0_71]
> Caused by: java.lang.reflect.InvocationTargetException: null
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_71]
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_71]
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_71]
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_71]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) ~[hbase-client-1.1.2.jar:1.1.2]
>         ... 25 common frames omitted
> Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
>         at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36) ~[hbase-common-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630) ~[hbase-client-1.1.2.jar:1.1.2]
>         ... 30 common frames omitted
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_71]
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_71]
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_71]
>         at java.lang.Class.forName0(Native Method) ~[na:1.8.0_71]
>         at java.lang.Class.forName(Class.java:264) ~[na:1.8.0_71]
>         at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32) ~[hbase-common-1.1.2.jar:1.1.2]
>         ... 34 common frames omitted
> 2016-03-31 13:24:24,184 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode Failed to invoke @OnEnabled method of HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] due to java.io.IOException: java.lang.reflect.InvocationTargetException

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
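For context on the ClassNotFoundException above: on Phoenix-enabled HDP clusters, the hbase-site.xml distributed by Ambari typically overrides the HBase RPC controller factory with a class shipped in the Phoenix jars. Since those jars are not on the classpath of NiFi's HBase client service, instantiating the factory fails. A sketch of the kind of property to look for in the hbase-site.xml handed to the HBaseClientService (the class name is taken from the stack trace; whether it is safe to change for a given flow is an assumption to verify):

{noformat}
<!-- In the hbase-site.xml copied to the NiFi host. This override points at a
     Phoenix-provided class; without the Phoenix jars on NiFi's classpath the
     HBase client fails with the ClassNotFoundException shown above. -->
<property>
  <name>hbase.rpc.controllerfactory.class</name>
  <value>org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory</value>
</property>
{noformat}

One commonly reported workaround is to remove this property from a copy of the hbase-site.xml that the HBaseClientService points at, so the client falls back to the default org.apache.hadoop.hbase.ipc.RpcControllerFactory; the alternative is to make the Phoenix client jar available on the service's classpath.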