Peter,

They're in separate NARs and are isolated by different ClassLoaders, so their UGI state will be separate. There shouldn't be a problem there. The only scenario I could think of that might create a problem is if Atlas JARs were added to HDFS using the Additional Classpath Resources property (from memory, I don't think the Hive processors have that property), but that also uses a separate (descendant) ClassLoader, so it shouldn't create a problem either.
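The isolation described above can be demonstrated with plain JDK tools: the same class loaded through two independent ClassLoaders has two separate copies of its static state. This is a minimal sketch, not NiFi's actual NAR loader; the `Holder` class here is a stand-in for something like `UserGroupInformation`, which keeps login state in statics.

```java
import javax.tools.ToolProvider;
import java.lang.reflect.Field;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassLoaderIsolationDemo {

    /** Loads the same class through two independent ClassLoaders and returns
     *  the static field value seen by each after mutating it in only one. */
    public static String[] run() throws Exception {
        // Compile a tiny class with mutable static state into a temp directory.
        Path dir = Files.createTempDirectory("nar-demo");
        Path src = dir.resolve("Holder.java");
        Files.write(src, "public class Holder { public static String user = \"none\"; }"
                .getBytes(StandardCharsets.UTF_8));
        ToolProvider.getSystemJavaCompiler().run(null, null, null, src.toString());

        URL[] cp = { dir.toUri().toURL() };
        // Two loaders with a null parent: each defines its own copy of Holder,
        // much like each NiFi NAR gets its own ClassLoader (and its own statics).
        try (URLClassLoader a = new URLClassLoader(cp, null);
             URLClassLoader b = new URLClassLoader(cp, null)) {
            Field userA = a.loadClass("Holder").getField("user");
            Field userB = b.loadClass("Holder").getField("user");
            userA.set(null, "hive-principal"); // "login" performed by bundle A only
            return new String[] { (String) userA.get(null), (String) userB.get(null) };
        }
    }

    public static void main(String[] args) throws Exception {
        String[] seen = run();
        System.out.println("loader A sees: " + seen[0]); // hive-principal
        System.out.println("loader B sees: " + seen[1]); // none
    }
}
```

Because the two `Holder` classes are distinct runtime types, a keytab login performed inside one NAR's ClassLoader is invisible to another's — which is why the Hive and Atlas components shouldn't step on each other.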
On Fri, Jul 27, 2018 at 1:29 PM Peter Wicks (pwicks) <pwi...@micron.com> wrote:

> As an aside, while digging around in the code, I noticed that the Atlas
> Reporting Task has its own Hadoop Kerberos authentication logic
> (org.apache.nifi.atlas.security.Kerberos). I'm not using this, but it made
> me wonder if this could cause trouble if Hive (synchronized) and Atlas
> (separate, unsynchronized) were both trying to log in from a keytab at the
> same time.
>
> --Peter
>
> From: Shawn Weeks [mailto:swe...@weeksconsulting.us]
> Sent: Friday, July 27, 2018 10:29 AM
> To: users@nifi.apache.org
> Subject: Re: [EXT] Re: Hive w/ Kerberos Authentication starts failing after a week
>
> If you're using the Hortonworks distribution, it's fixed in the latest HDF
> 3.x release, I think.
>
> Thanks
> Shawn
>
> ------------------------------
> From: Peter Wicks (pwicks) <pwi...@micron.com>
> Sent: Friday, July 27, 2018 10:58 AM
> To: users@nifi.apache.org
> Subject: RE: [EXT] Re: Hive w/ Kerberos Authentication starts failing after a week
>
> Thanks Shawn. Looks like this was fixed in 1.7.0. Will have to upgrade.
>
> From: Shawn Weeks [mailto:swe...@weeksconsulting.us]
> Sent: Friday, July 27, 2018 8:07 AM
> To: users@nifi.apache.org
> Subject: Re: [EXT] Re: Hive w/ Kerberos Authentication starts failing after a week
>
> See NIFI-5134: there was a known bug in the Hive Connection Pool that made
> it fail once the Kerberos tickets expired and you lost your connection to
> Hive. If your version doesn't have this patch, once a Kerberos ticket
> reaches the end of its lifetime the connection pool won't work until you
> restart NiFi.
> Thanks
> Shawn
>
> ------------------------------
> From: Peter Wicks (pwicks) <pwi...@micron.com>
> Sent: Friday, July 27, 2018 8:51:54 AM
> To: users@nifi.apache.org
> Subject: RE: [EXT] Re: Hive w/ Kerberos Authentication starts failing after a week
>
> I don't believe that is how this code works. Not to say that it might not
> work, but I don't believe that the Kerberos authentication used by NiFi
> processors relies in any way on the tickets that appear in klist.
>
> While we are only using a single account on this particular server, many
> of our servers use several Kerberos principals/keytabs. I don't think that
> doing kinits for all of them would work either.
>
> Thanks,
> Peter
>
> From: Sivaprasanna [mailto:sivaprasanna...@gmail.com]
> Sent: Friday, July 27, 2018 3:12 AM
> To: users@nifi.apache.org
> Subject: [EXT] Re: Hive w/ Kerberos Authentication starts failing after a week
>
> Did you try executing 'klist' to see if the tickets are there and renewed?
> If expired, try a manual kinit and see if that fixes it.
>
> On Fri, Jul 27, 2018 at 1:51 AM Peter Wicks (pwicks) <pwi...@micron.com> wrote:
>
> We are seeing frequent failures of our Hive DBCP connections after a week
> of use when using Kerberos with a principal/keytab. We've tried both with
> the Credential Service and without (though in looking at the code, there
> should be no difference).
>
> It looks like the tickets are expiring and renewal is not happening?
> javax.security.sasl.SaslException: GSS initiate failed
>         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>         at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
>         at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>         at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
>         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>         at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>         at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>         at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>         at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
>         at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
>         at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
>         at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
>         at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
>         at org.apache.nifi.dbcp.hive.HiveConnectionPool.lambda$getConnection$0(HiveConnectionPool.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
>         at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:355)
>         at sun.reflect.GeneratedMethodAccessor515.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>
> Thanks,
> Peter
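The failure mode described for NIFI-5134 — a pool that authenticates once when enabled and then fails every borrow after the TGT expires — can be sketched without Hadoop on the classpath. This is a toy model: the class and method names are illustrative stand-ins, and the real fix works against Hadoop's `UserGroupInformation` (`loginUserFromKeytab` / `checkTGTAndReloginFromKeytab`), not a local ticket clock.

```java
import java.time.Duration;
import java.time.Instant;

/** Toy model of the NIFI-5134 failure and fix; names are illustrative. */
public class RenewingHivePool {

    private final Duration ticketLifetime;
    private Instant ticketExpiry;
    private int loginCount = 0;

    public RenewingHivePool(Duration ticketLifetime) {
        this.ticketLifetime = ticketLifetime;
        login(); // login happens once when the pool is enabled...
    }

    private void login() {
        // Stand-in for UserGroupInformation.loginUserFromKeytab(principal, keytab).
        loginCount++;
        ticketExpiry = Instant.now().plus(ticketLifetime);
    }

    /** Borrow a connection, refreshing expired credentials first. A pool that
     *  skips this check fails every borrow with "GSS initiate failed" once the
     *  ticket lifetime passes, until the service is restarted. */
    public String getConnection() {
        if (!Instant.now().isBefore(ticketExpiry)) {
            login(); // stand-in for checkTGTAndReloginFromKeytab()
        }
        return "hive-connection (login #" + loginCount + ")";
    }

    public int loginCount() { return loginCount; }

    public static void main(String[] args) {
        // A lifetime of zero means the initial ticket is expired immediately.
        RenewingHivePool pool = new RenewingHivePool(Duration.ZERO);
        System.out.println(pool.getConnection());      // triggers a fresh login
        System.out.println("logins: " + pool.loginCount()); // 2
    }
}
```

The key design point is where the credential check lives: tying renewal to each `getConnection()` call means a pool that sat idle past ticket expiry recovers on the next borrow, which is consistent with the thread's observation that the unpatched pool only recovers on a NiFi restart.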