Hi everyone,
After copying the hive-site.xml from a CDH5 cluster, I can't connect to the
Hive metastore from spark-shell. Here's part of the stack trace I get:

15/06/17 04:41:57 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed to
find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)

The user has a non-expired Kerberos ticket and can run hadoop fs -ls
successfully, so access itself should not be the problem.
I'm stuck on this with Spark 1.4.0; I haven't tried an earlier version.
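For reference, a sketch of the checks I ran before launching spark-shell (nothing here is specific to my setup; these are the standard commands):

```shell
# Show the ticket cache: principal, issue time, and expiry of the TGT
klist

# Confirm Kerberized HDFS access works for the same user
hadoop fs -ls
```

Both commands succeed, which is why I'd expect the metastore connection from spark-shell to pick up the same credentials.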

Any idea what might be wrong?

Regards,

Olivier.