[jira] [Commented] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work

2016-04-22 Thread zhangguancheng (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-14694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15255036#comment-15255036
 ] 

zhangguancheng commented on SPARK-14694:


Content of hive-site.xml:
{quote}
<?xml version="1.0"?>
<configuration>

<property>
  <name>hive.server2.thrift.port</name>
  <value>1</value>
</property>

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/Users/zhangguancheng/Documents/github/bigdata/hive/apache-hive-1.1.1-bin/conf/hive.keytab</value>
</property>

<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/c1@C1</value>
</property>

<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>

<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/c1@C1</value>
</property>

<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/Users/zhangguancheng/Documents/github/bigdata/hive/apache-hive-1.1.1-bin/conf/hive.keytab</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost/test</value>
  <description>the URL of the MySQL database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>test</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>test123</value>
</property>

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>

<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://localhost:9083</value>
  <description>IP address (or fully-qualified domain name) and port of the metastore host</description>
</property>

</configuration>
{quote}
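
The "Failed to find any Kerberos tgt" part of the error usually means the process never obtained a TGT for the configured principal. A minimal standalone check of the keytab and principal referenced above, using the standard Hadoop UserGroupInformation API (the principal and keytab path are simply copied from this hive-site.xml; this is a diagnostic sketch, not Spark code):

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

// Diagnostic sketch: try a keytab login with the principal from hive-site.xml.
// If this fails, the GSS error reported in this issue is expected regardless of Spark settings.
public class KeytabLoginCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hadoop must be switched to Kerberos authentication before logging in.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Principal and keytab path copied from the hive-site.xml above.
        UserGroupInformation.loginUserFromKeytab(
            "hive/c1@C1",
            "/Users/zhangguancheng/Documents/github/bigdata/hive/apache-hive-1.1.1-bin/conf/hive.keytab");

        System.out.println("Logged in as: " + UserGroupInformation.getLoginUser());
    }
}
{code}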


[jira] [Commented] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work

2016-04-22 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-14694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15254337#comment-15254337
 ] 

Andrew Lee commented on SPARK-14694:


I'm able to get this working with Spark 1.6.1 + Hive 1.2 + a Kerberized Hadoop cluster. 
Could you provide the directory layout for the Kerberos ticket file and show what 
your hive-site.xml looks like in your SPARK_CONF_DIR?
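
For comparison, the beeline behaviour described in this issue corresponds to a Kerberized JDBC connection of roughly this shape; this is a sketch only, where the host, port and database are assumptions and the principal is the one from the reporter's hive-site.xml:

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of a Kerberized JDBC connection to the Spark Thrift Server / HiveServer2.
// Assumes a valid client TGT (e.g. obtained with kinit) and the Hive JDBC driver on the classpath.
public class KerberizedJdbcCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Host, port and database are placeholders; the principal matches the reporter's config.
        String url = "jdbc:hive2://c1:10000/default;principal=hive/c1@C1";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
{code}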


> Thrift Server + Hive Metastore + Kerberos doesn't work
> --
>
> Key: SPARK-14694
> URL: https://issues.apache.org/jira/browse/SPARK-14694
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.6.0, 1.6.1
> Environment: Spark 1.6.1 compiled with Hadoop 2.6.0, YARN, Hive
> Hadoop 2.6.4 
> Hive 1.1.1 
> Kerberos
>Reporter: zhangguancheng
>  Labels: security
>
> My Hive Metastore is MySQL-based. I started a Spark Thrift Server on the same 
> node as the Hive Metastore. I can open beeline and run SELECT statements, but 
> for some commands like "show databases" I get an error:
> {quote}
> ERROR pool-24-thread-1 org.apache.thrift.transport.TSaslTransport:315 SASL negotiation failure
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
> at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2223)
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:385)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:495)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
> at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
> at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
> at ...