[jira] [Comment Edited] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work
[ https://issues.apache.org/jira/browse/SPARK-14694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15255036#comment-15255036 ]

zhangguancheng edited comment on SPARK-14694 at 4/23/16 6:50 PM:

Content of hive-site.xml:
{quote}
hive.server2.thrift.port = 1
hive.metastore.sasl.enabled = true
hive.metastore.kerberos.keytab.file = /opt/hive/apache-hive-1.1.1-bin/conf/hive.keytab
hive.metastore.kerberos.principal = hive/c1@C1
hive.server2.authentication = KERBEROS
hive.server2.authentication.kerberos.principal = hive/c1@C1
hive.server2.authentication.kerberos.keytab = /opt/hive/apache-hive-1.1.1-bin/conf/hive.keytab
javax.jdo.option.ConnectionURL = jdbc:mysql://localhost/test (the URL of the MySQL database)
javax.jdo.option.ConnectionDriverName = com.mysql.jdbc.Driver
javax.jdo.option.ConnectionUserName = xxx
javax.jdo.option.ConnectionPassword = x
datanucleus.autoCreateSchema = false
datanucleus.fixedDatastore = true
hive.metastore.uris = thrift://localhost:9083 (IP address or fully-qualified domain name, and port, of the metastore host)
{quote}

And when I set hive.server2.enable.impersonation and hive.server2.enable.doAs to false, the error is gone:
{quote}
hive.server2.enable.impersonation = false
hive.server2.enable.doAs = false
hive.execution.engine = spark
{quote}
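The working configuration from the comment above, rewritten in hive-site.xml's usual <property> syntax. This is a minimal sketch of just the three properties that changed; the names and values are taken from the quoted comment, and everything else is standard hive-site.xml boilerplate:

```xml
<!-- Disable HiveServer2 impersonation so queries run as the server's own
     Kerberos principal rather than being delegated to the connecting user. -->
<property>
  <name>hive.server2.enable.impersonation</name>
  <value>false</value>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
```

With doAs disabled, every query executes under the Thrift Server's service principal, which avoids the per-user credential delegation that was failing with the "No valid credentials provided" GSS error. The trade-off is that table and HDFS permissions are checked against the service principal, not the end user.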
[jira] [Created] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work
zhangguancheng created SPARK-14694:
--

Summary: Thrift Server + Hive Metastore + Kerberos doesn't work
Key: SPARK-14694
URL: https://issues.apache.org/jira/browse/SPARK-14694
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.6.1, 1.6.0
Environment: Spark 1.6.1 compiled with Hadoop 2.6.0, YARN, Hive; Hadoop 2.6.4; Hive 1.1.1; Kerberos
Reporter: zhangguancheng

My Hive Metastore is MySQL based. I started a Spark Thrift Server on the same node as the Hive Metastore. I can open beeline and run SELECT statements, but for some commands like "show databases" I get an error:

{quote}
ERROR pool-24-thread-1 org.apache.thrift.transport.TSaslTransport:315 SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2223)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:385)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:495)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:279)
at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:484)
at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:474)
at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:605)
at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
at org.ap
{quote}