[jira] [Comment Edited] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work

2016-04-22 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-14694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15254337#comment-15254337
 ] 

Andrew Lee edited comment on SPARK-14694 at 4/22/16 6:03 PM:
-

Not sure if this is considered a bug or not.
I'm able to get this working with Spark 1.6.1 + Hive 1.2 + Kerberos Hadoop. 
Could you provide the directory layout for the Kerberos ticket file and what 
your hive-site.xml looks like in your SPARK_CONF_DIR?
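For anyone following along, a minimal sketch of that first check: resolving which ticket cache the Thrift Server JVM will actually read. The keytab path and principal in the comment are hypothetical examples, not taken from this thread.

```shell
# Resolve the ticket cache the JVM's Kerberos login will read by default:
# KRB5CCNAME if set, otherwise /tmp/krb5cc_<uid>.
ccache="${KRB5CCNAME:-/tmp/krb5cc_$(id -u)}"
echo "ticket cache: $ccache"
# On a live host you would then verify and refresh it, e.g.:
#   klist -s || kinit -kt /etc/security/keytabs/hive.keytab hive/$(hostname)@EXAMPLE.COM
```

If the cache path differs from what the process environment assumes, the SASL layer sees no credentials at all, which matches the "Failed to find any Kerberos tgt" error below.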



was (Author: alee526):
I'm able to get this working with Spark 1.6.1 + Hive 1.2 + Kerberos Hadoop. 
Could you provide the directory layout for the Kerberos ticket file and what 
your hive-site.xml looks like in your SPARK_CONF_DIR?


> Thrift Server + Hive Metastore + Kerberos doesn't work
> --
>
> Key: SPARK-14694
> URL: https://issues.apache.org/jira/browse/SPARK-14694
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.6.0, 1.6.1
> Environment: Spark 1.6.1 compiled with hadoop 2.6.0, yarn, hive
> Hadoop 2.6.4 
> Hive 1.1.1 
> Kerberos
>Reporter: zhangguancheng
>  Labels: security
>
> My Hive Metastore is MySQL based. I started a Spark Thrift Server on the same 
> node as the Hive Metastore. I can open beeline and run select statements but 
> for some commands like "show databases", I get an error:
> {quote}
> ERROR pool-24-thread-1 org.apache.thrift.transport.TSaslTransport:315 SASL 
> negotiation failure
> javax.security.sasl.SaslException: GSS initiate failed [Caused by 
> GSSException: No valid credentials provided (Mechanism level: Failed to find 
> any Kerberos tgt)]
> at 
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> at 
> org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
> at 
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> at 
> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> at 
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
> at 
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> at 
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
> at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
> at 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
> at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
> at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
> at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
> at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
> at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2223)
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:385)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> at 
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:495)
> at 
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
> at 
> 

[jira] [Commented] (SPARK-14694) Thrift Server + Hive Metastore + Kerberos doesn't work

2016-04-22 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-14694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15254337#comment-15254337
 ] 

Andrew Lee commented on SPARK-14694:


I'm able to get this working with Spark 1.6.1 + Hive 1.2 + Kerberos Hadoop. 
Could you provide the directory layout for the Kerberos ticket file and what 
your hive-site.xml looks like in your SPARK_CONF_DIR?



[jira] [Commented] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-10 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623230#comment-14623230
 ] 

Andrew Lee commented on SPARK-6882:
---

I don't think updating spark-env.sh {{SPARK_CLASSPATH}} is a good idea 
since it conflicts with {{--driver-class-path}} in yarn-client mode.
But if this is the current workaround, I can specify a different 
directory with SPARK_CONF_DIR just to get it up and running.

Regarding Bin's approach, I believe you will need to enable 
{{spark.yarn.user.classpath.first}} according to SPARK-939, but I think it 
should be picking up the user JAR by default now, shouldn't it?
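As a sketch of that workaround, the extra classpath can be passed per launch instead of through spark-env.sh. The JAR directory below is a hypothetical placeholder; the snippet only composes and prints the command.

```shell
# Compose the launch command with --driver-class-path rather than a global
# SPARK_CLASSPATH; /opt/hive/lib is a hypothetical location for the Hive JARs.
extra_cp='/opt/hive/lib/*'
cmd="./sbin/start-thriftserver.sh \
  --master yarn-client \
  --driver-class-path $extra_cp \
  --conf spark.yarn.user.classpath.first=true"
echo "$cmd"
```

Printing rather than executing keeps the sketch safe to adapt; drop the echo and run the command directly once the paths match your install.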

 Spark ThriftServer2 Kerberos failed encountering 
 java.lang.IllegalArgumentException: Unknown auth type: null Allowed values 
 are: [auth-int, auth-conf, auth]
 

 Key: SPARK-6882
 URL: https://issues.apache.org/jira/browse/SPARK-6882
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.2.1, 1.3.0, 1.4.0
 Environment: * Apache Hadoop 2.4.1 with Kerberos Enabled
 * Apache Hive 0.13.1
 * Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
 * Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
Reporter: Andrew Lee

 When Kerberos is enabled, I get the following exceptions. 
 {code}
 2015-03-13 18:26:05,363 ERROR 
 org.apache.hive.service.cli.thrift.ThriftCLIService 
 (ThriftBinaryCLIService.java:run(93)) - Error: 
 java.lang.IllegalArgumentException: Unknown auth type: null Allowed values 
 are: [auth-int, auth-conf, auth]
 {code}
 I tried it in
 * Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
 * Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
 with
 * Apache Hive 0.13.1
 * Apache Hadoop 2.4.1
 Build command
 {code}
 mvn -U -X -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Phive-thriftserver 
 -Dhadoop.version=2.4.1 -Dyarn.version=2.4.1 -Dhive.version=0.13.1 -DskipTests 
 install
 {code}
 When starting Spark ThriftServer in {{yarn-client}} mode, the command to 
 start thriftserver looks like this
 {code}
 ./start-thriftserver.sh --hiveconf hive.server2.thrift.port=2 --hiveconf 
 hive.server2.thrift.bind.host=$(hostname) --master yarn-client
 {code}
 {{hostname}} points to the current hostname of the machine I'm using.
 Error message in {{spark.log}} from Spark 1.2.1 (1.2 rc1)
 {code}
 2015-03-13 18:26:05,363 ERROR 
 org.apache.hive.service.cli.thrift.ThriftCLIService 
 (ThriftBinaryCLIService.java:run(93)) - Error: 
 java.lang.IllegalArgumentException: Unknown auth type: null Allowed values 
 are: [auth-int, auth-conf, auth]
 at org.apache.hive.service.auth.SaslQOP.fromString(SaslQOP.java:56)
 at 
 org.apache.hive.service.auth.HiveAuthFactory.getSaslProperties(HiveAuthFactory.java:118)
 at 
 org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:133)
 at 
 org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
 at java.lang.Thread.run(Thread.java:744)
 {code}
 I'm wondering if this is due to the same problem described in HIVE-8154 and 
 HIVE-7620, caused by an older code base for the Spark ThriftServer?
 Any insights are appreciated. Currently, I can't get Spark ThriftServer2 to 
 run against a Kerberos cluster (Apache 2.4.1).
 My hive-site.xml looks like the following for spark/conf.
 The kerberos keytab and tgt are configured correctly, I'm able to connect to 
 metastore, but the subsequent steps failed due to the exception.
 {code}
 <property>
   <name>hive.semantic.analyzer.factory.impl</name>
   <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
 </property>
 <property>
   <name>hive.metastore.execute.setugi</name>
   <value>true</value>
 </property>
 <property>
   <name>hive.stats.autogather</name>
   <value>false</value>
 </property>
 <property>
   <name>hive.session.history.enabled</name>
   <value>true</value>
 </property>
 <property>
   <name>hive.querylog.location</name>
   <value>/tmp/home/hive/log/${user.name}</value>
 </property>
 <property>
   <name>hive.exec.local.scratchdir</name>
   <value>/tmp/hive/scratch/${user.name}</value>
 </property>
 <property>
   <name>hive.metastore.uris</name>
   <value>thrift://somehostname:9083</value>
 </property>
 <!-- HIVE SERVER 2 -->
 <property>
   <name>hive.server2.authentication</name>
   <value>KERBEROS</value>
 </property>
 <property>
   <name>hive.server2.authentication.kerberos.principal</name>
   <value>***</value>
 </property>
 <property>
   <name>hive.server2.authentication.kerberos.keytab</name>
   <value>***</value>
 </property>
 <property>
   <name>hive.server2.thrift.sasl.qop</name>
   <value>auth</value>
   <description>Sasl QOP value; one of 'auth', 'auth-int' and 'auth-conf'</description>
 </property>
 <property>
   

[jira] [Comment Edited] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-10 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623230#comment-14623230
 ] 

Andrew Lee edited comment on SPARK-6882 at 7/11/15 4:35 AM:


I don't think updating spark-env.sh {{SPARK_CLASSPATH}} is a good idea 
since it conflicts with {{--driver-class-path}} in yarn-client mode.
But if this is the current workaround, I can specify a different 
directory with SPARK_CONF_DIR just to get it up and running.

Regarding Bin's approach, I believe you will need to enable 
{{spark.yarn.user.classpath.first}} according to SPARK-939, but I think it 
should be picking up the user JAR by default now, shouldn't it?


was (Author: alee526):
I don't think updating spark-env.sh {{SPARK_CLASSPATH}} will be a good idea 
since this conflicts with {{--driver-class-path}} in yarn-client mode.
But if this is the current work around, I can specify it with a different 
directory with SPARK_CONF_DIR just to get it up and running.

Regarding Bin's approach, I believe you will need to enable 
{{spark.yarn.user.classpath.first}} according to SPARK-939, but I think it 
should be picking up user JAR y default now, isn't?


[jira] [Updated] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-06 Thread Andrew Lee (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lee updated SPARK-6882:
--
Affects Version/s: 1.4.0


[jira] [Commented] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-06 Thread Andrew Lee (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14615571#comment-14615571
 ] 

Andrew Lee commented on SPARK-6882:
---

Hi [~WangTaoTheTonic], I'm still getting the same exception. This is from Spark 
1.4, so it is impacting 1.4 as well. You will notice that it is showing 
{{null}} in the exception, so it looks like it is having trouble picking up 
that property from the beginning.

{code}
2015-07-06 20:12:00,882 ERROR 
org.apache.hive.service.cli.thrift.ThriftCLIService 
(ThriftBinaryCLIService.java:run(93)) - Error: 
java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: 
[auth-int, auth-conf, auth]
at org.apache.hive.service.auth.SaslQOP.fromString(SaslQOP.java:56)
at 
org.apache.hive.service.auth.HiveAuthFactory.getSaslProperties(HiveAuthFactory.java:118)
at 
org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:133)
at 
org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
at java.lang.Thread.run(Thread.java:744)
{code}
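One way to narrow that down: check whether the hive-site.xml Spark actually loads defines the property at all. The snippet below reproduces the lookup against a throwaway sample file (values illustrative); an empty grep against the real SPARK_CONF_DIR hive-site.xml would be consistent with {{SaslQOP.fromString}} receiving null.

```shell
# Write a sample hive-site.xml fragment and grep it the way you would grep
# the real $SPARK_CONF_DIR/hive-site.xml; no match on the real file means
# HiveConf has no value for the property and the server sees null.
conf=$(mktemp)
cat > "$conf" <<'EOF'
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>
EOF
grep -A1 'hive.server2.thrift.sasl.qop' "$conf"
rm -f "$conf"
```

Point the grep at the actual config directory in use to confirm which copy of hive-site.xml the Thrift Server is reading.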




[jira] [Created] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-04-13 Thread Andrew Lee (JIRA)
Andrew Lee created SPARK-6882:
-

 Summary: Spark ThriftServer2 Kerberos failed encountering 
java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: 
[auth-int, auth-conf, auth]
 Key: SPARK-6882
 URL: https://issues.apache.org/jira/browse/SPARK-6882
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.3.0, 1.2.1
 Environment: * Apache Hadoop 2.4.1 with Kerberos Enabled
* Apache Hive 0.13.1
* Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
* Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
Reporter: Andrew Lee


When Kerberos is enabled, I get the following exceptions. 
{code}
2015-03-13 18:26:05,363 ERROR 
org.apache.hive.service.cli.thrift.ThriftCLIService 
(ThriftBinaryCLIService.java:run(93)) - Error: 
java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: 
[auth-int, auth-conf, auth]
{code}

I tried it in
* Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
* Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851

with
* Apache Hive 0.13.1
* Apache Hadoop 2.4.1

Build command
{code}
mvn -U -X -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Phive-thriftserver 
-Dhadoop.version=2.4.1 -Dyarn.version=2.4.1 -Dhive.version=0.13.1 -DskipTests 
install
{code}

When starting Spark ThriftServer in {{yarn-client}} mode, the command to start 
thriftserver looks like this

{code}
./start-thriftserver.sh --hiveconf hive.server2.thrift.port=2 --hiveconf 
hive.server2.thrift.bind.host=$(hostname) --master yarn-client
{code}

{{hostname}} points to the current hostname of the machine I'm using.
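For reference, a sketch of the Kerberized beeline URL a client would then use against this server; the port and principal are hypothetical placeholders, not values from this report.

```shell
# Compose the JDBC URL for a Kerberized HiveServer2/Thrift Server connection;
# port 10001 and the hive/_HOST@EXAMPLE.COM principal are placeholders.
host=$(hostname)
url="jdbc:hive2://${host}:10001/default;principal=hive/_HOST@EXAMPLE.COM"
echo "beeline -u \"$url\""
```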

Error message in {{spark.log}} from Spark 1.2.1 (1.2 rc1)
{code}
2015-03-13 18:26:05,363 ERROR 
org.apache.hive.service.cli.thrift.ThriftCLIService 
(ThriftBinaryCLIService.java:run(93)) - Error: 
java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: 
[auth-int, auth-conf, auth]
at org.apache.hive.service.auth.SaslQOP.fromString(SaslQOP.java:56)
at 
org.apache.hive.service.auth.HiveAuthFactory.getSaslProperties(HiveAuthFactory.java:118)
at 
org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:133)
at 
org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
at java.lang.Thread.run(Thread.java:744)
{code}

I'm wondering if this is due to the same problem described in HIVE-8154 and 
HIVE-7620, caused by an older code base for the Spark ThriftServer?

Any insights are appreciated. Currently, I can't get Spark ThriftServer2 to run 
against a Kerberos cluster (Apache 2.4.1).

My hive-site.xml looks like the following for spark/conf.
The kerberos keytab and tgt are configured correctly, I'm able to connect to 
metastore, but the subsequent steps failed due to the exception.
{code}
<property>
  <name>hive.semantic.analyzer.factory.impl</name>
  <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
</property>
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
</property>
<property>
  <name>hive.stats.autogather</name>
  <value>false</value>
</property>
<property>
  <name>hive.session.history.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/tmp/home/hive/log/${user.name}</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive/scratch/${user.name}</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://somehostname:9083</value>
</property>
<!-- HIVE SERVER 2 -->
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>***</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>***</value>
</property>
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
  <description>Sasl QOP value; one of 'auth', 'auth-int' and 'auth-conf'</description>
</property>
<property>
  <name>hive.server2.enable.impersonation</name>
  <description>Enable user impersonation for HiveServer2</description>
  <value>true</value>
</property>
<!-- HIVE METASTORE -->
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.cache.pinobjtypes</name>
  <value>Table,Database,Type,FieldSchema,Order</value>
</property>
<property>
  <name>hdfs_sentinel_file</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/hive</value>
</property>
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value>
</property>
<property>
  <name>hive.warehouse.subdir.inherit.perms</name>
  <value>true</value>
</property>
{code}

Here, I'm attaching more detailed logs from Spark 1.3 rc1.
{code}
2015-04-13 16:37:20,688 INFO