[jira] [Comment Edited] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2016-01-18 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15104911#comment-15104911
 ] 

Ma Xiaoyu edited comment on SPARK-5159 at 1/18/16 8:16 AM:
---

Sorry, I realised that I mixed my PR up with SPARK-6910.
My code is buried inside that PR and is not getting merged.
If needed, I can resubmit it with only the doAs changes; that part is just
meant to make doAs work.


was (Author: ilovesoup):
Sorry, I realised that I mixed my PR up with SPARK-6910.
My code is buried inside that PR.
If needed, I can resubmit it with only the doAs changes; that part is just
meant to make doAs work.

> Thrift server does not respect hive.server2.enable.doAs=true
> 
>
> Key: SPARK-5159
> URL: https://issues.apache.org/jira/browse/SPARK-5159
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.2.0
>Reporter: Andrew Ray
> Attachments: spark_thrift_server_log.txt
>
>
> I'm currently testing the Spark SQL Thrift server on a Kerberos-secured 
> cluster in YARN mode. Currently any user can access any table regardless of 
> HDFS permissions, because all data is read as the hive user. In HiveServer2 the 
> property hive.server2.enable.doAs=true causes all access to be performed as the 
> submitting user. We should do the same.






[jira] [Comment Edited] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2016-01-18 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15104911#comment-15104911
 ] 

Ma Xiaoyu edited comment on SPARK-5159 at 1/18/16 8:17 AM:
---

Sorry, I realised that I mixed my PR up with SPARK-6910.
My change is buried inside that PR and is not getting merged.
If needed, I can resubmit it with only the doAs changes; that part is just
meant to make doAs work.


was (Author: ilovesoup):
Sorry, I realised that I mixed my PR up with SPARK-6910.
My code is buried inside that PR and is not getting merged.
If needed, I can resubmit it with only the doAs changes; that part is just
meant to make doAs work.







[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2016-01-18 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15104911#comment-15104911
 ] 

Ma Xiaoyu commented on SPARK-5159:
--

Sorry, I realised that I mixed my PR up with SPARK-6910.
My code is buried inside that PR.
If needed, I can resubmit it with only the doAs changes; that part is just
meant to make doAs work.







[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2016-01-15 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15101484#comment-15101484
 ] 

Ma Xiaoyu commented on SPARK-5159:
--

Sorry for not following up on this.
Per my earlier investigation, the problem is that the user Hive obtains in the session is not 
passed on to the DAGScheduler, since they run in different threads.
The multi-threaded part is the DAGScheduler event loop. What I did before was add a field 
to the event posted to that loop carrying the impersonated user, so the DAGScheduler event 
handler can re-impersonate that user on its own thread.
If that's an acceptable solution, I can stick with it and resubmit a PR.
I would also like to follow up and redesign it if needed.
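
Roughly, the idea looks like the following. This is an illustrative sketch only, not the
actual patch: ProxiedJobEvent and handleEvent are made-up names, and the proxy user still
has to be permitted by the cluster's hadoop.proxyuser.* settings.

{code}
import java.security.PrivilegedExceptionAction
import org.apache.hadoop.security.UserGroupInformation

// Hypothetical event type: carries the session user along with the work to run.
case class ProxiedJobEvent(sessionUser: String, body: () => Unit)

def handleEvent(event: ProxiedJobEvent): Unit = {
  // Rebuild the proxy UGI on the handler thread; the login user is the
  // service principal that started the Thrift server.
  val proxyUgi = UserGroupInformation.createProxyUser(
    event.sessionUser, UserGroupInformation.getLoginUser)

  // Run the work as the session user, so HDFS access checks see the
  // submitting user instead of the service user.
  proxyUgi.doAs(new PrivilegedExceptionAction[Unit] {
    override def run(): Unit = event.body()
  })
}
{code}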







[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2015-07-12 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623909#comment-14623909
 ] 

Ma Xiaoyu commented on SPARK-5159:
--

The above is my first PR to Spark. I'm new to Spark and Scala, so please advise.







[jira] [Commented] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-10 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14621899#comment-14621899
 ] 

Ma Xiaoyu commented on SPARK-6882:
--

Can you try adding it to the classpath in spark-env.sh, and make sure it stays ahead of 
the other jars?

> Spark ThriftServer2 Kerberos failed encountering 
> java.lang.IllegalArgumentException: Unknown auth type: null Allowed values 
> are: [auth-int, auth-conf, auth]
> 
>
> Key: SPARK-6882
> URL: https://issues.apache.org/jira/browse/SPARK-6882
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.2.1, 1.3.0, 1.4.0
> Environment: * Apache Hadoop 2.4.1 with Kerberos Enabled
> * Apache Hive 0.13.1
> * Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
> * Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
>Reporter: Andrew Lee
>
> When Kerberos is enabled, I get the following exceptions. 
> {code}
> 2015-03-13 18:26:05,363 ERROR org.apache.hive.service.cli.thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(93)) - Error: java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]
> {code}
> I tried it in
> * Spark 1.2.1 git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97
> * Spark 1.3.0 rc1 commit label 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
> with
> * Apache Hive 0.13.1
> * Apache Hadoop 2.4.1
> Build command
> {code}
> mvn -U -X -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Phive-thriftserver 
> -Dhadoop.version=2.4.1 -Dyarn.version=2.4.1 -Dhive.version=0.13.1 -DskipTests 
> install
> {code}
> When starting Spark ThriftServer in {{yarn-client}} mode, the command to 
> start thriftserver looks like this
> {code}
> ./start-thriftserver.sh --hiveconf hive.server2.thrift.port=2 --hiveconf 
> hive.server2.thrift.bind.host=$(hostname) --master yarn-client
> {code}
> {{hostname}} points to the current hostname of the machine I'm using.
> Error message in {{spark.log}} from Spark 1.2.1 (1.2 rc1)
> {code}
> 2015-03-13 18:26:05,363 ERROR org.apache.hive.service.cli.thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(93)) - Error: java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]
> at org.apache.hive.service.auth.SaslQOP.fromString(SaslQOP.java:56)
> at org.apache.hive.service.auth.HiveAuthFactory.getSaslProperties(HiveAuthFactory.java:118)
> at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:133)
> at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
> at java.lang.Thread.run(Thread.java:744)
> {code}
> I'm wondering if this is due to the same problem described in HIVE-8154 and 
> HIVE-7620, caused by an older code base in the Spark ThriftServer?
> Any insights are appreciated. Currently, I can't get Spark ThriftServer2 to 
> run against a Kerberos cluster (Apache Hadoop 2.4.1).
> My hive-site.xml in spark/conf looks like the following.
> The Kerberos keytab and TGT are configured correctly and I'm able to connect to the 
> metastore, but the subsequent steps fail with the exception above.
> {code}
> <property>
>   <name>hive.semantic.analyzer.factory.impl</name>
>   <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
> </property>
> <property>
>   <name>hive.metastore.execute.setugi</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.stats.autogather</name>
>   <value>false</value>
> </property>
> <property>
>   <name>hive.session.history.enabled</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.querylog.location</name>
>   <value>/tmp/home/hive/log/${user.name}</value>
> </property>
> <property>
>   <name>hive.exec.local.scratchdir</name>
>   <value>/tmp/hive/scratch/${user.name}</value>
> </property>
> <property>
>   <name>hive.metastore.uris</name>
>   <value>thrift://somehostname:9083</value>
> </property>
> 
> <property>
>   <name>hive.server2.authentication</name>
>   <value>KERBEROS</value>
> </property>
> <property>
>   <name>hive.server2.authentication.kerberos.principal</name>
>   <value>***</value>
> </property>
> <property>
>   <name>hive.server2.authentication.kerberos.keytab</name>
>   <value>***</value>
> </property>
> <property>
>   <name>hive.server2.thrift.sasl.qop</name>
>   <value>auth</value>
>   <description>Sasl QOP value; one of 'auth', 'auth-int' and 'auth-conf'</description>
> </property>
> <property>
>   <name>hive.server2.enable.impersonation</name>
>   <description>Enable user impersonation for HiveServer2</description>
>   <value>true</value>
> </property>
> 
> <property>
>   <name>hive.metastore.sasl.enabled</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.metastore.kerberos.keytab.file</name>
>   <value>***</value>
> </property>
> <property>
>   <name>hive.metastore.kerberos.principal</name>
>   <value>***</value>
> </property>
> <property>
>   <name>hive.metastore.cache.pinobjtypes</name>
>   <value>Table,Database,Type,FieldSchema,Order</value>
> </property>
> <property>
>   <name>hdfs_sentinel_file</name>
>   <value>***</value>
> </property>
> <property>
>   <name>hive.metastore.warehouse.dir</name>
>   <value>/hive</value>
> </property>
> <property>
>   <name>hive.metastore.client.socket.timeout</name>
>   <value>600</value>
> </property>
> <property>
>   <name>hive.warehouse.subdir.inherit.perms</name>
>   <value>true</value>
> </property>
> {code}
> Here, I'm attaching more detailed logs from Spark 1.3 rc1.
> {code}
> 2015-04-13 16:37:20,688 INFO  org.apache.hadoop.security.UserGroupInformation 
> (UserGroupInformation.java:loginUserFromKeytab(893

[jira] [Commented] (SPARK-6882) Spark ThriftServer2 Kerberos failed encountering java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

2015-07-09 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14620252#comment-14620252
 ] 

Ma Xiaoyu commented on SPARK-6882:
--

What we do is:
Apply this patch (and the related patches in JIRA) to Hive 0.13.1:
https://issues.apache.org/jira/browse/HIVE-6741
In spark-env.sh, point the classpath at the patched Hive lib jars. That makes the Spark 
Thrift server load Hive's classes before the ones bundled in the assembly jar.
With that in place, it should work.
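
For example, something like this in spark-env.sh (the /opt/hive-0.13.1-patched path is 
only an example location, and SPARK_CLASSPATH is the spark-env.sh hook available in these 
Spark versions):

{code}
# spark-env.sh: prepend the patched Hive 0.13.1 jars so they are loaded
# before the Hive classes bundled in the Spark assembly jar.
# /opt/hive-0.13.1-patched is an assumed install location; adjust to yours.
export SPARK_CLASSPATH="/opt/hive-0.13.1-patched/lib/*:${SPARK_CLASSPATH}"
{code}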


[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

2015-07-08 Thread Ma Xiaoyu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14618277#comment-14618277
 ] 

Ma Xiaoyu commented on SPARK-5159:
--

I was investigating this issue, and it seems the doAs handling in the HiveServer2 code is 
working. The problem is that when an event is forwarded to the DAGScheduler, it crosses 
into a different thread, and the ticket on the receiving thread is not the same as on the 
sending side: the proxy user becomes the real user who started the HiveServer2 service.
Is that the root cause?
If so, I can put together a patch.
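
A small standalone sketch of what I mean (illustrative only; the user name and the mini 
event loop are made up, not Spark code): work wrapped in doAs keeps the proxy user on the 
calling thread, but the same check posted to a thread started earlier reports the login 
(service) user.

{code}
import java.security.PrivilegedExceptionAction
import java.util.concurrent.LinkedBlockingQueue
import org.apache.hadoop.security.UserGroupInformation

object DoAsAcrossThreads {
  def main(args: Array[String]): Unit = {
    // A tiny "event loop": a thread started before any doAs, like the
    // DAGScheduler event loop that is created when the server starts.
    val events = new LinkedBlockingQueue[Runnable]()
    val loop = new Thread(new Runnable {
      override def run(): Unit = events.take().run()
    })
    loop.start()

    // "alice" stands in for the session user of an incoming query.
    val proxy = UserGroupInformation.createProxyUser(
      "alice", UserGroupInformation.getLoginUser)

    proxy.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit = {
        // On the calling thread the current user is the proxy user.
        println("caller : " + UserGroupInformation.getCurrentUser.getShortUserName)
        // The same check run on the pre-existing event-loop thread falls back
        // to the login (service) user, which is the behaviour described above.
        events.put(new Runnable {
          override def run(): Unit =
            println("handler: " + UserGroupInformation.getCurrentUser.getShortUserName)
        })
      }
    })
    loop.join()
  }
}
{code}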




