[
https://issues.apache.org/jira/browse/SPARK-5493?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14582720#comment-14582720
]
Jesika Haria edited comment on SPARK-5493 at 6/12/15 12:20 AM:
---
I am trying to support impersonation with pyspark. It works when the --proxy-user
flag is set on the command line:
{code}
pyspark --master yarn-client --proxy-user foo
{code}
However, I actually need to set up the SparkContext programmatically via the
Python API, and I could find no documentation for this. Is pyspark impersonation
via --proxy-user even supported at this time? In the absence of this
functionality, what is the recommended way of supporting impersonation
(especially given that setting the HADOOP_PROXY_USER environment variable is
discouraged in production)?
Alternatively, if there is a Spark configuration property that corresponds to
the --proxy-user flag, that would work too (I cannot find one at
https://spark.apache.org/docs/latest/configuration.html).
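In the absence of a documented programmatic equivalent, one workaround is to launch the job from Python by shelling out to spark-submit with the --proxy-user flag. This is only a sketch, not an official API: the helper build_submit_command and the application name my_job.py are hypothetical names for illustration.

```python
# Sketch of a workaround: drive spark-submit from Python so that the
# --proxy-user flag (which works on the command line) can still be used,
# since no SparkConf property for it appears to be documented.
# build_submit_command and my_job.py are illustrative, not a Spark API.
import subprocess


def build_submit_command(app, proxy_user, master="yarn-client"):
    """Build a spark-submit invocation that impersonates proxy_user."""
    return [
        "spark-submit",
        "--master", master,
        "--proxy-user", proxy_user,
        app,
    ]


cmd = build_submit_command("my_job.py", "foo")
# The command list can then be run with subprocess, e.g.:
# subprocess.check_call(cmd)
```

This keeps impersonation in the hands of spark-submit (which handles the Kerberos proxy-user plumbing) rather than relying on the HADOOP_PROXY_USER environment variable.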
Support proxy users under kerberos
--
Key: SPARK-5493
URL: https://issues.apache.org/jira/browse/SPARK-5493
Project: Spark
Issue Type: Improvement
Components: Spark Core
Affects Versions: 1.2.0
Reporter: Brock Noland
Assignee: Marcelo Vanzin
Fix For: 1.3.0
When using Kerberos, services may want to use spark-submit to submit jobs as
a separate user. For example, a service like Hive might want to submit jobs as
a client user.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)