[ https://issues.apache.org/jira/browse/SPARK-5493?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15095938#comment-15095938 ]
Hemshankar Sahu commented on SPARK-5493:
----------------------------------------

I have a few questions about running in client mode versus cluster mode. I am currently using a single-node Cloudera Hadoop cluster with Kerberos enabled.

In client mode I use the following commands:

    kinit
    spark-submit --master yarn-client --proxy-user cloudera examples/src/main/python/pi.py

This works fine.

In cluster mode I use the following command (no kinit done, and no TGT present in the cache):

    spark-submit --principal <myprinc> --keytab <KT location> --master yarn-cluster examples/src/main/python/pi.py

This also works fine. But when I use the following command in cluster mode (again with no kinit done and no TGT present in the cache):

    spark-submit --principal <myprinc> --keytab <KT location> --master yarn-cluster --proxy-user cloudera examples/src/main/python/pi.py

it throws the following error:

    No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

My understanding is that in cluster mode spark-submit does not look for a TGT on the client machine; it transfers the keytab file to the cluster and then starts the Spark job. So why does specifying the "--proxy-user" option look for a TGT when submitting in yarn-cluster mode? Am I doing something wrong?

> Support proxy users under kerberos
> ----------------------------------
>
>                 Key: SPARK-5493
>                 URL: https://issues.apache.org/jira/browse/SPARK-5493
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Brock Noland
>            Assignee: Marcelo Vanzin
>             Fix For: 1.3.0
>
>
> When using kerberos, services may want to use spark-submit to submit jobs as
> a separate user. For example a service like hive might want to submit jobs as
> a client user.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
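For reference, the three submission patterns from the comment above can be sketched as a single script. This is only an outline of what was reported, not a confirmed diagnosis: `<myprinc>` and the keytab path are placeholders carried over from the report, the pi.py path assumes you are running from the Spark distribution root, and the success/failure comments restate the reporter's observations on their Kerberized single-node cluster.

```shell
#!/bin/sh
# Sketch of the three spark-submit invocations described in the comment.
# Assumes a Kerberos-enabled YARN cluster and the Spark examples directory.

# 1) Client mode with a proxy user: needs a TGT in the local cache,
#    so kinit is run first. Reported to work.
kinit
spark-submit --master yarn-client --proxy-user cloudera \
    examples/src/main/python/pi.py

# 2) Cluster mode with principal + keytab, no local TGT:
#    spark-submit ships the keytab to the cluster. Reported to work.
spark-submit --principal <myprinc> --keytab <KT location> \
    --master yarn-cluster \
    examples/src/main/python/pi.py

# 3) Cluster mode combining principal/keytab WITH --proxy-user,
#    still with no local TGT. Reported to fail with:
#    "No valid credentials provided (Mechanism level: Failed to find
#     any Kerberos tgt)"
spark-submit --principal <myprinc> --keytab <KT location> \
    --master yarn-cluster --proxy-user cloudera \
    examples/src/main/python/pi.py
```

The pattern suggests that the proxy-user code path authenticates on the client side (where it expects a ticket cache), independently of the keytab that cluster mode ships to the ApplicationMaster; whether that is the intended behavior is exactly the reporter's question.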