[ https://issues.apache.org/jira/browse/SPARK-7504?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14536677#comment-14536677 ]

Apache Spark commented on SPARK-7504:
-------------------------------------

User 'ehnalis' has created a pull request for this issue:
https://github.com/apache/spark/pull/6029

> NullPointerException when initializing SparkContext in YARN-cluster mode
> ------------------------------------------------------------------------
>
>                 Key: SPARK-7504
>                 URL: https://issues.apache.org/jira/browse/SPARK-7504
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, YARN
>            Reporter: Zoltán Zvara
>              Labels: deployment, yarn, yarn-client
>
> It is not clear to most users that, when running Spark on YARN, a 
> {{SparkContext}} with a given execution plan can run locally in 
> {{yarn-client}} mode, but cannot deploy itself to the cluster. Deployment is 
> currently performed through {{org.apache.spark.deploy.yarn.Client}}. 
> {color:gray} I think we should support deployment through {{SparkContext}}, 
> but that is not the point I wish to make here. {color}
> Configuring a {{SparkContext}} to deploy itself currently yields an 
> {{ERROR}} when {{YarnClusterSchedulerBackend}} reads {{spark.yarn.app.id}}, 
> followed by a {{NullPointerException}} when the {{ApplicationMaster}} 
> instance is dereferenced.
> Spark should clearly inform the user that it may be running in 
> {{yarn-cluster}} mode without a proper submission through {{Client}}, and 
> that deploying an application directly from a {{SparkContext}} is not 
> supported.


