[ https://issues.apache.org/jira/browse/SPARK-6506?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14381860#comment-14381860 ]

Thomas Graves commented on SPARK-6506:
--------------------------------------

If you are running on YARN, you just have to set SPARK_HOME like this:
spark.yarn.appMasterEnv.SPARK_HOME /bogus
spark.executorEnv.SPARK_HOME /bogus
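Those two properties can also be passed on the command line with --conf instead of spark-defaults.conf. A minimal sketch (the script name is a placeholder; /bogus is just a dummy value, since SPARK_HOME only needs to be non-empty):

```shell
# Workaround sketch: give the YARN application master and the executors
# a dummy SPARK_HOME so the Python cluster-mode check passes.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.SPARK_HOME=/bogus \
  --conf spark.executorEnv.SPARK_HOME=/bogus \
  my_script.py
```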

But the error you pasted above isn't about that.  I've seen this when building 
the assembly with JDK 7 or JDK 8, due to the Python files not being packaged 
properly into the assembly jar.  I have to use JDK 6 to package it.  See 
https://issues.apache.org/jira/browse/SPARK-1920

> python support yarn cluster mode requires SPARK_HOME to be set
> --------------------------------------------------------------
>
>                 Key: SPARK-6506
>                 URL: https://issues.apache.org/jira/browse/SPARK-6506
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.3.0
>            Reporter: Thomas Graves
>
> We added support for python running in yarn cluster mode in 
> https://issues.apache.org/jira/browse/SPARK-5173, but it requires that 
> SPARK_HOME be set in the environment variables for application master and 
> executor.  It doesn't have to be set to anything real, but it fails if it's not 
> set.  See the command at the end of: https://github.com/apache/spark/pull/3976



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
