[ https://issues.apache.org/jira/browse/SPARK-5510?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

hash-x updated SPARK-5510:
--------------------------
    Comment: was deleted

(was: Mailing list? What is the site, could you give me a link? OK, thank you! I
am a beginner at Spark and Scala.)

> How can I fix the spark-submit script so that I can run the program on a cluster?
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-5510
>                 URL: https://issues.apache.org/jira/browse/SPARK-5510
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.0.2
>            Reporter: hash-x
>              Labels: Help!!!!!!!!!!, spark-submit
>
> Reference: My question is: how can I fix the script so that I can submit the program
> to the Master from my laptop, rather than submitting it from a cluster node? Submitting
> the program from Node 2 works for me, but submitting from the laptop does not. How can I
> fix this? Help!
> I have read the email below and accept recommendation 1 (run spark-shell from a cluster
> node), but I would like to solve the problem using recommendation 2, and I am confused
> about how to do it.
> Hi Ken,
> This is unfortunately a limitation of spark-shell and the way it works in standalone mode.
> spark-shell sets an environment variable, SPARK_HOME, which tells Spark where to find its
> code installed on the cluster. This means that the path on your laptop must be the same as
> on the cluster, which is not the case. I recommend one of two things:
> 1) Either run spark-shell from a cluster node, where it will have the right path. (In general
> it's also better for performance to have it close to the cluster.)
> 2) Or, edit the spark-shell script and re-export SPARK_HOME right before it runs the Java
> command (ugly, but it will probably work); see the sketch below.
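> A minimal, illustrative sketch of what option 2 could look like. The install path
> /opt/spark and the exact spot in bin/spark-shell where the edit goes are assumptions
> for illustration, not the script's real contents:
>
>     # hypothetical edit near the end of bin/spark-shell, just before the script's
>     # original Java/launcher command runs on the laptop:
>     export SPARK_HOME=/opt/spark   # assumed: where Spark is installed on the cluster nodes
>     # ... the script's original Java command follows unchanged and now inherits
>     # the cluster-side SPARK_HOME instead of the laptop's local path
>
> With that re-export in place, the path the driver hands to the standalone workers should
> match the layout on the cluster nodes rather than the laptop's local install.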



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
