[ 
https://issues.apache.org/jira/browse/SPARK-12111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15036887#comment-15036887
 ] 

Andrew Davidson commented on SPARK-12111:
-----------------------------------------

Hi Sean

I understand I will need to stop my cluster to change to a different version. I 
am looking for directions for how to "change to a different version". E.g. on my 
local Mac I have several different versions of Spark downloaded, and I have an 
env var SPARK_ROOT=pathToVersion pointing at the one I want to use.

To use something like pyspark I would run
$ $SPARK_ROOT/bin/pyspark

I am looking for directions for how to do something similar in a cluster env. I 
think the rough steps would be

1) stop the cluster
2) download the binary. Is the binary the same on all the machines (i.e. 
masters and slaves)?
3) I am not sure what to do about the conf/* files
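A rough sketch of what those steps might look like with the spark-ec2 script. This is only a guess at the procedure, not documented instructions: the cluster name, keypair, and Spark version below are placeholders, and the assumption that the old conf/ files can simply be copied into the new tree is unverified.

```shell
# 1) stop the cluster (spark-ec2 supports stop/start actions)
./spark-ec2 -k my-keypair -i my-keypair.pem stop my-cluster

# 2) download the new prebuilt binary; the same tarball should work on
#    masters and slaves, since the prebuilt binaries are not machine-specific
wget https://archive.apache.org/dist/spark/spark-1.5.2/spark-1.5.2-bin-hadoop2.6.tgz
tar xzf spark-1.5.2-bin-hadoop2.6.tgz

# 3) carry over the old conf/ files (assumption: they are still compatible),
#    then point SPARK_ROOT at the new tree, as on the local Mac
cp /path/to/old/spark/conf/* spark-1.5.2-bin-hadoop2.6/conf/
export SPARK_ROOT=$PWD/spark-1.5.2-bin-hadoop2.6
$SPARK_ROOT/bin/pyspark
```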

 

> need upgrade instruction
> ------------------------
>
>                 Key: SPARK-12111
>                 URL: https://issues.apache.org/jira/browse/SPARK-12111
>             Project: Spark
>          Issue Type: Documentation
>          Components: EC2
>    Affects Versions: 1.5.1
>            Reporter: Andrew Davidson
>              Labels: build, documentation
>
> I have looked all over the Spark website and googled. I have not found 
> instructions for how to upgrade Spark in general, let alone a cluster created 
> using the spark-ec2 script.
> thanks.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
