[ https://issues.apache.org/jira/browse/SPARK-12111?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Davidson reopened SPARK-12111:
-------------------------------------

Hi Sean,

It must be possible for customers to upgrade installations. Given that Spark is written in Java, it is probably a matter of replacing jar files and maybe making a few changes to config files. Whoever is responsible for the build/release of Spark can probably write down the instructions.

It's not reasonable to say "destroy your old cluster and re-install it." In my experience Spark does not work out of the box; you have to do a lot of work to configure it properly. I also have a lot of data on HDFS that I cannot simply move.

Sincerely yours,
Andy

> need upgrade instructions
> -------------------------
>
>                 Key: SPARK-12111
>                 URL: https://issues.apache.org/jira/browse/SPARK-12111
>             Project: Spark
>          Issue Type: Documentation
>          Components: EC2
>    Affects Versions: 1.5.1
>            Reporter: Andrew Davidson
>              Labels: build, documentation
>
> I have looked all over the Spark website and googled, but I have not found
> instructions for how to upgrade Spark in general, let alone a cluster created
> by the spark-ec2 script.
>
> Thanks.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
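The comment suggests an upgrade is "probably a matter of replacing jar files and maybe making a few changes to config files." One common way to do that is the "symlink switch" pattern: unpack the new release next to the old one, carry the site config over, and repoint a stable symlink. The sketch below demonstrates that pattern in a throwaway temp directory; the paths, versions, and `spark-env.sh` contents are illustrative assumptions, not an official Spark upgrade procedure.

```shell
#!/bin/sh
# Hedged sketch of a "symlink switch" upgrade, demonstrated in a temp dir.
# All paths/versions below are illustrative, not an official Spark procedure.
set -eu
ROOT=$(mktemp -d)

# Existing install: spark-1.5.1 with a site config, reached via a stable symlink.
mkdir -p "$ROOT/spark-1.5.1/conf"
echo "SPARK_MASTER_IP=10.0.0.1" > "$ROOT/spark-1.5.1/conf/spark-env.sh"
ln -s "$ROOT/spark-1.5.1" "$ROOT/spark"

# "Upgrade": unpack the new release next to the old one (here just a mkdir
# standing in for extracting the 1.5.2 tarball), copy the old config across,
# then repoint the symlink. ln's -n flag (GNU/BSD) replaces the link itself
# rather than creating a link inside the directory it points to.
mkdir -p "$ROOT/spark-1.5.2"
cp -r "$ROOT/spark/conf" "$ROOT/spark-1.5.2/conf"
ln -sfn "$ROOT/spark-1.5.2" "$ROOT/spark"

# The stable path now resolves to the new version, with the old config intact.
readlink "$ROOT/spark"
cat "$ROOT/spark/conf/spark-env.sh"
```

The advantage of this layout is that every version stays on disk side by side, so rolling back is a single `ln -sfn` in the other direction. It says nothing about HDFS data, which lives outside the Spark install tree and is untouched by swapping the binaries.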