Hi,

I upgraded from 1.0 to 1.0.1 a couple of weeks ago and have been able to use
some of the features advertised in 1.0.1. However, I still get compilation
errors in some cases, even though, according to responses from other users,
those errors were fixed in 1.0.1, so I should not be seeing them. I want to
make sure I followed the correct upgrade process, which was as follows (I am
running Spark on a single machine in standalone mode, so no cluster
deployment):

- set SPARK_HOME to the new version

- run "sbt assembly" in SPARK_HOME to build the new Spark jars

- in the project's sbt build file, point the libraryDependencies for
spark-core and the other Spark libraries to the 1.0.1 version, then run "sbt
assembly" to build the project jar (sketch below).

Is there anything else I need to do to ensure that no old jars are being
used? For example, do I need to manually delete any old jars?

thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/correct-upgrade-process-tp11194.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
