these errors.
So I want to make sure I followed the correct upgrade process, as below (I am
running Spark on a single machine in standalone mode, so no cluster
deployment):
- set SPARK_HOME to the new version
- run sbt assembly in SPARK_HOME to build the new Spark jars
- in the project sbt file, point the Spark dependency to the new
version?
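For reference, the last step above might look like this in the project's build file (the artifact name and version string here are assumptions for illustration; substitute the actual Spark release you are upgrading to):

```scala
// build.sbt -- hypothetical sketch of the dependency bump.
// "spark-core" and the version number are placeholders; use the
// module(s) and release your project actually depends on.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2"
```

After changing the version, running `sbt update` (or a clean rebuild) in the project should pull in the new jars.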
thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/correct-upgrade-process-tp11194p11213.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.