Re: correct upgrade process

2014-08-01 Thread Matei Zaharia
This should be okay, but make sure that your cluster also has the right code 
deployed; it may still be running the old version.

If you built Spark from source multiple times, you may also want to try sbt 
clean before sbt assembly.

Matei

On August 1, 2014 at 12:00:07 PM, SK (skrishna...@gmail.com) wrote:


Hi, 

I upgraded from 1.0 to 1.0.1 a couple of weeks ago and have been able to use 
some of the features advertised in 1.0.1. However, in some cases I still get 
compilation errors that, according to responses on this list, were fixed in 
1.0.1, so I should not be seeing them. I therefore want to make sure I 
followed the correct upgrade process, which was as follows (I am running 
Spark on a single machine in standalone mode, so there is no cluster 
deployment): 

- set SPARK_HOME to the new version 

- run sbt assembly in SPARK_HOME to build the new Spark jars 

- in the project's sbt build file, point the libraryDependencies for spark-core 
and the other Spark libraries to the 1.0.1 version, and run sbt assembly to 
build the project jar (see the sketch below). 

Is there anything else I need to do to ensure that no old jars are being 
used? For example, do I need to manually delete any old jars? 

thanks 





Re: correct upgrade process

2014-08-01 Thread SK
Hi,

So I ran sbt clean again, followed by all of the steps listed above, to
rebuild the jars from a clean state. The compilation error still persists.
Specifically, I am trying to extract an element from the feature vector that
is part of a LabeledPoint, as follows:

data.features(i) 

This gives the following error:
method apply in trait Vector cannot be accessed in
org.apache.spark.mllib.linalg.Vector 

Based on a related post, this bug was fixed in version 1.0.1, so I am not
sure why I am still getting this error. 
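
For reference, a minimal, self-contained sketch of the accessor in question is
below. The values and variable names are made up for illustration, and the
toArray fallback is my own suggestion rather than something from this thread;
as far as I can tell it compiles against both 1.0.0 and 1.0.1:

  import org.apache.spark.mllib.linalg.Vectors
  import org.apache.spark.mllib.regression.LabeledPoint

  val point = LabeledPoint(1.0, Vectors.dense(0.5, 1.5, 2.5))

  // This is the accessor that fails against 1.0.0 jars:
  // Vector.apply(i) only became publicly accessible in 1.0.1.
  val second: Double = point.features(1)

  // Fallback that avoids Vector.apply entirely:
  val secondViaArray: Double = point.features.toArray(1)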

I noticed that sbt clean only removes the classes and jar files. However,
there is also a .ivy2 directory where dependencies get downloaded. That
directory does not seem to get cleaned, and I am not sure whether any old
dependencies from there are being used when sbt assembly is run. Do I need
to manually remove that directory before running sbt clean and rebuilding
the jars for the new version?
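
As an illustrative aside (this is my own suggestion, not something from the
thread): one way to rule out a stale jar, independently of what is sitting in
.ivy2, is to print which jar a Spark class was actually loaded from at
runtime. A rough sketch, using plain JVM reflection:

  import org.apache.spark.mllib.linalg.Vector

  object WhichJar {
    def main(args: Array[String]): Unit = {
      // Prints the jar that provided the mllib Vector trait at runtime.
      val source = classOf[Vector].getProtectionDomain.getCodeSource
      val location = if (source == null) "(no code source)" else source.getLocation.toString
      // Expect this to point at the freshly built 1.0.1 assembly, not an old jar.
      println("Vector loaded from: " + location)
    }
  }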

thanks


