From: Williams, Ken <ken.willi...@windlogics.com>
Date: Thursday, March 19, 2015 at 10:59 AM
To: Spark list <user@spark.apache.org>
Subject: JAVA_HOME problem with upgrade to 1.3.0
[…]
Finally, I go and check the YARN […]
I’m trying to upgrade a Spark project, written in Scala, from Spark 1.2.1 to
1.3.0, so I changed my `build.sbt` like so:
-libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"
+libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
then make an […]
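One quick way to confirm the dependency change actually took effect is to ask sbt for the resolved setting (a sketch, run from the project root; not a step from the thread itself):

```shell
# Sketch: print the declared dependencies and check for spark-core 1.3.0.
sbt "show libraryDependencies" | grep spark-core
```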
From: Ted Yu <yuzhih...@gmail.com>
Date: Thursday, March 19, 2015 at 11:05 AM

JAVA_HOME, an environment variable, should be defined on the node where
appattempt_1420225286501_4699_02 ran.
Cheers
On Thu, Mar 19, 2015 at 8:59 AM, Williams, Ken <ken.willi...@windlogics.com>
wrote:
I’m trying to upgrade a Spark project, written in Scala, from Spark
1.2.1 to 1.3.0, so I
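The node that ran the failed attempt can usually be identified from the aggregated container logs, assuming log aggregation is enabled on the cluster. A sketch, with the application id inferred from the attempt id quoted in this thread:

```shell
# Sketch: fetch the aggregated container logs for the failed application
# (application id derived from appattempt_1420225286501_4699_02).
yarn logs -applicationId application_1420225286501_4699
```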
From: Ted Yu <yuzhih...@gmail.com>
Date: Thursday, March 19, 2015 at 11:05 AM
JAVA_HOME, an environment variable, should be defined on the node where
appattempt_1420225286501_4699_02 ran.
Has this behavior changed in 1.3.0 since 1.2.1 though? Using 1.2.1 and […]
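Not from the thread, but one way to rule out the per-node environment: Spark on YARN exposes `spark.yarn.appMasterEnv.<VAR>` and `spark.executorEnv.<VAR>` properties that forward an explicit JAVA_HOME to the containers. A sketch, in which the class name, assembly jar name, and JDK path are all placeholder assumptions:

```shell
# Sketch: forward an explicit JAVA_HOME to the YARN app master and executors.
# com.example.Main, myapp-assembly.jar, and /usr/java/default are placeholders.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/default \
  --conf spark.executorEnv.JAVA_HOME=/usr/java/default \
  --class com.example.Main \
  myapp-assembly.jar
```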