Re: Re: Failed when starting Spark 1.5.0 standalone cluster

2015-09-10 Thread Adam Hunt
You can add it to conf/spark-env.sh.

$ cat conf/spark-env.sh
#!/usr/bin/env bash
JAVA_HOME=/app/tools/jdk1.7
PATH=$JAVA_HOME/bin:$PATH
MESOS_NATIVE_JAVA_LIBRARY="/usr/lib/libmesos.so"
SPARK_CLASSPATH="/opt/mapr/hadoop/hadoop-0.20.2/lib/amazon-s3.jar"

On Wed, Sep 9, 2015 at 10:25 PM, Netwaver
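For reference, a quick way to confirm which JDK the Spark scripts will pick up is to source the file and ask that JDK for its version (a minimal sketch; the jdk1.7 path above is the poster's own, substitute your install):

$ source conf/spark-env.sh && "$JAVA_HOME/bin/java" -version   # should report 1.7 or newer

Note that Spark's bin/load-spark-env.sh sources conf/spark-env.sh with set -a, so the plain assignments above are exported to the launched JVM without explicit export statements.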

Re: Failed when starting Spark 1.5.0 standalone cluster

2015-09-09 Thread Ted Yu
See the following announcement: http://search-hadoop.com/m/q3RTtojAyW1dabFk

On Wed, Sep 9, 2015 at 9:05 PM, Netwaver wrote:
> Hi Spark experts,
> I am trying to migrate my Spark cluster from
> 1.4.1 to the latest 1.5.0, but hit the below issues when running

Failed when starting Spark 1.5.0 standalone cluster

2015-09-09 Thread Netwaver
Hi Spark experts, I am trying to migrate my Spark cluster from 1.4.1 to the latest 1.5.0, but hit the below issue when running the start-all.sh script.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by:
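Given the announcement Ted links above (Spark 1.5.0 dropped Java 6 support), this NoClassDefFoundError at launch is commonly a JDK-version mismatch: the start scripts pick up an old JVM that cannot load classes compiled for Java 7. A minimal pre-flight sketch to run on each node before start-all.sh (the assembly jar location assumes the prebuilt 1.5.0 binary distribution):

$ java -version                                            # must be 1.7 or newer for Spark 1.5.0
$ unzip -l lib/spark-assembly-*.jar | grep launcher/Main   # confirm the launcher class ships in the assembly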

Re: Re: Failed when starting Spark 1.5.0 standalone cluster

2015-09-09 Thread Netwaver
Thank you, Ted, this does help. One more question: if I just want to migrate the JDK only for Spark on my cluster machines, where can I add the JAVA_HOME environment variable? Does conf/spark-env.sh support the JAVA_HOME environment variable? Thanks a lot.

On 2015-09-10 12:45:43, "Ted Yu"
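Yes: as Adam Hunt's reply at the top of this thread shows, conf/spark-env.sh is the supported place for JAVA_HOME. One standalone-cluster detail worth noting: start-all.sh launches workers over ssh, and each node sources its own copy of conf/spark-env.sh, so the file (and the new JDK) must exist on every machine. A hedged one-liner to distribute it (host names are illustrative, and it assumes SPARK_HOME is the same path on all nodes):

$ for h in worker1 worker2; do scp conf/spark-env.sh $h:$SPARK_HOME/conf/; done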