Thank you, Ted. This does help.
One more question: if I want to migrate the JDK only for Spark on my cluster 
machines, where should I set the JAVA_HOME environment variable? Does 
conf/spark-env.sh support the JAVA_HOME environment variable?
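For example, I am planning to try something like the following in 
conf/spark-env.sh on each node; the JDK path below is only a placeholder 
for my machines:

    # conf/spark-env.sh is sourced by Spark's launch scripts on each node,
    # so an export here takes effect for everything started by start-all.sh.
    # Placeholder path -- substitute the actual JDK install location.
    export JAVA_HOME=/usr/java/jdk1.7.0_79

Thanks a lot.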

On 2015-09-10 12:45:43, "Ted Yu" <yuzhih...@gmail.com> wrote:

See the following announcement:

http://search-hadoop.com/m/q3RTtojAyW1dabFk

On Wed, Sep 9, 2015 at 9:05 PM, Netwaver <wanglong_...@163.com> wrote:

Hi Spark experts,
                         I am trying to migrate my Spark cluster from 1.4.1 to 
the latest 1.5.0, but I hit the error below when running the start-all.sh script.

                          Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
Could not find the main class: org.apache.spark.launcher.Main.  Program will exit.

                        I could easily migrate the Spark cluster from 1.3.1 to 
1.4.1 on the same machines before, so I am wondering whether Spark 1.5.0 
requires some special jars on the classpath?
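To rule out a missing jar, I checked that the launcher class is actually 
inside the assembly jar; the jar name below is from my download and may 
differ for other builds:

    # The launch scripts put the assembly jar on the launcher classpath;
    # confirm it exists and contains org.apache.spark.launcher.Main.
    ls -l lib/spark-assembly-*.jar
    unzip -l lib/spark-assembly-1.5.0-hadoop2.6.0.jar | grep launcher/Main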
       I am using JDK 1.6 and don't know whether 1.6 is still supported by 
Spark 1.5.0.
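In case it matters, this is how I am checking the JDK in use and the class 
file version that the launcher class was compiled for; again, the jar name 
is from my download:

    # Show which JDK this node resolves by default.
    java -version

    # Bytes 6-7 of a .class file hold its major version: a Java 6 class
    # prints '0 50', a Java 7 class '0 51'. JDK 1.6 cannot load classes
    # with major version 51.
    unzip -p lib/spark-assembly-1.5.0-hadoop2.6.0.jar \
        org/apache/spark/launcher/Main.class | od -An -j6 -N2 -tu1

Any suggestion will be highly appreciated. Thank you all.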