See the following announcement: http://search-hadoop.com/m/q3RTtojAyW1dabFk
On Wed, Sep 9, 2015 at 9:05 PM, Netwaver <wanglong_...@163.com> wrote:
> Hi Spark experts,
> I am trying to migrate my Spark cluster from 1.4.1 to the latest 1.5.0, but I hit the following error when running the start-all.sh script:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
> Could not find the main class: org.apache.spark.launcher.Main. Program will exit.
>
> I was able to migrate the same machines from Spark 1.3.1 to 1.4.1 without any problem, so I am wondering whether Spark 1.5.0 requires some special jars on the classpath.
> I am using JDK 1.6, and I don't know whether 1.6 is still supported by Spark 1.5.0. Any suggestion would be highly appreciated. Thank you all.
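For anyone hitting the same error: a quick way to rule out the JDK as the cause is to check the major version of the `java` on the PATH before launching Spark. This is an illustrative sketch only (the `sample` string and variable names are made up for demonstration, not taken from the thread); on a real host you would feed it the output of `java -version 2>&1` instead:

```shell
# Illustrative check: extract the major version from a `java -version`-style
# string. Replace the sample line with: java -version 2>&1 | head -1
sample='java version "1.6.0_45"'

# Pull out the digit after "1." (Java 6 reports itself as 1.6, Java 7 as 1.7, etc.)
major=$(echo "$sample" | sed -E 's/.*"1\.([0-9]+).*/\1/')

if [ "$major" -lt 7 ]; then
  echo "Java $major detected: likely too old for Spark 1.5.0"
fi
```

If the reported major version is below 7, upgrading the JDK (and making sure `JAVA_HOME` points at the new install before re-running start-all.sh) is the first thing to try.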