It is 1.6.0, built from source. I'm trying it in my Eclipse project and want to use Spark there, so I put the libraries on the classpath and get no ClassNotFoundException:
akka-actor_2.10-2.3.11.jar
akka-remote_2.10-2.3.11.jar
akka-slf4j_2.10-2.3.11.jar
config-1.2.1.jar
hadoop-auth-2.7.1.jar
hadoop-common-2.7.1.jar
hadoop-mapreduce-client-common-2.7.1.jar
hadoop-mapreduce-client-core-2.7.1.jar
hadoop-mapreduce-client-jobclient-2.7.1.jar
hadoop-mapreduce-client-shuffle-2.7.1.jar
hadoop-yarn-api-2.7.1.jar
hadoop-yarn-common-2.7.1.jar
hamcrest-core-1.3.jar
netty-all-4.0.29.Final.jar
scala-library-2.10.6.jar
spark-core_2.10-1.6.0.jar
spark-mllib_2.10-1.6.0.jar
spark-network-common_2.10-1.6.0.jar
spark-network-shuffle_2.10-1.6.0.jar
spark-sql_2.10-1.6.0.jar
spark-streaming_2.10-1.6.0.jar

> On 4 Feb 2016, at 13:51, Ted Yu <yuzhih...@gmail.com> wrote:
>
> Which Spark release are you using?
>
> Is there any other clue in the logs? If so, please pastebin it.
>
> Cheers
>
> On Thu, Feb 4, 2016 at 2:49 AM, Valentin Popov <valentin...@gmail.com> wrote:
> Hi all,
>
> I'm trying to run Spark in local mode, using code like this:
>
>     SparkConf conf = new SparkConf().setAppName("JavaWord2VecExample").setMaster("local[*]");
>     JavaSparkContext sc = new JavaSparkContext(conf);
>
> but after a while (10 sec) I get an exception. Here is the stack trace:
>
> java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
>     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>     at scala.concurrent.Await$.result(package.scala:107)
>     at akka.remote.Remoting.start(Remoting.scala:179)
>     at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>     at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
>     at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
>     at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
>     at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
>     at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
>     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
>     at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
>     at com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:497)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>     at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
>
> Does anyone know of library dependencies that can cause such an error?
>
> Regards,
> Valentin

Regards,
Valentin Popov
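For future readers of this thread: a timeout inside akka.remote.Remoting.start while the SparkContext is being created in local mode is frequently a host-resolution problem rather than a missing jar, i.e. the JVM cannot map the machine's hostname to a reachable address. This JDK-only sketch (the `HostCheck` class name is mine, not from the thread) checks what your JVM resolves before involving Spark at all:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostCheck {
    // Returns the address the JVM resolves for this machine's hostname,
    // or null when resolution fails -- a condition that can stall
    // Akka remoting during SparkContext startup.
    static String localAddress() {
        try {
            return InetAddress.getLocalHost().getHostAddress();
        } catch (UnknownHostException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println("hostname resolves to: " + localAddress());
    }
}
```

If this prints null or an address that is not reachable from the machine itself, a commonly suggested workaround is to pin the driver address before creating the context, e.g. `conf.set("spark.driver.host", "127.0.0.1")` in the code above, or exporting `SPARK_LOCAL_IP=127.0.0.1` in the environment the test runs in. Whether that fixes this particular setup is an assumption; mismatched Akka/Netty jar versions on the classpath are another candidate worth ruling out.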