I think I found the answer; the debug log below shows:

HADOOP_HOME or hadoop.home.dir are not set.

Sorry for the noise; full debug log follows.
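For what it's worth, a minimal sketch of a workaround: Hadoop's Shell class only checks the HADOOP_HOME environment variable and the hadoop.home.dir system property, so setting the property before the first Hadoop class loads silences the IOException. The path below is a placeholder, not taken from the log; substitute your own Hadoop install location. (In local mode the warning appears to be harmless either way.)

```java
// Hypothetical workaround: set hadoop.home.dir before any Hadoop class
// is loaded, so Shell.checkHadoopHome() finds a home directory.
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Example path only -- point this at a real Hadoop distribution.
        System.setProperty("hadoop.home.dir", "/usr/local/hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
        // ... create the JavaSparkContext after this point ...
    }
}
```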


2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] Failed to detect a valid hadoop home 
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:303) 
[hadoop-common-2.7.1.jar:na]
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:328) 
[hadoop-common-2.7.1.jar:na]
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80) 
[hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:610)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
 [hadoop-common-2.7.1.jar:na]
        at 
org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2136)
 [spark-core_2.10-1.6.0.jar:1.6.0]
        at 
org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2136)
 [spark-core_2.10-1.6.0.jar:1.6.0]
        at scala.Option.getOrElse(Option.scala:120) 
[scala-library-2.10.6.jar:na]
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2136) 
[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:322) 
[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59) 
[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23) 
[classes/:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_71]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_71]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_71]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
 [junit-4.12.jar:4.12]
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 [junit-4.12.jar:4.12]
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
 [junit-4.12.jar:4.12]
        at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 [junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325) 
[junit-4.12.jar:4.12]
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
 [junit-4.12.jar:4.12]
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
 [junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363) 
[junit-4.12.jar:4.12]
        at 
org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) 
[.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
 [.cp/:na]
2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] setsid is not available on this 
machine. So not using it.
2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] setsid exited with exit code 0
2016-02-04 14:10:08 o.a.h.s.a.u.KerberosName [DEBUG] Kerberos krb5 
configuration not found, setting default realm to empty
2016-02-04 14:10:08 o.a.h.s.Groups [DEBUG]  Creating new Groups object
2016-02-04 14:10:08 o.a.h.u.NativeCodeLoader [DEBUG] Trying to load the 
custom-built native-hadoop library...
2016-02-04 14:10:08 o.a.h.u.NativeCodeLoader [DEBUG] Failed to load 
native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in 
java.library.path
2016-02-04 14:10:08 o.a.h.u.NativeCodeLoader [DEBUG] 
java.library.path=/Users/valenpo/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2016-02-04 14:10:08 o.a.h.u.NativeCodeLoader [WARN] Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
2016-02-04 14:10:08 o.a.h.u.PerformanceAdvisory [DEBUG] Falling back to shell 
based
2016-02-04 14:10:08 o.a.h.s.JniBasedUnixGroupsMappingWithFallback [DEBUG] Group 
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2016-02-04 14:10:08 o.a.h.s.Groups [DEBUG] Group mapping 
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; 
cacheTimeout=300000; warningDeltaMs=5000
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] hadoop login
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] hadoop login commit
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] using local 
user:UnixPrincipal: valenpo
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] Using user: 
"UnixPrincipal: valenpo" with name valenpo
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] User entry: "valenpo"
2016-02-04 14:10:08 o.a.h.s.UserGroupInformation [DEBUG] UGI loginUser:valenpo 
(auth:SIMPLE)
2016-02-04 14:10:08 o.a.s.SecurityManager [INFO] Changing view acls to: valenpo
2016-02-04 14:10:08 o.a.s.SecurityManager [INFO] Changing modify acls to: 
valenpo
2016-02-04 14:10:08 o.a.s.SecurityManager [INFO] SecurityManager: 
authentication disabled; ui acls disabled; users with view permissions: 
Set(valenpo); users with modify permissions: Set(valenpo)
2016-02-04 14:10:08 o.a.s.SSLOptions [DEBUG] No SSL protocol specified
2016-02-04 14:10:08 o.a.s.SSLOptions [DEBUG] No SSL protocol specified
2016-02-04 14:10:08 o.a.s.SSLOptions [DEBUG] No SSL protocol specified
2016-02-04 14:10:08 o.a.s.SecurityManager [DEBUG] SSLConfiguration for file 
server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, 
trustStore=None, trustStorePassword=None, protocol=None, 
enabledAlgorithms=Set()}
2016-02-04 14:10:08 o.a.s.SecurityManager [DEBUG] SSLConfiguration for Akka: 
SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, 
trustStore=None, trustStorePassword=None, protocol=None, 
enabledAlgorithms=Set()}
2016-02-04 14:10:09 i.n.u.i.l.InternalLoggerFactory [DEBUG] Using SLF4J as the 
default logging framework
2016-02-04 14:10:09 i.n.u.i.PlatformDependent0 [DEBUG] java.nio.Buffer.address: 
available
2016-02-04 14:10:09 i.n.u.i.PlatformDependent0 [DEBUG] 
sun.misc.Unsafe.theUnsafe: available
2016-02-04 14:10:09 i.n.u.i.PlatformDependent0 [DEBUG] 
sun.misc.Unsafe.copyMemory: available
2016-02-04 14:10:09 i.n.u.i.PlatformDependent0 [DEBUG] java.nio.Bits.unaligned: 
true
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] Java version: 8
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] -Dio.netty.noUnsafe: false
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] sun.misc.Unsafe: available
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] -Dio.netty.noJavassist: 
false
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] Javassist: available
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] -Dio.netty.tmpdir: 
/var/folders/l9/j5mvx5v13cq5586rggc_hngc0000gn/T (java.io.tmpdir)
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] -Dio.netty.bitMode: 64 
(sun.arch.data.model)
2016-02-04 14:10:09 i.n.u.i.PlatformDependent [DEBUG] 
-Dio.netty.noPreferDirect: false
2016-02-04 14:10:09 i.n.u.i.JavassistTypeParameterMatcherGenerator [DEBUG] 
Generated: 
io.netty.util.internal.__matchers__.org.apache.spark.network.protocol.MessageMatcher
2016-02-04 14:10:09 i.n.u.i.JavassistTypeParameterMatcherGenerator [DEBUG] 
Generated: io.netty.util.internal.__matchers__.io.netty.buffer.ByteBufMatcher
2016-02-04 14:10:09 i.n.c.MultithreadEventLoopGroup [DEBUG] 
-Dio.netty.eventLoopThreads: 16
2016-02-04 14:10:09 i.n.c.n.NioEventLoop [DEBUG] 
-Dio.netty.noKeySetOptimization: false
2016-02-04 14:10:09 i.n.c.n.NioEventLoop [DEBUG] 
-Dio.netty.selectorAutoRebuildThreshold: 512
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.numHeapArenas: 16
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.numDirectArenas: 16
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.pageSize: 8192
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.maxOrder: 11
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.chunkSize: 16777216
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.tinyCacheSize: 512
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.smallCacheSize: 256
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.normalCacheSize: 64
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.maxCachedBufferCapacity: 32768
2016-02-04 14:10:09 i.n.b.PooledByteBufAllocator [DEBUG] 
-Dio.netty.allocator.cacheTrimInterval: 8192
2016-02-04 14:10:09 i.n.u.i.ThreadLocalRandom [DEBUG] 
-Dio.netty.initialSeedUniquifier: 0x28216e9490a50a13 (took 0 ms)
2016-02-04 14:10:09 i.n.b.ByteBufUtil [DEBUG] -Dio.netty.allocator.type: 
unpooled
2016-02-04 14:10:09 i.n.b.ByteBufUtil [DEBUG] 
-Dio.netty.threadLocalDirectBufferSize: 65536
2016-02-04 14:10:09 i.n.u.NetUtil [DEBUG] Loopback interface: lo0 (lo0, 
0:0:0:0:0:0:0:1)
2016-02-04 14:10:09 i.n.u.NetUtil [DEBUG] /proc/sys/net/core/somaxconn: 128 
(non-existent)
2016-02-04 14:10:09 o.a.s.n.s.TransportServer [DEBUG] Shuffle server started on 
port :54363
2016-02-04 14:10:09 o.a.s.u.Utils [INFO] Successfully started service 
'sparkDriver' on port 54363.
2016-02-04 14:10:09 o.a.s.u.AkkaUtils [DEBUG] In createActorSystem, 
requireCookie is: off
2016-02-04 14:10:09 a.e.s.Slf4jLogger [INFO] Slf4jLogger started
2016-02-04 14:10:09 Remoting [INFO] Starting remoting
2016-02-04 14:10:09 a.a.ActorSystemImpl [ERROR] Uncaught fatal error from 
thread [sparkDriverActorSystem-akka.remote.default-remote-dispatcher-5] 
shutting down ActorSystem [sparkDriverActorSystem]
java.lang.NoClassDefFoundError: org/jboss/netty/util/Timer
        at java.lang.Class.getDeclaredConstructors0(Native Method) 
~[na:1.8.0_71]
        at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671) 
~[na:1.8.0_71]
        at java.lang.Class.getConstructor0(Class.java:3075) ~[na:1.8.0_71]
        at java.lang.Class.getDeclaredConstructor(Class.java:2178) 
~[na:1.8.0_71]
        at 
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:76)
 ~[akka-actor_2.10-2.3.11.jar:na]
        at scala.util.Try$.apply(Try.scala:161) ~[scala-library-2.10.6.jar:na]
        at 
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73) 
~[akka-actor_2.10-2.3.11.jar:na]
        at 
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
 ~[akka-actor_2.10-2.3.11.jar:na]
        at 
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
 ~[akka-actor_2.10-2.3.11.jar:na]
        at scala.util.Success.flatMap(Try.scala:200) 
~[scala-library-2.10.6.jar:na]
        at 
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:711) 
~[akka-remote_2.10-2.3.11.jar:na]
        at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:703) 
~[akka-remote_2.10-2.3.11.jar:na]
        at 
scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
 ~[scala-library-2.10.6.jar:na]
        at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
~[scala-library-2.10.6.jar:na]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
~[scala-library-2.10.6.jar:na]
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) 
~[scala-library-2.10.6.jar:na]
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
~[scala-library-2.10.6.jar:na]
        at 
scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721) 
~[scala-library-2.10.6.jar:na]
        at 
akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:703)
 ~[akka-remote_2.10-2.3.11.jar:na]
        at 
akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:491) 
~[akka-remote_2.10-2.3.11.jar:na]
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.remote.EndpointManager.aroundReceive(Remoting.scala:394) 
~[akka-remote_2.10-2.3.11.jar:na]
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
[akka-actor_2.10-2.3.11.jar:na]
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
[akka-actor_2.10-2.3.11.jar:na]
        at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
[akka-actor_2.10-2.3.11.jar:na]
        at 
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
 [akka-actor_2.10-2.3.11.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
[scala-library-2.10.6.jar:na]
        at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
 [scala-library-2.10.6.jar:na]
        at 
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
[scala-library-2.10.6.jar:na]
        at 
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
 [scala-library-2.10.6.jar:na]
Caused by: java.lang.ClassNotFoundException: org.jboss.netty.util.Timer
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
~[na:1.8.0_71]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_71]
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) 
~[na:1.8.0_71]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_71]
        ... 32 common frames omitted
2016-02-04 14:10:09 a.r.RemoteActorRefProvider$RemotingTerminator [INFO] 
Shutting down remote daemon.
2016-02-04 14:10:09 a.r.RemoteActorRefProvider$RemotingTerminator [INFO] Remote 
daemon shut down; proceeding with flushing remote transports.
2016-02-04 14:10:09 Remoting [ERROR] Remoting system has been terminated 
abrubtly. Attempting to shut down transports
2016-02-04 14:10:09 a.r.RemoteActorRefProvider$RemotingTerminator [INFO] 
Remoting shut down.
2016-02-04 14:10:19 o.a.s.SparkContext [ERROR] Error initializing SparkContext.
java.util.concurrent.TimeoutException: Futures timed out after [10000 
milliseconds]
        at 
scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219) 
~[scala-library-2.10.6.jar:na]
        at 
scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223) 
~[scala-library-2.10.6.jar:na]
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107) 
~[scala-library-2.10.6.jar:na]
        at 
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
 ~[scala-library-2.10.6.jar:na]
        at scala.concurrent.Await$.result(package.scala:107) 
~[scala-library-2.10.6.jar:na]
        at akka.remote.Remoting.start(Remoting.scala:179) 
~[akka-remote_2.10-2.3.11.jar:na]
        at 
akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184) 
~[akka-remote_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:142) 
~[akka-actor_2.10-2.3.11.jar:na]
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:119) 
~[akka-actor_2.10-2.3.11.jar:na]
        at 
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
 ~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
 ~[spark-core_2.10-1.6.0.jar:1.6.0]
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
~[scala-library-2.10.6.jar:na]
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:457) 
~[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59) 
[spark-core_2.10-1.6.0.jar:1.6.0]
        at 
com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23) 
[classes/:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_71]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_71]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_71]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
 [junit-4.12.jar:4.12]
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 [junit-4.12.jar:4.12]
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
 [junit-4.12.jar:4.12]
        at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 [junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325) 
[junit-4.12.jar:4.12]
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
 [junit-4.12.jar:4.12]
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
 [junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268) 
[junit-4.12.jar:4.12]
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363) 
[junit-4.12.jar:4.12]
        at 
org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) 
[.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
 [.cp/:na]
        at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
 [.cp/:na]
2016-02-04 14:10:19 o.a.s.SparkContext [INFO] Successfully stopped SparkContext
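Note that the fatal entry in the log above is not the HADOOP_HOME warning but the NoClassDefFoundError for org.jboss.netty.util.Timer: that package belongs to Netty 3.x, which Akka remoting needs, so it looks like Netty 3 is missing from (or excluded on) the classpath, and the later 10-second TimeoutException is just the ActorSystem failing to start. A quick probe to confirm, assuming nothing about the build beyond the class name from the stack trace:

```java
// Classpath probe: checks whether the Netty 3.x class that Akka
// remoting failed to load (per the stack trace above) is present.
public class NettyProbe {
    public static void main(String[] args) {
        try {
            Class.forName("org.jboss.netty.util.Timer");
            System.out.println("Netty 3.x present");
        } catch (ClassNotFoundException e) {
            System.out.println("Netty 3.x missing");
        }
    }
}
```

If it prints "Netty 3.x missing", adding the Netty 3 artifact back (or removing whatever exclusion dropped it) should let remoting start.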

> On Feb 4, 2016, at 13:51, Ted Yu <yuzhih...@gmail.com> wrote:
> 
> Which Spark release are you using ?
> 
> Is there other clue from the logs ? If so, please pastebin.
> 
> Cheers
> 
> On Thu, Feb 4, 2016 at 2:49 AM, Valentin Popov <valentin...@gmail.com 
> <mailto:valentin...@gmail.com>> wrote:
> Hi all, 
> 
> I’m trying to run Spark in local mode, using this code:
> 
> SparkConf conf = new 
> SparkConf().setAppName("JavaWord2VecExample").setMaster("local[*]");
> JavaSparkContext sc = new JavaSparkContext(conf);
> 
> but after a while (10 sec) I get an exception; here is the stack trace:
> java.util.concurrent.TimeoutException: Futures timed out after [10000 
> milliseconds]
>       at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>       at 
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>       at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>       at 
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>       at scala.concurrent.Await$.result(package.scala:107)
>       at akka.remote.Remoting.start(Remoting.scala:179)
>       at 
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>       at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
>       at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
>       at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
>       at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
>       at 
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>       at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>       at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
>       at 
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
>       at 
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
>       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
>       at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
>       at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
>       at 
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
>       at 
> com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>       at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>       at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>       at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>       at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>       at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>       at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>       at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>       at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>       at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>       at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>       at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>       at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>       at 
> org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
>       at 
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>       at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
>       at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
>       at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
>       at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
> 
> 
> 
> Anyone know which library dependencies can cause such an error?
> 
> Regards,
> Valentin
> 
> 
> 
> 
> 


Best regards,
Valentin Popov




