[ https://issues.apache.org/jira/browse/SPARK-1138?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14007837#comment-14007837 ]

Reynold Xin commented on SPARK-1138:
------------------------------------

Just want to chime in that I also encountered this stack trace, and the problem 
was an older Netty (specifically, Netty 3.4 on my classpath). Once I included 
Netty 3.6.6 instead, the problem went away.
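For reference, a minimal sketch of how pinning the newer Netty might look in an sbt build. The dependency coordinates below (Spark 0.9.0-incubating, `io.netty % netty % 3.6.6.Final`, and the CDH hadoop-client version) are illustrative assumptions, not taken from this thread — verify them against your own dependency tree (e.g. with a dependency-graph report) before relying on this:

```scala
// build.sbt sketch (assumed coordinates, for illustration only):
// force the newer Netty 3.6.6 and keep the transitive older Netty
// from the Hadoop client off the classpath.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating",
  // Pin Netty 3.6.6 explicitly; force() makes sbt prefer this version
  // over any transitive 3.4.x that another dependency pulls in.
  "io.netty" % "netty" % "3.6.6.Final" force(),
  // Alternatively, exclude the transitive Netty from the Hadoop client
  // (hypothetical CDH version shown) and rely on the pinned one above:
  "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.6.0" exclude("io.netty", "netty")
)
```

Note that very old Hadoop builds shipped Netty under the `org.jboss.netty` group id rather than `io.netty`, so the exclusion may need to name that group instead.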

> Spark 0.9.0 does not work with Hadoop / HDFS
> --------------------------------------------
>
>                 Key: SPARK-1138
>                 URL: https://issues.apache.org/jira/browse/SPARK-1138
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Sam Abeyratne
>
> UPDATE: This problem is certainly related to trying to use Spark 0.9.0 and 
> the latest Cloudera Hadoop / HDFS in the same jar. It seems no matter how I 
> fiddle with the deps, they do not play nicely together.
> I'm getting a java.util.concurrent.TimeoutException when trying to create a 
> SparkContext with 0.9. I cannot, whatever I do, change the timeout. I've 
> tried System.setProperty, the SparkConf mechanism of creating a 
> SparkContext, and -D flags when executing my jar. I seem to be able to 
> run simple jobs from the spark-shell OK, but my more complicated jobs require 
> external libraries, so I need to build jars and execute them.
> Some code that causes this:
>     println("Creating config")
>     val conf = new SparkConf()
>       .setMaster(clusterMaster)
>       .setAppName("MyApp")
>       .setSparkHome(sparkHome)
>       .set("spark.akka.askTimeout", parsed.getOrElse(timeouts, "100"))
>       .set("spark.akka.timeout", parsed.getOrElse(timeouts, "100"))
>     println("Creating sc")
>     implicit val sc = new SparkContext(conf)
> The output:
> Creating config
> Creating sc
> log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> [ERROR] [02/26/2014 11:05:25.491] [main] [Remoting] Remoting error: [Startup timed out] [
> akka.remote.RemoteTransportException: Startup timed out
>       at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
>       at akka.remote.Remoting.start(Remoting.scala:191)
>       at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>       at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>       at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>       at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>       at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>       at com.adbrain.accuracy.EvaluateAdtruthIDs$.main(EvaluateAdtruthIDs.scala:40)
>       at com.adbrain.accuracy.EvaluateAdtruthIDs.main(EvaluateAdtruthIDs.scala)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
>       at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>       at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>       at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>       at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>       at scala.concurrent.Await$.result(package.scala:107)
>       at akka.remote.Remoting.start(Remoting.scala:173)
>       ... 11 more
> ]
> Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
>       at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>       at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>       at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>       at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>       at scala.concurrent.Await$.result(package.scala:107)
>       at akka.remote.Remoting.start(Remoting.scala:173)
>       at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>       at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>       at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>       at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>       at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>       at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>       at com.adbrain.accuracy.EvaluateAdtruthIDs$.main(EvaluateAdtruthIDs.scala:40)
>       at com.adbrain.accuracy.EvaluateAdtruthIDs.main(EvaluateAdtruthIDs.scala)



--
This message was sent by Atlassian JIRA
(v6.2#6252)