I am using Spark 1.1.1. When I ran "sbt test", I hit the following
exception. Any idea how to solve it? Thanks! I think somebody posted
this question before, but no one seemed to have answered it. Could it
be the version of "io.netty" I put in my build.sbt? I included the
dependency libraryDependencies += "io.netty" % "netty" % "3.6.6.Final"
in my build.sbt file.
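One thing I noticed while looking into this (just a guess on my part, not verified against this build): the io.netty:netty 3.x artifact ships its classes under the org.jboss.netty.* package, so it cannot provide io/netty/util/TimerTask at all; that class only exists in the Netty 4 netty-all artifact, which Spark depends on itself. If the 3.x line is interfering with Spark's own Netty 4, a build.sbt along these lines might work instead (the 4.x version number below is a placeholder; it should match whatever your Spark 1.1.1 build actually resolves):

```scala
// build.sbt (sketch) -- let Spark pull in its own Netty 4, and only add
// Netty explicitly if the project uses it directly.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"

// If Netty is needed directly, depend on the Netty 4 artifact, which
// actually contains io.netty.util.TimerTask (version is a placeholder):
libraryDependencies += "io.netty" % "netty-all" % "4.0.23.Final"
```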

java.lang.NoClassDefFoundError: io/netty/util/TimerTask
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:72)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
        at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
        at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:255)
        at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:104)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
        ...
