You could try adding the netty-all jar
<http://mvnrepository.com/artifact/io.netty/netty-all/4.0.23.Final> to the
classpath. It looks like that jar is missing.
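If you are building with sbt, the dependency would look roughly like the
line below (a sketch only; the 4.0.23.Final version is taken from the link
above, so match it to whatever your Spark build actually expects):

    // build.sbt -- pull in netty-all so io.netty.util.TimerTask is on the classpath
    libraryDependencies += "io.netty" % "netty-all" % "4.0.23.Final"

The equivalent <dependency> entry in a Maven pom should work the same way.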

Thanks
Best Regards

On Thu, Dec 11, 2014 at 12:15 AM, S. Zhou <myx...@yahoo.com.invalid> wrote:

> Everything worked fine on Spark 1.1.0 until we upgraded to 1.1.1. For some
> of our unit tests we saw the following exceptions. Any idea how to solve
> it? Thanks!
>
> java.lang.NoClassDefFoundError: io/netty/util/TimerTask
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:72)
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
>         at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
>         at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:255)
>         at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:104)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
> .......
