Most of the time a NoSuchMethodError points to a classpath problem: some jar is being overridden by the wrong version of the same library. In your case it is probably Netty.
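You can check which Netty versions end up on the classpath with Maven's dependency tree:

    mvn dependency:tree | grep -i netty

If two versions show up, exclude the stale one from whichever dependency drags it in. A minimal sketch (untested; it assumes an old Netty 3.2.x arrives transitively through spark-avro under the org.jboss.netty group id — adjust to whatever the tree actually shows):

    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-avro_2.10</artifactId>
        <version>0.1</version>
        <exclusions>
            <!-- assumption: an old Netty 3.2.x is pulled in here;
                 excluding it lets Akka's newer Netty win -->
            <exclusion>
                <groupId>org.jboss.netty</groupId>
                <artifactId>netty</artifactId>
            </exclusion>
        </exclusions>
    </dependency>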

On 1/3/15 1:36 PM, Niranda Perera wrote:
Hi all,

I am evaluating the Spark data sources API released with Spark 1.2.0, but I'm getting a "java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V" error when running the program.

Error log:
15/01/03 10:41:30 ERROR ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.remote.default-remote-dispatcher-5] shutting down ActorSystem [sparkDriver]
java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:283)
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:240)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:161)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:200)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:692)
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:684)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:684)
    at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:492)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Following is my simple Java code:

import com.databricks.spark.avro.AvroUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;

public class AvroSparkTest {

    public static void main(String[] args) throws Exception {

        // Run locally with 2 threads against the unpacked Spark distribution
        SparkConf sparkConf = new SparkConf()
                .setMaster("local[2]")
                .setAppName("avro-spark-test")
                .setSparkHome("/home/niranda/software/spark-1.2.0-bin-hadoop1");

        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);

        // Load the Avro file through the spark-avro data source
        JavaSQLContext sqlContext = new JavaSQLContext(sparkContext);
        JavaSchemaRDD episodes = AvroUtils.avroFile(sqlContext,
                "/home/niranda/projects/avro-spark-test/src/test/resources/episodes.avro");

        episodes.printSchema();
    }
}

Dependencies:
    <dependencies>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-avro_2.10</artifactId>
            <version>0.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>

I'm using Java 1.7, IntelliJ IDEA and Maven as the build tool.

What might be causing this error, and how can I fix it?

Cheers

--
Niranda
