Greetings,

I am using Spark 2.0.2 with Scala 2.11.7 and Hadoop 2.7.3.  When I run
spark-submit in local mode, I get a netty exception like the following.  The
same code runs fine with Spark 1.6.3, Scala 2.10.x, and Hadoop 2.7.3.

16/11/24 08:18:24 ERROR server.TransportRequestHandler: Error sending result StreamResponse{streamId=/jars/simple-project_2.11-1.0.jar, byteCount=3662, body=FileSegmentManagedBuffer{file=/home/hdadmin/Examples/spark/wordcount/target/scala-2.11/simple-project_2.11-1.0.jar, offset=0, length=3662}} to /10.0.2.15:33926; closing connection
io.netty.handler.codec.EncoderException: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.&lt;init&gt;(Ljava/io/File;JJ)V
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:107)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:651)
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:706)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:741)
    at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:895)
    at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:240)
    at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:194)
    at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:150)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:111)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.&lt;init&gt;(Ljava/io/File;JJ)V
    at org.apache.spark.network.buffer.FileSegmentManagedBuffer.convertToNetty(FileSegmentManagedBuffer.java:133)
    at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:54)
    at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:89)
    ... 35 more
16/11/24 08:18:24 ERROR client.TransportResponseHandler: Still have 1 requests outstanding when connection from /10.0.2.15:54561 is closed
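For context: a NoSuchMethodError that appears only at runtime, after the code compiled cleanly, usually means a different version of the library ended up on the classpath than the one Spark was built against; here the stack trace points at the netty constructor DefaultFileRegion(File, long, long), which older netty 4.x releases do not have. A minimal sketch of how one could check, from inside the driver, where a class was actually loaded from and whether the expected constructor exists (the class and parameter types are the ones from the stack trace; everything else in the sketch is my own, not part of any Spark API). The main method demonstrates the helper on a JDK class so the snippet runs stand-alone:

```java
import java.io.File;
import java.net.URL;
import java.security.CodeSource;

public class ClasspathCheck {

    // Returns the jar/location a class was loaded from, or null if unknown
    // (bootstrap classes typically have no CodeSource).
    static URL locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation();
    }

    // Returns true if the class declares a public constructor
    // with exactly these parameter types.
    static boolean hasConstructor(Class<?> cls, Class<?>... params) {
        try {
            cls.getConstructor(params);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // In the failing Spark app one would check the class from the trace:
        //   Class<?> region = Class.forName("io.netty.channel.DefaultFileRegion");
        //   System.out.println(locationOf(region));  // which netty jar won
        //   System.out.println(hasConstructor(region, File.class, long.class, long.class));
        // Demonstrated here on java.lang.String so it runs without netty:
        Class<?> s = Class.forName("java.lang.String");
        System.out.println(hasConstructor(s, char[].class)); // true: String(char[]) exists
        System.out.println(hasConstructor(s, File.class));   // false: no String(File)
    }
}
```

If the reported location turns out to be an old netty jar (for example one pulled in transitively by another dependency of the application build), excluding or shading that dependency is the usual fix.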


Please advise.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/netty-handler-codec-EncoderException-java-lang-NoSuchMethodError-tp28126.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
