[ https://issues.apache.org/jira/browse/SPARK-20882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16025581#comment-16025581 ]

cen yuhai edited comment on SPARK-20882 at 5/26/17 12:08 AM:
-------------------------------------------------------------

I think the problem is that the fetch hangs forever when the connection to
nodemanager:7337 is closed:
{code}
17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 1
17/05/26 02:02:03 WARN TransportChannelHandler: Exception in connection from bigdata-apache-hdp-132.xg01/10.0.132.58:7337
java.io.IOException: Connection reset by peer
{code}
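
To illustrate the suspected pattern, here is a minimal sketch (not the actual Spark code; class and method names are illustrative) of a fetcher whose consumer blocks on an unbounded take(). If the "Connection reset by peer" is only logged and the failure callback is never invoked, nothing is ever enqueued and the consumer parks forever:
{code}
import java.util.concurrent.LinkedBlockingQueue

sealed trait FetchResult
case class SuccessFetchResult(data: Array[Byte]) extends FetchResult
case class FailureFetchResult(cause: Throwable) extends FetchResult

class FetcherSketch {
  private val results = new LinkedBlockingQueue[FetchResult]()

  // If "Connection reset by peer" is only logged and this callback is never
  // invoked, nothing is ever enqueued and next() below parks forever.
  def onFetchFailure(cause: Throwable): Unit = results.put(FailureFetchResult(cause))

  def onFetchSuccess(data: Array[Byte]): Unit = results.put(SuccessFetchResult(data))

  // Mirrors the blocking call visible in the jstack: an unbounded take().
  def next(): FetchResult = results.take()
}
{code}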



> Executor is waiting for ShuffleBlockFetcherIterator
> ---------------------------------------------------
>
>                 Key: SPARK-20882
>                 URL: https://issues.apache.org/jira/browse/SPARK-20882
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.1.1
>            Reporter: cen yuhai
>         Attachments: executor_jstack, executor_log, screenshot-1.png, 
> screenshot-2.png, screenshot-3.png
>
>
> This bug is similar to https://issues.apache.org/jira/browse/SPARK-19300,
> but I have updated my client Netty version to 4.0.43.Final, while the
> shuffle service is still on Netty 4.0.42.Final.
> spark.sql.adaptive.enabled is true.
> {code}
> "Executor task launch worker for task 4808985" #5373 daemon prio=5 os_prio=0 
> tid=0x00007f54ef437000 nid=0x1aed0 waiting on condition [0x00007f53aebfe000]
> java.lang.Thread.State: WAITING (parking)
> at sun.misc.Unsafe.park(Native Method)
> parking to wait for <0x0000000498c249c0> (a 
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:189)
> at 
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
> at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
> at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:332)
> at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:58)
> at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
> at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at 
> org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
> at 
> org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
> at 
> org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:199)
> at 
> org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:97)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
> at org.apache.spark.scheduler.Task.run(Task.scala:114)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:323)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1147)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
> at java.lang.Thread.run(Thread.java:834)
> {code}
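> The jstack above shows the task thread parked in LinkedBlockingQueue.take() called from ShuffleBlockFetcherIterator.next(). As a hedged sketch only (not the actual fix; the helper name and timeout value are illustrative), replacing the unbounded take() with a poll-with-deadline would turn the indefinite park into a diagnosable fetch failure:
> {code}
> import java.util.concurrent.{LinkedBlockingQueue, TimeUnit}
>
> // Sketch: surface a dead connection as an IOException after a deadline,
> // instead of parking indefinitely when no result is ever enqueued.
> def nextWithTimeout[T](results: LinkedBlockingQueue[T], timeoutSec: Long): T = {
>   val result = results.poll(timeoutSec, TimeUnit.SECONDS)
>   if (result == null) {
>     throw new java.io.IOException(
>       s"No fetch result after ${timeoutSec}s; remote shuffle service may be gone")
>   }
>   result
> }
> {code}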
> {code}
> 17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 3
> 17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 2
> 17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 1
> 17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 1
> 17/05/26 02:01:55 DEBUG ShuffleBlockFetcherIterator: Number of requests in flight 1
> 17/05/26 02:02:03 WARN TransportChannelHandler: Exception in connection from bigdata-apache-hdp-132.xg01/10.0.132.58:7337
> java.io.IOException: Connection reset by peer
>       at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
>       at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
>       at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
>       at sun.nio.ch.IOUtil.read(IOUtil.java:192)
>       at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:379)
>       at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:221)
>       at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:899)
>       at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:275)
>       at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
>       at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
>       at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>       at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> {code}
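> As a possible mitigation to try while the root cause is open (these configuration keys exist in Spark, but I have not verified that they avoid this particular hang), tightening the network timeout and retry settings should at least surface a dead connection as a fetch failure rather than a silent wait:
> {code}
> import org.apache.spark.SparkConf
>
> // The keys are real Spark settings; the values are only examples.
> val conf = new SparkConf()
>   .set("spark.network.timeout", "120s")     // idle timeout for network interactions
>   .set("spark.shuffle.io.maxRetries", "6")  // retry count for shuffle IO failures
>   .set("spark.shuffle.io.retryWait", "10s") // wait between fetch retries
> {code}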


