[ https://issues.apache.org/jira/browse/SPARK-16852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15404166#comment-15404166 ]

Weizhong commented on SPARK-16852:
----------------------------------

I ran the TPC-DS queries at 2 TB scale, and sometimes the stack trace below is printed on exit. The script used:
{noformat}
cases="1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 
28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 
54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 
80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99"
i=1
total=99

for t in ${cases}
do
    echo "run sql99 sample query $t ($i/$total)."
    spark-sql --master yarn-client --driver-memory 30g --executor-memory 30g \
        --num-executors 6 --executor-cores 5 -f query/query${t}.sql \
        > logs/${t}.result 2> logs/${t}.log
    i=`expr $i + 1`
done

{noformat}

> RejectedExecutionException when exit at some times
> --------------------------------------------------
>
>                 Key: SPARK-16852
>                 URL: https://issues.apache.org/jira/browse/SPARK-16852
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Weizhong
>            Priority: Minor
>
> If we run a huge job, a RejectedExecutionException is sometimes printed on exit:
> {noformat}
> 16/05/27 08:30:40 ERROR client.TransportResponseHandler: Still have 3 requests outstanding when connection from HGH1000017808/10.184.66.104:41980 is closed
> java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@6b66dba rejected from java.util.concurrent.ThreadPoolExecutor@60725736[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 269]
>       at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047)
>       at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
>       at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
>       at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:133)
>       at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>       at scala.concurrent.Promise$class.complete(Promise.scala:55)
>       at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
>       at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
>       at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
>       at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>       at org.spark-project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
>       at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:133)
>       at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>       at scala.concurrent.Promise$class.complete(Promise.scala:55)
>       at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
>       at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
>       at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
>       at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>       at scala.concurrent.Future$InternalCallbackExecutor$Batch$$anonfun$run$1.processBatch$1(Future.scala:643)
>       at scala.concurrent.Future$InternalCallbackExecutor$Batch$$anonfun$run$1.apply$mcV$sp(Future.scala:658)
>       at scala.concurrent.Future$InternalCallbackExecutor$Batch$$anonfun$run$1.apply(Future.scala:635)
>       at scala.concurrent.Future$InternalCallbackExecutor$Batch$$anonfun$run$1.apply(Future.scala:635)
>       at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>       at scala.concurrent.Future$InternalCallbackExecutor$Batch.run(Future.scala:634)
>       at scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)
>       at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:685)
>       at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>       at scala.concurrent.Promise$class.tryFailure(Promise.scala:115)
>       at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onFailure$1(NettyRpcEnv.scala:192)
>       at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$1.apply(NettyRpcEnv.scala:214)
>       at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$1.apply(NettyRpcEnv.scala:214)
>       at org.apache.spark.rpc.netty.RpcOutboxMessage.onFailure(Outbox.scala:74)
>       at org.apache.spark.network.client.TransportResponseHandler.failOutstandingRequests(TransportResponseHandler.java:90)
>       at org.apache.spark.network.client.TransportResponseHandler.channelUnregistered(TransportResponseHandler.java:104)
>       at org.apache.spark.network.server.TransportChannelHandler.channelUnregistered(TransportChannelHandler.java:94)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>       at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>       at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>       at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>       at io.netty.channel.DefaultChannelPipeline.fireChannelUnregistered(DefaultChannelPipeline.java:739)
>       at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:659)
>       at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328)
>       at io.netty.util.concurrent.SingleThreadEventExecutor.confirmShutdown(SingleThreadEventExecutor.java:627)
>       at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:362)
>       at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>       at java.lang.Thread.run(Thread.java:745)
> {noformat}
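The top of the trace is standard JDK behavior rather than anything Spark-specific: once a ThreadPoolExecutor has been shut down, any task handed to it is refused by the default AbortPolicy with a RejectedExecutionException. In the trace, an RPC failure callback is being submitted to a pool that was already terminated during shutdown. A minimal sketch of that mechanism (hypothetical demo code, not from Spark):

{noformat}
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;

public class RejectedOnShutdownDemo {
    public static void main(String[] args) {
        // Stand-in for the callback pool seen in the trace.
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Shut the pool down first, as happens during JVM exit.
        pool.shutdown();

        try {
            // Submitting after shutdown triggers the default AbortPolicy.
            pool.execute(() -> System.out.println("never runs"));
        } catch (RejectedExecutionException e) {
            // Same exception type as the first frame of the trace.
            System.out.println("task rejected: pool already shut down");
        }
    }
}
{noformat}

This is why the error only shows up at exit: the race is between the connection being closed (which fires the failure callbacks) and the executor that would run those callbacks being terminated.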



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
