[ https://issues.apache.org/jira/browse/SPARK-24677?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16526051#comment-16526051 ]

Apache Spark commented on SPARK-24677:
--------------------------------------

User 'cxzl25' has created a pull request for this issue:
https://github.com/apache/spark/pull/21656

> MedianHeap is empty when speculation is enabled, causing the SparkContext to stop
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-24677
>                 URL: https://issues.apache.org/jira/browse/SPARK-24677
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>            Reporter: dzcxzl
>            Priority: Critical
>
> After the change introduced in SPARK-23433, speculation can cause the SparkContext to stop:
> {code:java}
> ERROR Utils: uncaught error in thread task-scheduler-speculation, stopping SparkContext
> java.util.NoSuchElementException: MedianHeap is empty.
> at org.apache.spark.util.collection.MedianHeap.median(MedianHeap.scala:83)
> at org.apache.spark.scheduler.TaskSetManager.checkSpeculatableTasks(TaskSetManager.scala:968)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:94)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:93)
> at scala.collection.Iterator$class.foreach(Iterator.scala:742)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at org.apache.spark.scheduler.Pool.checkSpeculatableTasks(Pool.scala:93)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:94)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:93)
> {code}
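> For context, here is a minimal sketch of the two-heap structure behind MedianHeap and an emptiness guard around the median call. This is a simplified, illustrative stand-in (class and method names are hypothetical), not the actual Spark code or the actual fix, which is in the pull request above:
> {code:scala}
> import scala.collection.mutable
>
> // Simplified stand-in for org.apache.spark.util.collection.MedianHeap:
> // a max-heap holds the lower half of the durations, a min-heap the upper half.
> class SimpleMedianHeap {
>   private val lower = mutable.PriorityQueue.empty[Double]                           // max-heap
>   private val upper = mutable.PriorityQueue.empty[Double](Ordering[Double].reverse) // min-heap
>
>   def isEmpty: Boolean = lower.isEmpty && upper.isEmpty
>
>   def insert(x: Double): Unit = {
>     // Route through the max-heap so every element of `lower` stays <= `upper`,
>     // then rebalance so `lower` is at most one element larger than `upper`.
>     lower.enqueue(x)
>     upper.enqueue(lower.dequeue())
>     if (upper.size > lower.size) lower.enqueue(upper.dequeue())
>   }
>
>   def median: Double = {
>     // This is the call that throws in the stack trace above.
>     if (isEmpty) throw new NoSuchElementException("MedianHeap is empty.")
>     if (lower.size > upper.size) lower.head
>     else (lower.head + upper.head) / 2.0
>   }
> }
>
> object SpeculationGuard {
>   // Illustrative guard: do not derive a speculation threshold until at least
>   // one successful task duration has actually been recorded in the heap.
>   def medianDurationIfAny(durations: SimpleMedianHeap): Option[Double] =
>     if (durations.isEmpty) None else Some(durations.median)
> }
> {code}
> The guard mirrors the failure mode above: the speculation check must not call median before any successful task duration has been inserted, even if task attempts have already been marked successful.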



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
