[ https://issues.apache.org/jira/browse/SPARK-15703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440855#comment-16440855 ]

Thomas Graves commented on SPARK-15703:
---------------------------------------

This Jira purely makes the size of the event queue configurable, which allows 
you to increase it as long as you have sufficient driver memory.  There is no 
current fix for it dropping events. There is a fix that went into 2.3 that 
keeps the critical services from being affected:

https://issues.apache.org/jira/browse/SPARK-18838
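
As a rough sketch of what "configurable queue size" means in practice: you can raise 
the listener bus capacity through SparkConf before dropping events forces you to. The 
exact configuration key names below are assumptions from memory and differ by version 
(roughly, an "eventqueue.size" key in the 2.0.1/2.1 line and an "eventqueue.capacity" 
key from 2.3 on), so check the docs for the release you run; the default capacity is 
on the order of 10000 events.

    // Minimal sketch, assuming the 2.3+ key name; a larger queue costs more driver memory.
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val conf = new SparkConf()
      // Assumed key name; older releases used a differently named "size" key.
      .set("spark.scheduler.listenerbus.eventqueue.capacity", "20000")

    val spark = SparkSession.builder()
      .appName("listener-bus-queue-example")
      .config(conf)
      .getOrCreate()

The trade-off is only memory versus dropped events: the queue must still drain faster 
than events arrive, otherwise a bigger buffer just delays the drop.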

> Make ListenerBus event queue size configurable
> ----------------------------------------------
>
>                 Key: SPARK-15703
>                 URL: https://issues.apache.org/jira/browse/SPARK-15703
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler, Web UI
>    Affects Versions: 2.0.0
>            Reporter: Thomas Graves
>            Assignee: Dhruve Ashar
>            Priority: Minor
>             Fix For: 2.0.1, 2.1.0
>
>         Attachments: Screen Shot 2016-06-01 at 11.21.32 AM.png, Screen Shot 
> 2016-06-01 at 11.23.48 AM.png, SparkListenerBus .png, 
> spark-dynamic-executor-allocation.png
>
>
> The Spark UI doesn't seem to be showing all the tasks and metrics.
> I ran a job with 100000 tasks, but the Detail stage page says it completed 93029:
> Summary Metrics for 93029 Completed Tasks
> The Stages for All Jobs page lists only 89519/100000 tasks finished, even 
> though the stage is completed.  The metrics for shuffle write and input are also incorrect.
> I will attach screen shots.
> I checked the logs and it does show that all the tasks actually finished.
> 16/06/01 16:15:42 INFO TaskSetManager: Finished task 59880.0 in stage 2.0 
> (TID 54038) in 265309 ms on 10.213.45.51 (100000/100000)
> 16/06/01 16:15:42 INFO YarnClusterScheduler: Removed TaskSet 2.0, whose tasks 
> have all completed, from pool


