[ https://issues.apache.org/jira/browse/SPARK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14028681#comment-14028681 ]
Sandy Ryza commented on SPARK-2099:
-----------------------------------

https://github.com/apache/spark/pull/1056

> Report metrics for running tasks
> --------------------------------
>
>                 Key: SPARK-2099
>                 URL: https://issues.apache.org/jira/browse/SPARK-2099
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 1.0.0
>            Reporter: Sandy Ryza
>
> Spark currently collects a set of helpful task metrics, like shuffle bytes
> written and GC time, and displays them on the app web UI. These are only
> collected and displayed for tasks that have completed. This makes them
> unsuited to perhaps the situation where they would be most useful -
> determining what's going wrong in currently running tasks.
> Reporting metrics progress for running tasks would probably require adding an
> executor->driver heartbeat that reports metrics for all tasks currently
> running on the executor.
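Below is a minimal Scala sketch of what such an executor->driver heartbeat could look like: the executor periodically snapshots metrics for its running tasks and ships them to the driver, which can then surface partial metrics in the web UI. All class, field, and function names here (TaskMetricsSnapshot, ExecutorHeartbeat, HeartbeatReporter, sendToDriver, etc.) are hypothetical illustrations, not Spark's actual API; the real design is worked out in the pull request linked above.

import java.util.concurrent.{Executors, TimeUnit}

// Hypothetical snapshot of metrics for a single task that is still running.
case class TaskMetricsSnapshot(
    taskId: Long,
    shuffleBytesWritten: Long,
    gcTimeMs: Long)

// Hypothetical heartbeat message carrying snapshots for every task
// currently running on one executor.
case class ExecutorHeartbeat(
    executorId: String,
    runningTasks: Seq[TaskMetricsSnapshot])

// Sketch of an executor-side reporter: on a fixed interval, collect metrics
// for all running tasks and send them to the driver.
class HeartbeatReporter(
    executorId: String,
    collectRunningTaskMetrics: () => Seq[TaskMetricsSnapshot],
    sendToDriver: ExecutorHeartbeat => Unit,
    intervalMs: Long = 10000L) {

  private val scheduler = Executors.newSingleThreadScheduledExecutor()

  def start(): Unit = {
    scheduler.scheduleAtFixedRate(new Runnable {
      override def run(): Unit = {
        // Collect partial metrics for in-flight tasks and report them;
        // the driver side would merge these into its per-task state.
        val heartbeat = ExecutorHeartbeat(executorId, collectRunningTaskMetrics())
        sendToDriver(heartbeat)
      }
    }, intervalMs, intervalMs, TimeUnit.MILLISECONDS)
  }

  def stop(): Unit = scheduler.shutdown()
}

Under this assumption, the driver would update its stored metrics for running tasks on each heartbeat rather than only when a task completes, which is what allows the UI to show progress for tasks that are still executing.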