[ https://issues.apache.org/jira/browse/SPARK-5745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14322091#comment-14322091 ]

Patrick Wendell commented on SPARK-5745:
----------------------------------------

Hey [~jlewandowski] - TaskMetrics is a mostly internal concept. In fact, there 
isn't really any "nice" framework for aggregation internally; instead we have a 
bunch of manual aggregation in various places.

The primary user-facing API we have for aggregated counters is accumulators. 
Are there features lacking from accumulators that make it difficult for you to 
use them for your use case?
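
For reference, a minimal sketch of counting a custom per-task value with a 
named accumulator, assuming the Spark 1.x accumulator API; the input path and 
the "rows read" name are just placeholders:

{code:scala}
// A minimal sketch, assuming the Spark 1.x accumulator API: count a custom
// per-task value ("rows read") with a named accumulator. The input path and
// the accumulator name are placeholders.
import org.apache.spark.{SparkConf, SparkContext}

object RowsReadExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("rows-read-example"))

    // Named accumulators are aggregated across tasks and shown in the web UI,
    // which covers many "custom metric" cases without touching TaskMetrics.
    val rowsRead = sc.accumulator(0L, "rows read")

    sc.textFile("hdfs:///some/input")   // hypothetical input path
      .map { line =>
        rowsRead += 1L                  // incremented inside the tasks on the executors
        line.length
      }
      .count()                          // the value is only reliable after an action has run

    println(s"rows read: ${rowsRead.value}")
    sc.stop()
  }
}
{code}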

> Allow to use custom TaskMetrics implementation
> ----------------------------------------------
>
>                 Key: SPARK-5745
>                 URL: https://issues.apache.org/jira/browse/SPARK-5745
>             Project: Spark
>          Issue Type: Wish
>          Components: Spark Core
>            Reporter: Jacek Lewandowski
>
> Various RDD implementations exist, and {{TaskMetrics}} provides a great API 
> for collecting metrics and aggregating them. However, some RDDs may want to 
> register custom metrics (for example, the number of rows read), and the 
> current implementation doesn't allow for this.
> I suppose this can be changed without modifying the whole interface - a 
> factory could be used to create the initial {{TaskMetrics}} object, and the 
> default factory could be overridden by the user.
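
A purely illustrative sketch of the factory idea described above; none of these 
names exist in Spark, and {{TaskMetrics}} is stubbed out here so the shape of 
the proposal stands on its own:

{code:scala}
// Purely illustrative sketch of the proposed factory. None of these names
// exist in Spark; TaskMetrics is a stand-in class, not the real one.
class TaskMetrics extends Serializable {
  var resultSize: Long = 0L
}

// Hypothetical extension point: the scheduler would ask a factory for the
// initial metrics object instead of instantiating TaskMetrics directly.
trait TaskMetricsFactory extends Serializable {
  def newTaskMetrics(): TaskMetrics
}

// Default behaviour stays exactly as it is today.
object DefaultTaskMetricsFactory extends TaskMetricsFactory {
  override def newTaskMetrics(): TaskMetrics = new TaskMetrics
}

// A custom RDD could register a subclass carrying extra counters,
// e.g. the number of rows read from the underlying source.
class RowCountingTaskMetrics extends TaskMetrics {
  var rowsRead: Long = 0L
}

object RowCountingTaskMetricsFactory extends TaskMetricsFactory {
  override def newTaskMetrics(): TaskMetrics = new RowCountingTaskMetrics
}
{code}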



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
