Hi Robert,

A lot of task metrics are already available for individual tasks.  You can
get these programmatically by registering a SparkListener, and you can also
view them in the UI.  E.g., for each task, you can see runtime,
serialization time, amount of shuffle data read, etc.  I'm also working on
exposing the data in the UI as JSON.
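
For example, something like this (an untested sketch against the listener
API; the exact TaskMetrics fields vary a bit across versions):

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Logs a couple of per-task metrics as each task finishes.
class TaskRuntimeListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val info = taskEnd.taskInfo
    val metrics = taskEnd.taskMetrics
    if (info != null && metrics != null) {
      println(s"task ${info.taskId} of stage ${taskEnd.stageId}: " +
        s"runTime=${metrics.executorRunTime}ms " +
        s"resultSerialization=${metrics.resultSerializationTime}ms")
      // Shuffle read/write metrics are also on TaskMetrics
      // (e.g. metrics.shuffleReadMetrics), though their exact shape
      // differs between versions.
    }
  }
}

// then register it before running your jobs:
// sc.addSparkListener(new TaskRuntimeListener)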

In addition, you can use the metrics system to get a different view of the
data.  It exposes a different set of information, and it's better suited to
a timeline view, as opposed to the task-oriented view you get through the
UI.
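
The metrics system is configured via conf/metrics.properties (there is a
metrics.properties.template in the distribution listing the options).
E.g., a minimal config that dumps all metrics to stdout every 10 seconds
would look something like:

# enable a ConsoleSink for all instances
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds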

You can read about both options here:

https://spark.apache.org/docs/latest/monitoring.html


On Mon, Apr 13, 2015 at 12:48 PM, Grandl Robert <rgra...@yahoo.com.invalid>
wrote:

> Guys,
>
> Do you have any thoughts on this ?
>
>
> Thanks,
> Robert
>
>
>
>   On Sunday, April 12, 2015 5:35 PM, Grandl Robert
> <rgra...@yahoo.com.INVALID> wrote:
>
>
> Hi guys,
>
> I was trying to figure out some counters in Spark, related to the amount
> of CPU or memory used (in some metric) by a task/stage/job, but I could
> not find any.
>
> Is there any such counter available?
>
> Thank you,
> Robert
>
