Thanks Akhil, Ryan!

@Akhil: YARN can only tell me how many vcores my app has been granted, but
not the actual CPU usage, right? Pulling memory/CPU usage from the OS means
I need to map the JVM executor processes to the Spark context they belong
to, right?
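
For reference, this is roughly how I'm reading the allocation numbers out of
the YARN ResourceManager REST API at the moment (the RM address and app id
below are placeholders for our cluster; the allocatedMB/allocatedVCores
fields are what the Hadoop 2.x API reports, and as far as I can tell they
reflect the grant, not actual consumption):

    import scala.io.Source

    // Placeholders -- substitute the RM address and app id for your cluster.
    val rmHost = "resourcemanager:8088"
    val appId  = "application_1431993000000_0001"

    // The RM REST API exposes per-app *allocation* (allocatedMB,
    // allocatedVCores in Hadoop 2.x); it does not report actual CPU time.
    val url  = s"http://$rmHost/ws/v1/cluster/apps/$appId"
    val json = Source.fromURL(url).mkString
    println(json)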

@Ryan: what a great blog post -- this is super relevant for me for analyzing
the state of the cluster as a whole. However, it seems to me that those
metrics are mostly reported globally, not per Spark application.
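
In case it helps the archives: the sink configuration I'm testing against
your dashboards is essentially the stock GraphiteSink from
conf/metrics.properties (host and prefix below are placeholders for our
Graphite instance):

    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite.example.com
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds
    *.sink.graphite.prefix=spark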

2015-05-19 21:43 GMT+02:00 Ryan Williams <ryan.blake.willi...@gmail.com>:

> Hi Peter, a few months ago I was using MetricsSystem to export to Graphite
> and then view in Grafana; relevant scripts and some instructions are here
> <https://github.com/hammerlab/grafana-spark-dashboards/> if you want to
> take a look.
>
>
> On Sun, May 17, 2015 at 8:48 AM Peter Prettenhofer <
> peter.prettenho...@gmail.com> wrote:
>
>> Hi all,
>>
>> I'm looking for a way to measure the current memory/CPU usage of a
>> Spark application, to give users feedback on how many resources are
>> actually being used.
>> The metrics system seems to provide this information to some extent:
>> it logs metrics at the application level (number of cores granted) and at
>> the JVM level (memory usage).
>> Is this the recommended way to gather this kind of information? If so,
>> how do I best map a Spark application to the corresponding JVM processes?
>>
>> If not, should I instead request this information from the resource
>> manager (e.g. Mesos/YARN)?
>>
>> thanks,
>>  Peter
>>
>> --
>> Peter Prettenhofer
>>
>
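
P.S. Re my own question above about mapping an application to its JVM
processes: the crude workaround I've found is grepping ps output for the
application id, which on our YARN setup shows up in each executor's
container paths. A sketch (this leans on deployment details, not on any
stable Spark/YARN API):

    import scala.sys.process._

    // Placeholder app id; on YARN the executor JVM command line usually
    // embeds it via the container's directories (e.g. java.io.tmpdir).
    val appId = "application_1431993000000_0001"
    val out   = Seq("bash", "-c", s"ps -eo pid,args | grep $appId | grep -v grep").!!
    val pids  = out.split("\n").filter(_.nonEmpty).map(_.trim.split("\\s+").head)
    pids.foreach(println)  // candidate executor PIDs to sample via /proc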


-- 
Peter Prettenhofer
