Hi Peter, a few months ago I was using Spark's MetricsSystem to export
metrics to Graphite and then view them in Grafana; the relevant scripts and
some instructions are here
<https://github.com/hammerlab/grafana-spark-dashboards/> if you want to
take a look.
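
In case it's useful, the gist of the setup is a conf/metrics.properties
that enables the GraphiteSink. A minimal sketch (the host, port, and
prefix values below are placeholders for your own Graphite instance):

    # conf/metrics.properties -- ship metrics from all instances
    # (master, worker, driver, executor) to Graphite.
    # Host, port, and prefix are placeholders for your own setup.
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite.example.com
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds
    *.sink.graphite.prefix=spark

    # Enabling the JVM source gives per-process memory/GC metrics,
    # which sounds like part of what you're after.
    driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
    executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource

Since each executor reports under its own metric name, this also answers
the mapping question to some degree: the Graphite metric paths let you tie
JVM-level numbers back to a given application's driver and executors.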

On Sun, May 17, 2015 at 8:48 AM Peter Prettenhofer <
peter.prettenho...@gmail.com> wrote:

> Hi all,
>
> I'm looking for a way to measure the current memory / CPU usage of a Spark
> application, to give users feedback on how many resources are actually
> being used.
> It seems that the metrics system provides this information to some extent:
> it logs metrics at the application level (number of cores granted) and at
> the JVM level (memory usage).
> Is this the recommended way to gather this kind of information? If so, how
> do I best map a Spark application to its corresponding JVM processes?
>
> If not, should I instead request this information from the resource manager
> (e.g. Mesos/YARN)?
>
> thanks,
>  Peter
>
> --
> Peter Prettenhofer
>