Hi Isca,

I think SPM can do that for you:
http://blog.sematext.com/2014/10/07/apache-spark-monitoring/
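
If you also want to check this from inside the job itself at specific points, a rough sketch in Scala, assuming a live `SparkContext` named `sc` and using `SparkContext.getExecutorMemoryStatus` (which lists the executors currently registered with the driver) and `defaultParallelism`:

```scala
// Sketch: log which executors the driver currently knows about.
// getExecutorMemoryStatus maps "host:port" -> (max memory, remaining memory),
// so its size approximates the number of active executor JVMs
// (it also includes one entry for the driver itself).
val executors = sc.getExecutorMemoryStatus
println(s"Registered executors (incl. driver): ${executors.size}")
executors.keys.foreach(host => println(s"  $host"))

// Total cores the scheduler will use by default across the cluster.
println(s"Default parallelism: ${sc.defaultParallelism}")
```

You could call something like this at each point in the program you care about; it only reflects what the driver sees at that moment, so for continuous core/node usage over time an external monitor like SPM (or the Spark web UI) is still the better fit.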

Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/


On Tue, Dec 2, 2014 at 11:57 PM, Isca Harmatz <pop1...@gmail.com> wrote:

> Hello,
>
> I'm running Spark on a cluster and I want to monitor how many nodes/cores
> are active at different (specific) points in the program.
>
> Is there any way to do this?
>
> Thanks,
>   Isca
>
