If you're only interested in a particular instant, a simpler option is the
executors page of the Spark UI:
http://spark.apache.org/docs/latest/monitoring.html. By default each
executor runs one task per core, so the number of tasks running at a given
time translates directly to the number of cores being used for execution.
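
If you want to sample this programmatically at specific points in the
program rather than by watching the UI, one option is to register a
SparkListener that counts running tasks. A rough, untested sketch in Scala
(the class and counter names are just illustrative):

import java.util.concurrent.atomic.AtomicInteger
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd, SparkListenerTaskStart}

// Tracks how many tasks are running across the cluster right now.
// With the default of one task per core, this approximates the
// number of cores currently in use.
class ActiveTaskListener extends SparkListener {
  val activeTasks = new AtomicInteger(0)

  override def onTaskStart(taskStart: SparkListenerTaskStart): Unit =
    activeTasks.incrementAndGet()

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit =
    activeTasks.decrementAndGet()
}

// Register it once on the SparkContext, then read the counter at the
// points of the program you care about:
// val listener = new ActiveTaskListener
// sc.addSparkListener(listener)
// ...
// println(s"tasks running now: ${listener.activeTasks.get()}")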

2014-12-02 21:49 GMT-08:00 Otis Gospodnetic <otis.gospodne...@gmail.com>:

> Hi Isca,
>
> I think SPM can do that for you:
> http://blog.sematext.com/2014/10/07/apache-spark-monitoring/
>
> Otis
> --
> Monitoring * Alerting * Anomaly Detection * Centralized Log Management
> Solr & Elasticsearch Support * http://sematext.com/
>
>
> On Tue, Dec 2, 2014 at 11:57 PM, Isca Harmatz <pop1...@gmail.com> wrote:
>
>> Hello,
>>
>> I'm running Spark on a cluster and I want to monitor how many
>> nodes/cores are active at different (specific) points in the program.
>>
>> Is there any way to do this?
>>
>> Thanks,
>>   Isca
>>
>
>
