The other way is to log in to the individual nodes and do

 jps

24819 Worker

and you will see the process(es) identified as Worker.
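
For example, to check non-interactively (a minimal one-liner; grep just filters the jps listing):

 jps | grep Worker || echo "no Worker running on $(hostname)"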

You can also use a JVM monitoring tool such as jconsole or jvisualvm to see what they are doing resource-wise.
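
For example, attaching to the Worker PID reported by jps above (jconsole ships with the JDK):

 jconsole 24819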

You can of course write a small shell script to check that the Worker process
is up and running on every node and alert if it is down; a minimal sketch follows.
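
A minimal sketch, assuming passwordless ssh to each node, jps on the remote PATH, and the standalone node list in $SPARK_HOME/conf/slaves; the mail command and address are placeholders for whatever alerting you use:

 #!/bin/bash
 # Alert if the Worker JVM is not running on a node listed in conf/slaves.
 # ssh -n stops ssh from consuming the node list on stdin.
 while read -r node; do
   if ! ssh -n "$node" jps | grep -q Worker; then
     echo "Spark Worker is down on $node" \
       | mail -s "Spark Worker alert: $node" ops@example.com
   fi
 done < "$SPARK_HOME/conf/slaves"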

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 9 June 2016 at 01:27, Rutuja Kulkarni <rutuja.kulkarn...@gmail.com> wrote:

> Thank you for the quick response.
> So the workers section would list all the running worker nodes in the
> standalone Spark cluster?
> I was also wondering if this is the only way to retrieve worker nodes or
> is there something like a Web API or CLI I could use?
> Thanks.
>
> Regards,
> Rutuja
>
> On Wed, Jun 8, 2016 at 4:02 PM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>
>> Check port 8080 on the node where you ran start-master.sh.
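>>
>> The same UI can also be read programmatically; a minimal sketch, assuming
>> the default web UI port (the standalone master serves a JSON view of its
>> state, including the list of workers, at /json; master-host is a placeholder):
>>
>>  curl http://master-host:8080/json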
>>
>>
>>
>> [image: screenshot of the Spark standalone master web UI]
>>
>> HTH
>>
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 8 June 2016 at 23:56, Rutuja Kulkarni <rutuja.kulkarn...@gmail.com> wrote:
>>
>>> Hello!
>>>
>>> I'm trying to set up a standalone Spark cluster and wondering how to
>>> track the status of all of its nodes. I wonder if something like the YARN
>>> REST API or HDFS CLI exists in the Spark world that can provide the status
>>> of nodes on such a cluster. Any pointers would be greatly appreciated.
>>>
>>> --
>>> Regards,
>>> Rutuja Kulkarni
>>>
>>>
>>>
>>
>
>
> --
> Regards,
> Rutuja Kulkarni
>
>
>
