Hi,
I'm running Spark in a standalone cluster configuration (3 slaves) and
have a couple of questions.
1. How can I stop a slave on a specific node?
Under the `sbin/` directory, there are
`start-{all,master,slave,slaves}` and `stop-{all,master,slaves}`, but
no `stop-slave`. Is there any way to stop a specific (e.g., the
2nd) slave from the command line?
1. You can comment out the rest of the workers in the conf/slaves file and
run stop-slaves.sh from that machine to stop the specific worker.
2. There is no direct command for it, but you can do something like the
following:
$ curl localhost:8080 | grep Applications -C 10 | head -n20
Where
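Suggestion 1 above can be sketched as follows. This rehearses the edit on a scratch copy of the file; the host names worker1–worker3 are made up for illustration, and the real edit would be to $SPARK_HOME/conf/slaves:

```shell
# Sketch of suggestion 1, rehearsed on a scratch copy of conf/slaves.
# Host names worker1..worker3 are hypothetical.
slaves=/tmp/slaves.demo
printf 'worker1\nworker2\nworker3\n' > "$slaves"

# Comment out every host except the one whose worker should be stopped
# (worker2 here), so stop-slaves.sh only reaches that machine:
sed -i '/^worker2$/!s/^/# /' "$slaves"
cat "$slaves"
# → # worker1
#   worker2
#   # worker3
```

After running $SPARK_HOME/sbin/stop-slaves.sh against the real file edited this way, remember to restore the original conf/slaves so that a later start-slaves.sh still sees all three nodes.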
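On suggestion 2: instead of grepping the HTML of the master web UI, the standalone master also serves its status as machine-readable JSON at http://<master>:8080/json, which is easier to script against. The payload below is hand-written to illustrate the general shape; the field names and values are assumptions for this sketch, not captured output:

```shell
# Hand-written sample standing in for: curl -s localhost:8080/json
# (field names/values are illustrative assumptions, not real output).
cat > /tmp/status.json <<'EOF'
{"url": "spark://master:7077",
 "workers": [
   {"id": "worker-a", "host": "node2", "state": "ALIVE"},
   {"id": "worker-b", "host": "node3", "state": "DEAD"}],
 "activeapps": [{"id": "app-1", "name": "MyJob", "state": "RUNNING"}]}
EOF

# Pull out the hosts of live workers rather than scraping HTML:
python3 - <<'EOF'
import json
status = json.load(open("/tmp/status.json"))
print([w["host"] for w in status["workers"] if w["state"] == "ALIVE"])
EOF
# → ['node2']
```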
Hi Kenichi,

> 1. How can I stop a slave on a specific node?
> Under the `sbin/` directory, there are
> `start-{all,master,slave,slaves}` and `stop-{all,master,slaves}`, but
> no `stop-slave`. Is there any way to stop a specific (e.g., the
> 2nd) slave from the command line?

You can use
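(Kousuke's reply is truncated above. For completeness, here is one approach that works with the stock sbin scripts — a sketch of my own, not necessarily what he had in mind: spark-daemon.sh can stop a single daemon by class and instance number when run on the target node.)

```shell
# Run on the node whose worker you want to stop (assumes a stock Spark
# install with SPARK_HOME set):
#
#   $SPARK_HOME/sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1
#
# The trailing "1" is the worker instance number on that machine -- usually 1
# unless SPARK_WORKER_INSTANCES is set higher.
```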
Hi Akhil and Kousuke,
Thank you both for your quick responses.
Monitoring through the JSON API seems straightforward and cool.
Thanks again!
2014-11-18 19:06 GMT+09:00 Kousuke Saruta <saru...@oss.nttdata.co.jp>:
> Hi Kenichi,
>
>> 1. How can I stop a slave on a specific node?
>> Under the `sbin/` directory,