1. You can comment out the other workers in the conf/slaves file and then
run sbin/stop-slaves.sh from the master; it will then stop only the specific
worker that is left uncommented.
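
For example, assuming the node whose Worker should be stopped is called
worker2 (the hostnames here are just placeholders), you could temporarily
edit conf/slaves on the master and then run the stop script:

# conf/slaves -- leave only the worker you want to stop uncommented
# worker1
worker2
# worker3

$ sbin/stop-slaves.sh

stop-slaves.sh stops the Worker daemon on every uncommented host listed in
conf/slaves, so with the file edited this way only worker2 goes down.
Remember to restore conf/slaves afterwards so the start/stop scripts see the
full cluster again.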

2. There is no direct command for it, but you can do something like the
following:

$ curl localhost:8080 | grep "Applications" -C 10 | head -n20

where localhost is your master machine and 8080 is the web UI port.
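
Depending on your Spark version, the master web UI may also serve the same
information as JSON (treat this as an assumption to verify against your
deployment), which is easier to handle from a script:

$ curl http://localhost:8080/json/

The response should list the registered workers together with their state,
so you can check that all of them report as alive without opening the web UI.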

You can also look at the metrics
<http://spark.apache.org/docs/latest/monitoring.html#metrics> options for a
more sophisticated approach.
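
For instance, a minimal metrics.properties sketch that writes master and
worker metrics to CSV files you can inspect from a shell (the sink period
and the output directory below are just example values) would be:

# conf/metrics.properties
master.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
master.sink.csv.period=10
master.sink.csv.directory=/tmp/spark-metrics
worker.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
worker.sink.csv.period=10
worker.sink.csv.directory=/tmp/spark-metrics

After restarting the daemons you can tail/grep the CSV files under
/tmp/spark-metrics from the command line; the monitoring page linked above
documents the other available sinks and their properties.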


Thanks
Best Regards

On Tue, Nov 18, 2014 at 1:57 PM, Kenichi Maehashi <
webmas...@kenichimaehashi.com> wrote:

> Hi,
>
> I'm operating Spark in standalone cluster configuration (3 slaves) and
> have some question.
>
> 1. How can I stop a slave on the specific node?
>    Under `sbin/` directory, there are
> `start-{all,master,slave,slaves}` and `stop-{all,master,slaves}`, but
> no `stop-slave`. Are there any way to stop the specific (e.g., the
> 2nd) slave via command line?
>
> 2. How can I check cluster status from command line?
>    Are there any way to confirm that all Master / Workers are up and
> working without using Web UI?
>
> Thanks in advance!
>
> --
> Kenichi Maehashi
>
