Hi Kenichi

1. How can I stop a slave on a specific node?
   Under the `sbin/` directory, there are
`start-{all,master,slave,slaves}` and `stop-{all,master,slaves}` scripts, but
no `stop-slave`. Is there any way to stop a specific (e.g., the
2nd) slave via the command line?

You can use sbin/spark-daemon.sh on the machine where the worker you'd like to stop is running. First, find the PID file of that worker.
The PID files are placed under /tmp/ by default, and the file name looks like the following.

xxx.org.apache.spark.deploy.worker.Worker-<WorkerID>.pid

After you find the PID file, run the following command, using the <WorkerID> from the file name.

sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker <WorkerID>
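
For example, to stop the 2nd worker on a machine, something like this should work (just a sketch, assuming the default /tmp PID directory and that you run it from the Spark installation directory on that machine):

# list the worker PID files to find the <WorkerID> of the worker to stop
ls /tmp/*org.apache.spark.deploy.worker.Worker-*.pid

# stop worker 2 on this machine
sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 2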

2. How can I check the cluster status from the command line?
   Is there any way to confirm that the Master and all Workers are up and
running without using the Web UI?

AFAIK, there is no command line tool for checking the status of a standalone cluster.
Instead, you can use a special URL like the following.

http://<master or worker's hostname>:<webui-port>/json

You can get the Master and Worker status as JSON-formatted data.
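
If you want to poll it from a shell, something like this should work (a sketch, assuming the default web UI ports: 8080 for the Master and 8081 for Workers):

# pretty-print the Master's status JSON, which lists the registered workers and their state
curl -s http://<master-hostname>:8080/json | python -m json.tool

# the same endpoint works on each individual worker
curl -s http://<worker-hostname>:8081/json | python -m json.tool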

- Kousuke

(2014/11/18 0:27), Kenichi Maehashi wrote:
Hi,

I'm operating Spark in a standalone cluster configuration (3 slaves) and
have some questions.

1. How can I stop a slave on a specific node?
    Under the `sbin/` directory, there are
`start-{all,master,slave,slaves}` and `stop-{all,master,slaves}` scripts, but
no `stop-slave`. Is there any way to stop a specific (e.g., the
2nd) slave via the command line?

2. How can I check the cluster status from the command line?
    Is there any way to confirm that the Master and all Workers are up and
running without using the Web UI?

Thanks in advance!


