This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 190a3a4  [SPARK-27047] Document stop-slave.sh in spark-standalone
190a3a4 is described below

commit 190a3a4ad8e648d4ed4b38c5189b3baf75b1fc52
Author: Ajith <ajith2...@gmail.com>
AuthorDate: Wed Mar 6 09:12:24 2019 -0600

    [SPARK-27047] Document stop-slave.sh in spark-standalone

    ## What changes were proposed in this pull request?

    The spark-standalone documentation does not mention the stop-slave.sh script.

    ## How was this patch tested?

    Manually tested the changes.

    Closes #23960 from ajithme/slavedoc.

    Authored-by: Ajith <ajith2...@gmail.com>
    Signed-off-by: Sean Owen <sean.o...@databricks.com>
---
 docs/spark-standalone.md | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 672a4d0..60b84d3 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -85,12 +85,13 @@ If you do not have a password-less setup, you can set the environment variable S

 Once you've set up this file, you can launch or stop your cluster with the following shell scripts, based on Hadoop's deploy scripts, and available in `SPARK_HOME/sbin`:

 - `sbin/start-master.sh` - Starts a master instance on the machine the script is executed on.
-- `sbin/start-slaves.sh` - Starts a slave instance on each machine specified in the `conf/slaves` file.
-- `sbin/start-slave.sh` - Starts a slave instance on the machine the script is executed on.
-- `sbin/start-all.sh` - Starts both a master and a number of slaves as described above.
+- `sbin/start-slaves.sh` - Starts a worker instance on each machine specified in the `conf/slaves` file.
+- `sbin/start-slave.sh` - Starts a worker instance on the machine the script is executed on.
+- `sbin/start-all.sh` - Starts both a master and a number of workers as described above.
 - `sbin/stop-master.sh` - Stops the master that was started via the `sbin/start-master.sh` script.
-- `sbin/stop-slaves.sh` - Stops all slave instances on the machines specified in the `conf/slaves` file.
-- `sbin/stop-all.sh` - Stops both the master and the slaves as described above.
+- `sbin/stop-slave.sh` - Stops all worker instances on the machine the script is executed on.
+- `sbin/stop-slaves.sh` - Stops all worker instances on the machines specified in the `conf/slaves` file.
+- `sbin/stop-all.sh` - Stops both the master and the workers as described above.

 Note that these scripts must be executed on the machine you want to run the Spark master on, not your local machine.

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
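For context, a typical standalone cluster lifecycle using the scripts documented in this patch might look like the dry-run sketch below. `SPARK_HOME=/opt/spark` is a hypothetical install path, and the `run` helper only echoes each command so the sequence can be shown without a live cluster; on a real deployment you would invoke the scripts directly.

```shell
# Dry-run sketch of the standalone cluster lifecycle.
# /opt/spark is a hypothetical SPARK_HOME; adjust to your installation.
SPARK_HOME=${SPARK_HOME:-/opt/spark}

# Helper that prints each command instead of executing it;
# on a real cluster, call the scripts directly.
run() { echo "+ $*"; }

run "$SPARK_HOME/sbin/start-master.sh"   # master on this machine
run "$SPARK_HOME/sbin/start-slaves.sh"   # one worker per host in conf/slaves
# ... submit applications to the master ...
run "$SPARK_HOME/sbin/stop-all.sh"       # stop the master and all workers
```

Note that `stop-slave.sh` (the script this commit documents) is the per-machine counterpart: it stops only the worker instances on the machine it is run on, while `stop-slaves.sh` reaches every host listed in `conf/slaves`.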