Repository: spark
Updated Branches:
  refs/heads/branch-2.0 879e8fd09 -> 8c294f4ad


[SPARK-15781][DOCUMENTATION] remove deprecated environment variable doc

## What changes were proposed in this pull request?

Like `SPARK_JAVA_OPTS` and `SPARK_CLASSPATH`, we remove the documentation for
`SPARK_WORKER_INSTANCES` to discourage users from using it. If it is actually
set, SparkConf will show a deprecation warning as before.
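
For context (not part of this patch), the supported replacements for these deprecated environment variables are ordinary SparkConf settings. A minimal sketch, with illustrative values:

```scala
import org.apache.spark.SparkConf

// Instead of the deprecated environment variables, set the equivalent
// configuration keys on SparkConf (or pass them to spark-submit via --conf).
val conf = new SparkConf()
  .setAppName("example") // illustrative app name
  // replaces SPARK_JAVA_OPTS:
  .set("spark.driver.extraJavaOptions", "-XX:+UseG1GC")
  .set("spark.executor.extraJavaOptions", "-XX:+UseG1GC")
  // replaces SPARK_CLASSPATH:
  .set("spark.driver.extraClassPath", "/path/to/extra.jar")
  .set("spark.executor.extraClassPath", "/path/to/extra.jar")
  // replaces SPARK_WORKER_INSTANCES (applies on YARN; in standalone mode,
  // size workers with SPARK_WORKER_CORES / SPARK_WORKER_MEMORY instead):
  .set("spark.executor.instances", "4")
```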

## How was this patch tested?

Manually tested.

Author: bomeng <bm...@us.ibm.com>

Closes #13533 from bomeng/SPARK-15781.

(cherry picked from commit 3fd3ee038b89821f51f30a4ecd4452b5b3bc6568)
Signed-off-by: Sean Owen <so...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8c294f4a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8c294f4a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8c294f4a

Branch: refs/heads/branch-2.0
Commit: 8c294f4ad95e95f6c8873d7b346394d34cc40975
Parents: 879e8fd
Author: bomeng <bm...@us.ibm.com>
Authored: Sun Jun 12 12:58:34 2016 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sun Jun 12 12:58:41 2016 +0100

----------------------------------------------------------------------
 docs/spark-standalone.md | 9 ---------
 1 file changed, 9 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/8c294f4a/docs/spark-standalone.md
----------------------------------------------------------------------
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index fd94c34..40c7293 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -134,15 +134,6 @@ You can optionally configure the cluster further by setting environment variable
     <td>Port for the worker web UI (default: 8081).</td>
   </tr>
   <tr>
-    <td><code>SPARK_WORKER_INSTANCES</code></td>
-    <td>
-      Number of worker instances to run on each machine (default: 1). You can make this more than 1 if
-      you have have very large machines and would like multiple Spark worker processes. If you do set
-      this, make sure to also set <code>SPARK_WORKER_CORES</code> explicitly to limit the cores per worker,
-      or else each worker will try to use all the cores.
-    </td>
-  </tr>
-  <tr>
     <td><code>SPARK_WORKER_DIR</code></td>
     <td>Directory to run applications in, which will include both logs and scratch space (default: SPARK_HOME/work).</td>
   </tr>


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
