yutoacts commented on a change in pull request #33537:
URL: https://github.com/apache/spark/pull/33537#discussion_r681376487



##########
File path: docs/submitting-applications.md
##########
@@ -162,9 +162,10 @@ The master URL passed to Spark can be in one of the following formats:
 <tr><th>Master URL</th><th>Meaning</th></tr>
 <tr><td> <code>local</code> </td><td> Run Spark locally with one worker thread (i.e. no parallelism at all). </td></tr>
 <tr><td> <code>local[K]</code> </td><td> Run Spark locally with K worker threads (ideally, set this to the number of cores on your machine). </td></tr>
-<tr><td> <code>local[K,F]</code> </td><td> Run Spark locally with K worker threads and F maxFailures (see <a href="configuration.html#scheduling">spark.task.maxFailures</a> for an explanation of this variable) </td></tr>
+<tr><td> <code>local[K,F]</code> </td><td> Run Spark locally with K worker threads and F maxFailures (see <a href="configuration.html#scheduling">spark.task.maxFailures</a> for an explanation of this variable). </td></tr>
 <tr><td> <code>local[*]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine.</td></tr>
 <tr><td> <code>local[*,F]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine and F maxFailures.</td></tr>
+<tr><td> <code>local-cluster[N,C,M]</code> </td><td> Run a Spark cluster locally with N workers, C cores per worker, and M MiB of memory per worker (only for unit testing purposes).</td></tr>

Review comment:
       Thanks for the review. I just fixed it so it now explicitly says that local-cluster mode is only for unit testing.
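
       For reference, a minimal Scala sketch of what using these master URLs looks like in application code (the object name, app name, and resource values below are illustrative, not from this PR; local-cluster mode also typically assumes a locally built Spark distribution is available):

           import org.apache.spark.sql.SparkSession

           object LocalClusterExample {
             def main(args: Array[String]): Unit = {
               // local-cluster[N,C,M]: N workers, C cores per worker, M MiB of
               // memory per worker. Per the doc change above, unit testing only.
               val spark = SparkSession.builder()
                 .master("local-cluster[2,1,1024]") // illustrative values
                 .appName("local-cluster-example")  // hypothetical app name
                 .getOrCreate()

               // Trivial job to confirm the local cluster's executors came up.
               val n = spark.sparkContext.parallelize(1 to 100).count()
               println(s"count = $n")

               spark.stop()
             }
           }

       The same builder call accepts the other master URLs from the table, e.g. .master("local[4]") or .master("local[4,2]").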




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


