Hi,

The submitting applications guide in 
http://spark.apache.org/docs/latest/submitting-applications.html says:


Alternatively, if your application is submitted from a machine far from the
worker machines (e.g. locally on your laptop), it is common to use cluster mode
to minimize network latency between the drivers and the executors. Note that
cluster mode is currently not supported for standalone clusters, Mesos
clusters, or python applications.
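
For comparison, my understanding is that a client-mode submission of the same
example would look roughly like the sketch below (the master URL is just a
placeholder, and --supervise is left out since it only applies to cluster
mode):

# Run the same example in client deploy mode (driver stays on the submitting machine)
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://<master-host>:7077 \
  --deploy-mode client \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000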




But the page then gives the example below. Is this an error in the docs, or is
"cluster" mode actually supported for standalone clusters?


# Run on a Spark Standalone cluster in cluster deploy mode with supervise
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000
