In Local Mode, all processes are executed inside a single JVM.
The application is started in local mode by setting the master to local, local[*], or
local[n].
Executor settings such as spark.executor.cores are not applicable in local
mode because there is only one embedded executor; the degree of parallelism is
controlled by the thread count given in local[n].
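For instance, a minimal PySpark sketch (the app name "LocalExample" and the
thread count 4 are just illustrative):

from pyspark.sql import SparkSession

# local    -> 1 thread
# local[*] -> one thread per available core
# local[n] -> n threads
spark = (SparkSession.builder
         .master("local[4]")
         .appName("LocalExample")
         .getOrCreate())

print(spark.sparkContext.defaultParallelism)  # 4 in this example
spark.stop()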


In Standalone mode, you need a standalone Spark
cluster <https://spark.apache.org/docs/latest/spark-standalone.html>.

It requires a master node (can be started using the SPARK_HOME/sbin/start-master.sh
script) and at least one worker node (can be started using the
SPARK_HOME/sbin/start-slave.sh script). SparkConf should be created with the
master node address (spark://host:port).
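
For instance, a minimal PySpark sketch, assuming a standalone master is already
running; the address spark://host:7077, the app name, and the core settings
below are placeholders:

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setMaster("spark://host:7077")     # placeholder master address (7077 is the default port)
        .setAppName("StandaloneExample")    # illustrative app name
        .set("spark.executor.cores", "2")   # cores per executor on each worker
        .set("spark.cores.max", "4"))       # total cores the application may use

sc = SparkContext(conf=conf)
print(sc.master)
sc.stop()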

Thanks!

Gangadhar
From: Li Jin <ice.xell...@gmail.com>
Date: Friday, March 24, 2017 at 3:43 PM
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: EXT: Multiple cores/executors in Pyspark standalone mode

Hi,

I am wondering, does pyspark standalone (local) mode support multiple
cores/executors?

Thanks,
Li

