Thanks for the reply. Yeah, I found the same doc and am able to use multiple
cores in spark-shell. However, when I use pyspark, it appears to use only
one core, so I am wondering whether this is something I didn't configure
correctly or whether it is even supported in pyspark.
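For reference, this is roughly how I check from inside the shell (a minimal
sketch; local[4] is just an example value):

    # inside an interactive session started with, e.g.: pyspark --master local[4]
    sc.master               # should report 'local[4]'
    sc.defaultParallelism   # number of cores Spark uses for default partitioning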
On Fri, Mar 24, 2017 at 3:52 PM,
In local mode all processes are executed inside a single JVM. The
application is started in local mode by setting master to local, local[*] or
local[n]. Settings such as spark.executor.cores are not applicable in local
mode because there is only one embedded executor.
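A minimal sketch of setting the master explicitly from Python (assuming a
plain pyspark installation; local[4] is just an example value):

    from pyspark import SparkConf, SparkContext

    # request four local threads; local[*] would use all available cores
    conf = SparkConf().setMaster("local[4]").setAppName("local-cores-demo")
    sc = SparkContext(conf=conf)
    print(sc.defaultParallelism)  # reports 4 under local[4]
    sc.stop()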
In Standalone