I am a Spark newbie and I use Python (PySpark). I am trying to run a
program on a 64-core system, but no matter what I do, it always uses only
one core. It makes no difference whether I run it with "spark-submit
--master local[64] run.sh" or call x.repartition(64) on an RDD in my code;
the Spark program always uses a single core. Does anyone have experience
successfully running Spark programs on multicore processors? Can someone
provide a very simple example that properly runs on all cores of a
multicore system?
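
For reference, below is a minimal sketch of the kind of job I would expect
to spread across cores; the function name, data size, and partition count
are just placeholders I made up for illustration:

    # Minimal PySpark job that should be able to use multiple cores
    # (slow_square, the range size, and numSlices are arbitrary examples).
    from pyspark import SparkConf, SparkContext

    def slow_square(x):
        # CPU-bound work so any parallelism is actually visible in top/htop
        total = 0
        for _ in range(100000):
            total += x * x
        return total

    if __name__ == "__main__":
        conf = SparkConf().setAppName("multicore-test").setMaster("local[64]")
        sc = SparkContext(conf=conf)
        # 64 partitions so the work can be split across the local threads
        rdd = sc.parallelize(range(1000), numSlices=64)
        result = rdd.map(slow_square).count()
        print(result)
        sc.stop()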
