Also, the level of parallelism can be limited by the size of your input (a small input is split into only a few partitions).
Could this be a problem in your case?
On Sunday, November 9, 2014, Aaron Davidson ilike...@gmail.com wrote:
oops, meant to cc userlist too
On Sat, Nov 8, 2014 at 3:13 PM, Aaron Davidson ilike...@gmail.com wrote:
Try adding the following entry inside your conf/spark-defaults.conf file
spark.cores.max 64
Thanks
Best Regards
On Sun, Nov 9, 2014 at 3:50 AM, Blind Faith person.of.b...@gmail.com
wrote:
I am a Spark newbie and I use python (pyspark). I am trying to run a
program on a 64-core system, but no matter what I do, it always uses 1
core. It doesn't matter whether I run it using spark-submit --master
local[64] run.sh or call x.repartition(64) on an RDD in my code; the
Spark program always uses a single core.