Hi -
Does anybody have any ideas on how to dynamically allocate cores, instead of statically partitioning them among multiple applications?
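For concreteness, here is roughly the behavior I'm after, sketched with the spark.dynamicAllocation.* settings (my understanding is that these currently only apply on YARN, not standalone, and the executor counts below are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("dynamic-core-sharing")
  // grow and shrink this app's executors with its workload,
  // rather than pinning a fixed core count per application
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "1")  // floor while idle
  .set("spark.dynamicAllocation.maxExecutors", "8")  // ceiling under load
  // external shuffle service, so shuffle files survive executor removal
  .set("spark.shuffle.service.enabled", "true")
val sc = new SparkContext(conf)

Is there a way to get the same elastic behavior for cores in standalone mode? Thanks.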
Mohammed
From: Mohammed Guller
Sent: Friday, December 5, 2014 11:26 PM
To: user@spark.apache.org
Subject: Fair scheduling across applications
Hi -
I understand that one can use spark.deploy.defaultCores and spark.cores.max
to assign a fixed number of worker cores to each application. However, instead of
assigning the cores statically, I would like Spark to assign the
cores to multiple apps dynamically. For example, when only one application is running, it should be able to use all the available cores, and when a second application is submitted, the cores should be rebalanced between the two.
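To make the static setup concrete, here is roughly what each application does today (the core counts are just examples):

import org.apache.spark.{SparkConf, SparkContext}

// Each application caps its own share with spark.cores.max; apps that
// set no cap fall back to spark.deploy.defaultCores, which is set on
// the standalone master, e.g. in conf/spark-env.sh:
//   SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=4"
val conf = new SparkConf()
  .setAppName("static-core-cap")
  .set("spark.cores.max", "4") // this app never takes more than 4 cores
val sc = new SparkContext(conf)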