RE: Fair scheduling across applications in standalone mode

2014-12-08 Thread Mohammed Guller
Hi - Does anybody have any ideas on how to dynamically allocate cores instead of statically partitioning them among multiple applications? Thanks.

Mohammed

From: Mohammed Guller
Sent: Friday, December 5, 2014 11:26 PM
To: user@spark.apache.org
Subject: Fair scheduling across applications
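
[Editor's note: for readers landing on this thread later, later Spark releases expose dynamic executor allocation through the spark.dynamicAllocation.* settings; in standalone mode this also requires the external shuffle service to be running on each worker. The sketch below shows those settings in a SparkConf; the app name is a hypothetical placeholder, and whether these options apply depends on the Spark version in use.]

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: let Spark grow and shrink this app's executors dynamically
// instead of pinning a fixed core count. Standalone mode additionally
// needs the external shuffle service started on every worker.
val conf = new SparkConf()
  .setAppName("dynamic-cores-demo")                  // hypothetical app name
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")      // required for dynamic allocation
  .set("spark.dynamicAllocation.minExecutors", "1")  // floor when the app is idle
  .set("spark.dynamicAllocation.maxExecutors", "10") // ceiling under heavy load

val sc = new SparkContext(conf)
```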

Fair scheduling across applications in standalone mode

2014-12-05 Thread Mohammed Guller
Hi - I understand that spark.deploy.defaultCores and spark.cores.max can be used to assign a fixed number of worker cores to different apps. However, instead of assigning cores statically, I would like Spark to assign them dynamically across multiple apps. For example, when there is a
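
[Editor's note: a minimal sketch of the static configuration the message describes. spark.cores.max caps the cores a single app may take cluster-wide, while spark.deploy.defaultCores (set on the master, not per app) is the fallback cap for apps that set no limit of their own. The master URL and app name below are hypothetical placeholders.]

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: statically cap this application's share of standalone-cluster cores.
val conf = new SparkConf()
  .setMaster("spark://master-host:7077") // hypothetical master URL
  .setAppName("static-cores-demo")       // hypothetical app name
  .set("spark.cores.max", "8")           // this app never takes more than 8 cores

val sc = new SparkContext(conf)
```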