Hey,

I've tried to get the new spark.dynamicAllocation.enabled feature working on
YARN (Hadoop 2.2), but haven't been successful so far. I've tested with the
following settings:

      val conf = new SparkConf()
        .set("spark.dynamicAllocation.enabled", "true")
        .set("spark.shuffle.service.enabled", "true")
        .set("spark.dynamicAllocation.minExecutors", "10")
        .set("spark.dynamicAllocation.maxExecutors", "700")

The app works fine on Spark 1.2 when dynamic allocation is not enabled. With
the settings above, the app starts and the first job shows up in the web UI,
but no tasks are ever started and it seems to be stuck forever waiting for
containers to be allocated.

Any help would be appreciated. Do I need to do something specific to get the
external YARN shuffle service running in the NodeManager?
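
From reading the dynamic allocation docs, my guess is that yarn-site.xml on
each NodeManager needs something along these lines (the spark_shuffle name
and class are just what I pieced together, so please correct me if they're
wrong):

      <!-- my guess at the NodeManager aux-service config for the Spark shuffle service -->
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle,spark_shuffle</value>
      </property>
      <property>
        <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
        <value>org.apache.spark.network.yarn.YarnShuffleService</value>
      </property>

I'm also assuming the spark-<version>-yarn-shuffle.jar has to be on the
NodeManager classpath and the NodeManagers restarted afterwards, but I
haven't verified any of this.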

TIA,
Anders
