Hi,
I am using Spark 1.5.2 on YARN (client mode) and was trying out dynamic resource allocation, but it seems that once it is enabled by the first application, every following application is managed that way, even if it explicitly disables it.
Example:
1) YARN is configured with org.apache.spark.network.yarn.YarnShuffleService as the spark_shuffle aux service class (yarn-site.xml excerpt below).
2) The first app doesn't specify dynamic allocation or the shuffle service - it runs with static executors as expected.
3) A second application enables spark.dynamicAllocation.enabled and spark.shuffle.service.enabled - it scales dynamically as expected.
4) Another app that doesn't enable them, and even explicitly disables dynamic allocation and the shuffle service, still has its executors added and removed dynamically throughout the run (see the submit commands sketched below).
5) Restarting the NodeManagers resets this.
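
For reference, the aux service registration in yarn-site.xml looks roughly like this (the mapreduce_shuffle entry is just whatever was already configured on the cluster):

    <property>
      <name>yarn.nodemanager.aux-services</name>
      <value>mapreduce_shuffle,spark_shuffle</value>
    </property>
    <property>
      <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
      <value>org.apache.spark.network.yarn.YarnShuffleService</value>
    </property>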
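
And the submissions for steps 2-4 were along these lines (app.jar, --class and the executor counts are placeholders, not the real job):

    # step 2: static executors, no dynamic allocation settings
    spark-submit --master yarn-client --num-executors 4 app.jar

    # step 3: dynamic allocation enabled for this app only
    spark-submit --master yarn-client \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      app.jar

    # step 4: explicitly disabled, yet executors still come and go dynamically
    spark-submit --master yarn-client --num-executors 4 \
      --conf spark.dynamicAllocation.enabled=false \
      --conf spark.shuffle.service.enabled=false \
      app.jar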
Is this a known issue, or have I missed something? Can dynamic resource allocation be enabled per application?
Thanks,
Antony.