Not a silly question at all; it's actually a really good observation. I 
think we didn't update the ml_ops.sh script correctly when we added these 
parameters.

What we could start discussing is whether we want dynamic allocation or a 
fixed number of executors. I'll leave the mic open to see what people 
think about this.

Thanks Giacomo.

On 4/3/17, 3:29 AM, "Giacomo Bernardi" <[email protected]> wrote:

    Hi,
    hope this is not a silly question. In ml_ops.sh there are:
      --num-executors ${SPK_EXEC} \
    and:
      --conf spark.dynamicAllocation.enabled=true \
    
    which trigger the warning:
      WARN spark.SparkContext: Dynamic Allocation and num executors both
    set, thus dynamic allocation disabled.
    
    Shouldn't we remove the "--num-executors" and add instead:
      --conf spark.dynamicAllocation.maxExecutors=${SPK_EXEC} \
    ?
    
    Thanks.
    Giacomo
    
