Re: dynamic allocation in spark-shell

2019-05-31 Deepak Sharma
You can start spark-shell with these properties:

  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=5
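For reference, here is what the full invocation could look like end to end. This is a sketch rather than a tested command: it assumes a Spark 2.x YARN cluster where the external shuffle service is running on the NodeManagers (spark.shuffle.service.enabled=true is required for dynamic allocation there), it reuses the queue name and per-executor sizing from the original question, and it drops --num-executors, since pinning a fixed executor count can override dynamic allocation:

  ./spark/bin/spark-shell \
    --master yarn \
    --deploy-mode client \
    --queue myqueue \
    --driver-memory 10g \
    --executor-memory 15g \
    --executor-cores 4 \
    --conf spark.shuffle.service.enabled=true \
    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.dynamicAllocation.initialExecutors=2 \
    --conf spark.dynamicAllocation.minExecutors=2 \
    --conf spark.dynamicAllocation.maxExecutors=100 \
    --conf spark.dynamicAllocation.executorIdleTimeout=60s

maxExecutors=100 matches the ceiling from the original request, and executorIdleTimeout=60s just restates the default: idle executors are handed back to the queue after a minute.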

dynamic allocation in spark-shell

2019-05-30 Qian He
Sometimes it's convenient to start a spark-shell on the cluster, like:

  ./spark/bin/spark-shell --master yarn --deploy-mode client \
    --num-executors 100 --executor-memory 15g --executor-cores 4 \
    --driver-memory 10g --queue myqueue

However, with a command like this, the allocated resources stay occupied for the entire shell session, even while the shell sits idle.
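To see the problem, you can watch the executor count from inside the shell itself (a quick sanity check using SparkContext.getExecutorMemoryStatus, which maps each registered executor to its memory status and also includes the driver's own entry; the output shown is illustrative):

  scala> // count registered executors; subtract 1 for the driver's entry
  scala> sc.getExecutorMemoryStatus.size - 1
  res0: Int = 100

With a static --num-executors 100 this number stays at 100 for the lifetime of the shell, even when no job is running; with dynamic allocation it falls back toward minExecutors once executors have been idle past the timeout.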