Hi folks,

When running Spark on Kubernetes, is it possible to use dynamic allocation? Some blog posts <https://spot.io/blog/setting-up-managing-monitoring-spark-on-kubernetes/> mention that dynamic allocation is available, but I am not sure how it works, since the official Spark docs <https://spark.apache.org/docs/latest/running-on-kubernetes.html#future-work> say that an external shuffle service is not yet available on Kubernetes.
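For context, this is the kind of configuration I would expect to need. This is only a sketch based on my reading, assuming the shuffle-tracking mechanism (added in Spark 3.0 as a substitute for the external shuffle service) is the intended path on Kubernetes:

```shell
# Sketch of a spark-submit invocation with dynamic allocation on Kubernetes.
# Assumes Spark 3.0+; shuffleTracking replaces the external shuffle service,
# which is not available on Kubernetes.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  local:///opt/spark/examples/src/main/python/pi.py
```

Is this roughly right, or is a real shuffle service still required for executors to be decommissioned safely?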
Thanks,
Nikhil