Hi,

1. Dynamic allocation is currently only supported with YARN, correct?
2. In Spark Streaming, is it possible to change the number of executors while an application is running? If so, can the allocation be controlled by the application itself, rather than by a predefined automatic policy? In other words, I want to be able to request more executors or decommission executors on demand. Is there some way to achieve this? A sketch of what I have in mind follows below.
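
To make question 2 concrete, here is a minimal sketch of the kind of application-driven scaling I mean. I am assuming the @DeveloperApi methods SparkContext.requestExecutors and SparkContext.killExecutors are the intended hooks for this; the socket source, the batch-size thresholds, and the executor id are just placeholders, not a real policy.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object ManualScalingSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("manual-scaling-sketch")
        val ssc = new StreamingContext(conf, Seconds(10))
        val sc = ssc.sparkContext

        // Placeholder input stream; the actual source doesn't matter for the question.
        val lines = ssc.socketTextStream("localhost", 9999)

        lines.foreachRDD { rdd =>
          val batchSize = rdd.count()
          if (batchSize > 100000L) {
            // Ask the cluster manager for two more executors.
            sc.requestExecutors(2)
          } else if (batchSize < 1000L) {
            // Give one executor back; "2" is a placeholder id here,
            // I don't know the right way to pick which executor to release.
            sc.killExecutors(Seq("2"))
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

Is something along these lines supported, and does it only work when dynamic allocation / YARN is in play?

Thanks.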