Hi, I see that the following JIRA is resolved in Spark 2.0: https://issues.apache.org/jira/browse/SPARK-12133, which is supposed to add dynamic resource allocation support to Spark Streaming.
I also see the JIRA https://issues.apache.org/jira/browse/SPARK-22008, which is about fixing an executor-count issue in dynamic allocation for Spark Streaming. But when I check http://spark.apache.org/docs/2.1.1/configuration.html (or the same page for 2.2), I don't see the configuration parameter spark.streaming.dynamicAllocation.enabled. Is this feature present at all in Spark 2.0? Or is setting spark.dynamicAllocation.enabled good enough?

Regards,
Sourav
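P.S. For context, this is the kind of spark-submit invocation I have in mind. The streaming-specific property names are the ones I found referenced around SPARK-12133, not in the published configuration page, so I am not sure they are supported; the class and jar names below are just placeholders.

```shell
# Sketch only: spark.streaming.dynamicAllocation.* names come from SPARK-12133
# and are not listed on the 2.1.1/2.2 configuration pages, so they may be
# undocumented or unsupported. MyStreamingApp / my-streaming-app.jar are placeholders.
spark-submit \
  --conf spark.streaming.dynamicAllocation.enabled=true \
  --conf spark.streaming.dynamicAllocation.minExecutors=2 \
  --conf spark.streaming.dynamicAllocation.maxExecutors=10 \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```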