something similar to what Spark Streaming did.
From: cbowden <cbcweb...@gmail.com>
Sent: Thursday, August 24, 2017 7:01 PM
To: user@spark.apache.org
Subject: Re: [Streaming][Structured Streaming] Understanding dynamic allocation in streaming jobs
You can le
requirements:
- My source's underlying RDD backing the DataFrame returned by getBatch is volatile, e.g. the number of partitions can change from batch to batch
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-Structured-Streaming-Understanding-dynamic-allocation-in-streaming-jobs
I'm trying to understand dynamic allocation in Spark Streaming and Structured
Streaming. It seems that if you set spark.dynamicAllocation.enabled=true, both
frameworks use Core's dynamic allocation algorithm -- request executors when the
task backlog has been pending for a certain time, and remove executors when they
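For reference, the Core dynamic-allocation knobs the question refers to are ordinary Spark configuration properties. A minimal spark-defaults.conf sketch (the values here are illustrative, not recommendations):

```
# Enable Core dynamic allocation; both Spark Streaming and
# Structured Streaming pick this up when it is set
spark.dynamicAllocation.enabled                 true
# Request new executors once tasks have been backlogged this long
spark.dynamicAllocation.schedulerBacklogTimeout 1s
# Release executors that have been idle for this long
spark.dynamicAllocation.executorIdleTimeout     60s
# Bound the executor count the allocator may scale between
spark.dynamicAllocation.minExecutors            1
spark.dynamicAllocation.maxExecutors            20
```

Note that the legacy DStream engine also has its own separate mechanism, spark.streaming.dynamicAllocation.enabled, which scales on processing-time-to-batch-interval ratio rather than task backlog; the two must not be enabled together.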