spark.streaming.concurrentJobs may help, though it's experimental according to TD in an older thread here: http://stackoverflow.com/questions/23528006/how-jobs-are-assigned-to-executors-in-spark-streaming
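If you want to try it, the setting is just a plain config key, so something like the following sketch should apply (untested; since the option is experimental and undocumented, be aware that concurrent batches may complete out of order):

```scala
// Sketch only: spark.streaming.concurrentJobs is experimental and
// undocumented, so behavior may differ across Spark versions.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("ConcurrentBatches")
  // Allow up to 3 streaming batch jobs to run concurrently instead of
  // the default of 1 (strictly one batch at a time).
  .set("spark.streaming.concurrentJobs", "3")

val ssc = new StreamingContext(conf, Seconds(10))
```

The same thing can be passed at submit time with `--conf spark.streaming.concurrentJobs=3` instead of hard-coding it.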
On Sat, Feb 20, 2016 at 11:24 AM, Jorge Rodriguez <jo...@bloomreach.com> wrote:
>
> Is it possible to have the scheduler schedule the next batch even if the
> previous batch has not completed yet? I'd like to schedule up to 3 batches
> at the same time if this is possible.
>