Does spark.streaming.concurrentJobs still exist?

spark.streaming.concurrentJobs (default: 1) is the number of concurrent
jobs, i.e. the number of threads in the streaming-job-executor thread pool
<https://github.com/jaceklaskowski/spark-streaming-notebook/blob/master/spark-streaming-jobscheduler.adoc#streaming-job-executor>.
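
If it still exists, I assume it would be set on the SparkConf before the
StreamingContext is created, something like the sketch below (the app name
and batch interval are arbitrary placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Sketch, assuming the property is still read from SparkConf at startup
    // to size the driver-side streaming-job-executor thread pool.
    val conf = new SparkConf()
      .setAppName("concurrent-jobs-demo")          // placeholder app name
      .set("spark.streaming.concurrentJobs", "2")  // let 2 batch jobs run at once

    val ssc = new StreamingContext(conf, Seconds(5))  // placeholder 5s batches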

Also, how is this setting different from executor-cores?
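
For contrast, by executor-cores I mean the per-executor task-slot setting,
which, as I understand it, caps how many tasks one executor runs in
parallel rather than how many streaming jobs the driver schedules at once:

    import org.apache.spark.SparkConf

    // For contrast: spark.executor.cores bounds task parallelism within
    // each executor; it says nothing about concurrent streaming jobs.
    val execConf = new SparkConf()
      .set("spark.executor.cores", "4")  // 4 task slots per executor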
