I have a Spark Streaming application that processes three Kafka streams and has five output operations.
I'm not sure what `spark.streaming.concurrentJobs` should be set to.

1. If `concurrentJobs` is set to 4, does that mean the remaining output operations will run sequentially?
2. If I had 6 cores, what would be an ideal setting for `concurrentJobs` in this situation?

I appreciate your input. Thanks.
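For context, here is roughly how I'm applying the setting — the app name, batch interval, and value are illustrative placeholders, not my real configuration:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Illustrative sketch: app name, batch interval, and the value 4 are placeholders.
val conf = new SparkConf()
  .setAppName("MyStreamingApp")
  // Number of streaming jobs (one per output operation per batch)
  // that may be scheduled concurrently.
  .set("spark.streaming.concurrentJobs", "4")

val ssc = new StreamingContext(conf, Seconds(10))
```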