Hi:
I am working on Spark Structured Streaming (2.2.1) with Kafka and want 100 
executors to stay alive. I set spark.executor.instances to 100. The process 
starts running with 100 executors, but after some time only a few remain, 
which causes a backlog of events from Kafka.
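
For reference, this is roughly how the job requests the executors (a minimal 
sketch; the app name is a placeholder, and the Kafka source options are 
omitted):

    import org.apache.spark.sql.SparkSession

    // Request a fixed 100 executors for the streaming job.
    // Note: this only sets the requested count. If dynamic allocation is
    // enabled on the cluster, executors that sit idle past the idle timeout
    // can still be released.
    val spark = SparkSession.builder()
      .appName("kafka-structured-streaming")          // placeholder name
      .config("spark.executor.instances", "100")
      .getOrCreate()
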
I thought I saw a setting that keeps executors from being killed, but I am 
not able to find that configuration in the Spark docs. If anyone knows that 
setting, please let me know.
Thanks
