Hi,

I was looking at the Spark source code, and I found that when launching an
Executor, Spark actually starts a thread pool; each time the scheduler
launches a task, the executor runs it on a thread from that pool.
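To make sure I understand the mechanism, here is a minimal sketch (my own illustration, not Spark's actual code — the class name `TaskPoolSketch` and the helper `runTasks` are made up): a cached thread pool that spawns worker threads on demand as tasks are submitted, similar in spirit to how I read the executor's task-launch pool.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class TaskPoolSketch {
    // Submit numTasks trivial "tasks" to a cached pool and return how many ran.
    // A cached pool creates threads lazily, one per pending task, and reuses
    // idle ones -- it does not pre-launch a fixed number of threads.
    static int runTasks(int numTasks) throws InterruptedException {
        ExecutorService pool = Executors.newCachedThreadPool();
        AtomicInteger completed = new AtomicInteger(0);
        for (int i = 0; i < numTasks; i++) {
            pool.submit(completed::incrementAndGet);
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runTasks(4)); // prints 4
    }
}
```

If that reading is right, the number of task-launch worker threads should track the number of concurrently running tasks, not a fixed count.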

However, I also noticed that the Spark process always has approximately 40
threads running, regardless of my configuration (SPARK_WORKER_CORES,
SPARK_WORKER_INSTANCES, --executor-cores, --total-executor-cores, etc.).
Does this mean Spark pre-launches around 40 threads even before any tasks
are launched? Many thanks!

Best,
Ray
