Hi twinkle,
To be completely honest, I'm not sure; I had never heard of spark.task.cpus
before. But I can imagine two different use cases:
a) instead of just relying on Spark's creation of tasks for parallelism, a
user wants to run multiple threads *within* a task. This is sort of going
against
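To illustrate use case (a): a minimal sketch of multithreading *within* a partition, in plain Python so the pattern is visible without a Spark cluster. The function and record names here are hypothetical; in Spark you would pass process_partition to rdd.mapPartitions() and set spark.task.cpus to the thread count so the scheduler accounts for the extra cores.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-record work, standing in for a blocking external
# API call or other I/O-bound operation.
def call_external_api(record):
    return record * 2  # placeholder for real work

# The kind of function you would hand to rdd.mapPartitions(): it fans
# each partition's records out over its own thread pool. If each task
# spawns num_threads threads, spark.task.cpus should match so Spark
# does not oversubscribe the executor's cores.
def process_partition(records, num_threads=4):
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        # pool.map preserves input order
        yield from pool.map(call_external_api, records)

result = list(process_partition(range(5)))
# result == [0, 2, 4, 6, 8]
```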
Hi,
In Spark, there are two settings regarding the number of cores. One is at the
task level: spark.task.cpus
The other drives the number of cores per executor:
spark.executor.cores
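For context, a minimal sketch of how the two settings interact: the ratio spark.executor.cores / spark.task.cpus caps how many tasks run concurrently on each executor. The application name my_app.py is a placeholder.

```shell
# Each executor gets 4 cores; each task claims 2 of them, so at most
# 4 / 2 = 2 tasks run concurrently per executor.
spark-submit \
  --conf spark.executor.cores=4 \
  --conf spark.task.cpus=2 \
  my_app.py
```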
Apart from using more than one core for a task that has to call some other
external API etc., is there