Hi,

I am launching a Spark program programmatically (client mode) to run on a Hadoop/YARN cluster.
Whenever I check the Executors tab of the Spark UI, the driver always shows 0
vcores. I tried to change that by setting *spark.driver.cores*, and also
*spark.yarn.am.cores*, in the SparkSession configuration, but in vain. I also
tried setting those parameters in spark-defaults.conf, again with no success.
Note that the Environment tab does display the expected values.
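
For reference, here is roughly how I build the session (a minimal sketch; the
app name and core counts stand in for my real values):

import org.apache.spark.sql.SparkSession

// Minimal sketch of the launch code (client mode on YARN).
// App name and core counts are placeholders.
val spark = SparkSession.builder()
  .appName("MyApp")
  .master("yarn") // deploy mode defaults to client when launched in-process
  .config("spark.driver.cores", "2")  // driver still shows 0 vcores in the UI
  .config("spark.yarn.am.cores", "2") // also tried for the YARN AM in client mode
  .getOrCreate()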

Could this (the driver not having enough CPU) be the reason for a
*collectAsList* call to freeze the execution?
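
For context, the call in question looks roughly like this (the query is a
placeholder for my actual one):

import org.apache.spark.sql.Row

// Sketch of the freezing call; collectAsList() materializes the
// entire result of the Dataset on the driver as a java.util.List.
val rows: java.util.List[Row] = spark
  .sql("SELECT * FROM some_table")
  .collectAsList()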
