Hey,

I was wondering whether it is possible to tune the number of jobs generated by
Spark SQL. Currently my query generates over 80 "runJob at SparkPlan.scala:122"
jobs; each one executes in about 4 seconds and contains only 5 tasks.
As a result, most of my cores sit idle.
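
For reference, here is a minimal sketch of the kind of setting I have been
looking at, assuming `spark.sql.shuffle.partitions` is the relevant knob
(the app name and partition count below are just placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("tune-jobs"))
val sqlContext = new SQLContext(sc)

// Controls how many partitions (and hence tasks) each shuffle stage of a
// Spark SQL query uses; the default is 200.
sqlContext.setConf("spark.sql.shuffle.partitions", "48")
```

Is this the right setting for controlling the per-job task count, or is the
number of jobs itself determined by something else in the query plan?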
