I tried to submit a job with --conf "spark.cores.max=6"
or --total-executor-cores 6 on a standalone cluster, but I don't see more
than one executor on each worker. How can I get multiple executors
per worker when submitting a job?
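For reference, the full submission looks roughly like this (the master URL,
class name, and application jar below are placeholders, not my actual values):

```shell
# Sketch of the spark-submit invocation; master URL and jar are placeholders.
# Note: --total-executor-cores 6 is equivalent to setting spark.cores.max=6,
# so passing both is redundant.
spark-submit \
  --master spark://master-host:7077 \
  --conf "spark.cores.max=6" \
  --total-executor-cores 6 \
  --class com.example.MyApp \
  my-app.jar
```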

Thanks
larry
