Hi,

If I submit the same job to Spark in cluster mode, does that mean the driver
will run in the cluster's memory pool, and the job will fail if it exhausts
the cluster's available memory?

--driver-memory 64g \
--executor-memory 16g \

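For context, here is a sketch of how those flags might fit into a full
spark-submit invocation in cluster mode. The master URL and application JAR
are placeholders, not from the original message; in YARN cluster mode the
driver itself runs in a container on the cluster, so --driver-memory is
allocated from the cluster's resources rather than the submitting machine:

```shell
# Hypothetical spark-submit in YARN cluster mode. With --deploy-mode cluster,
# the driver runs inside the cluster, so its 64g heap comes out of the
# cluster's memory pool; if YARN cannot allocate it, the job fails to launch.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 64g \
  --executor-memory 16g \
  my-app.jar   # placeholder application JAR
```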
Regards
