Hi All,

I have a setup consisting of 8 small machines (1 core, 8G RAM each) and 1 large machine (8 cores, 100G RAM). Is there a way to make Spark run multiple executors on the large machine and a single executor on each of the small machines?
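Something along these lines on the large machine only is what I had in mind (assuming the standalone deploy mode; SPARK_WORKER_INSTANCES / SPARK_WORKER_CORES / SPARK_WORKER_MEMORY are my reading of the standalone docs, so please correct me if they don't behave the way I expect):

    # spark-env.sh on the large machine only -- a rough sketch
    export SPARK_WORKER_INSTANCES=8   # start 8 worker daemons on this host
    export SPARK_WORKER_CORES=1       # 1 core per worker
    export SPARK_WORKER_MEMORY=12g    # ~12 GB per worker out of the 100 GB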

Alternatively, is it possible to run a single executor that uses all cores and all available memory on the large machine, alongside executors with less memory on the smaller machines?

I tried configuring spark-env.sh on the large machine, but the -Xmx value passed to the executor JVMs ends up being the same across the entire cluster.
Is there any way to configure -Xmx separately for each machine?
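In other words, I'd like each machine's spark-env.sh to carry its own memory setting, something like the sketch below (this assumes the per-worker memory setting is what ultimately drives -Xmx on that host, and that assumption may be exactly where I'm going wrong):

    # spark-env.sh on each small machine (1 core, 8G) -- sketch only
    export SPARK_WORKER_CORES=1
    export SPARK_WORKER_MEMORY=6g

    # spark-env.sh on the large machine (8 cores, 100G) -- sketch only
    export SPARK_WORKER_CORES=8
    export SPARK_WORKER_MEMORY=90g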


Thanks,

Yadid
