Hi There,

I am new to Spark, and I was wondering: when each machine in the cluster
has a large amount of memory, is it better to run multiple workers with
limited memory on each machine, or a single worker with access to most of
the machine's memory? If the answer is "it depends", could you please
elaborate on what it depends on?
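For context, here is how the two setups would look in Spark standalone mode via conf/spark-env.sh. SPARK_WORKER_INSTANCES and SPARK_WORKER_MEMORY are real standalone-mode settings; the specific values are just an illustration for a hypothetical 64 GB machine, not a recommendation:

```shell
# conf/spark-env.sh -- illustrative values for a 64 GB machine

# Option A: a single worker managing most of the machine's memory
SPARK_WORKER_INSTANCES=1
SPARK_WORKER_MEMORY=56g

# Option B: several smaller workers sharing the same machine
# SPARK_WORKER_INSTANCES=4
# SPARK_WORKER_MEMORY=14g
```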

Thanks,
Mike
