Hi,

I gave my Spark job 16 GB of memory, and it is running on 8 executors.

The job needs more memory due to the ALS requirements (a 20M x 1M matrix).
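For scale, here is a rough back-of-envelope for the ALS factor matrices alone (the rank and precision below are my assumptions, not numbers from the actual job):

```shell
# Rough footprint of the ALS factor matrices for a 20M x 1M rating matrix.
# Assumptions: rank 50, 8-byte doubles; ignores the ratings themselves,
# JVM overhead, and any intermediate copies made during the solve.
users=20000000
items=1000000
rank=50
bytes_per_double=8
total=$(( (users + items) * rank * bytes_per_double ))
echo "factor matrices: ${total} bytes"   # 8400000000 bytes, ~8.4 GB
```

So even one full copy of the factors is in the multi-GB range before counting shuffle and GC headroom.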

Each node has 96 GB of memory, of which I am currently using 16 GB. I want to increase the memory, but I am not sure of the right way to do that...

With 8 executors, giving each one 96 GB might cause issues due to GC...

Ideally, on 8 nodes I would run 48 executors, with each executor getting 16 GB of memory: 48 JVMs in total.

Is it possible to increase the number of executors per node?
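For concreteness, here is roughly the layout I have in mind, written as a spark-submit invocation (this assumes YARN mode, where `--num-executors` is available; the class name and jar are placeholders for my actual job):

```shell
# Intended layout: 8 nodes x 6 executors x 16 GB = 48 JVMs in total.
# Assumes YARN deployment; --num-executors is a YARN-mode flag.
# com.example.ALSJob and als-job.jar are placeholders, not real artifacts.
spark-submit \
  --master yarn \
  --num-executors 48 \
  --executor-memory 16g \
  --executor-cores 2 \
  --class com.example.ALSJob \
  als-job.jar
```

If this only works on YARN and not in standalone mode, that would be good to know too.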

Thanks.
Deb
