Newbie question...

Say memory per node is 16GB across 6 nodes, for a total of 96GB for the
cluster.

Is 16GB the maximum amount of memory that can be allocated to the driver,
since each individual node only has 16GB?
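
To make the question concrete, here is roughly the kind of submit I have in
mind (hypothetical command; the YARN master, deploy mode, jar name, and
memory values are just placeholders):

    # Each node has 16GB of RAM. The driver runs as a single JVM on one
    # machine, so I'm wondering whether --driver-memory can ever usefully
    # go beyond that one node's 16GB (minus OS/overhead), even though the
    # cluster has 96GB in total.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 16g \
      --executor-memory 12g \
      my_app.jar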



Thanks


