Hi there,

Can anyone clarify how driver memory works in PySpark?
According to [1], spark.driver.memory limits the combined JVM + Python memory.

For example, if
spark.driver.memory=2G
does that mean the user cannot use more than 2G in total, regardless of
the Python code and the RDD operations they run?
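For reference, here is roughly how I would expect the setting to be applied
(a minimal sketch; the app name and script name are just placeholders, and as
far as I understand the value only takes effect if it reaches the driver JVM
before it starts, e.g. via spark-submit):

    # passed on the command line so the driver JVM is launched with a 2G heap:
    #   spark-submit --driver-memory 2g my_app.py

    # or set programmatically when the context is created from a plain Python process:
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("example").set("spark.driver.memory", "2g")
    sc = SparkContext(conf=conf)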

Thanks,

[1]: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-is-running-extremely-slow-with-larger-data-set-like-2G-td17152.html


