On 06 Nov 2017 at 19:56, Nicolas Paris wrote:
> Can anyone clarify the driver memory aspects of PySpark?
> According to [1], spark.driver.memory limits the combined JVM + Python memory.
> 
> In case:
> spark.driver.memory=2G
> Does that mean the user can't use more than 2G in total, whatever
> Python code and RDD operations they run?
> 
> Thanks,
> 
> [1]: 
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-is-running-extremely-slow-with-larger-data-set-like-2G-td17152.html
> 


After some testing: the Python driver process is not limited by
spark.driver.memory; that setting caps only the driver JVM heap. There
is no built-in limit at all for the Python processes themselves. They
could, however, be constrained externally, e.g. with cgroups; see the
sketch below.
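
To make the observation concrete, here is a minimal sketch of the kind
of test described above (the file name, allocation size, and submit
command are illustrative assumptions, not taken from the original
thread):

    # test_driver_mem.py
    # Run with: spark-submit --driver-memory 2g test_driver_mem.py
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Allocate ~4 GiB inside the Python driver process. Assuming the
    # machine has enough free RAM, this succeeds even though
    # --driver-memory is 2g, because that setting caps only the driver
    # JVM heap, not the Python process hosting the user code.
    buf = bytearray(4 * 1024 ** 3)
    print("allocated %d bytes in the Python driver" % len(buf))

    spark.stop()

If cgroups are not available, one hedged alternative on Unix is to cap
the Python process itself with the standard-library resource module
(this is my suggestion, not something discussed in the thread):

    import resource

    # Cap the Python driver's address space at 2 GiB (Unix only);
    # allocations beyond the limit raise MemoryError instead of
    # growing unbounded.
    two_gib = 2 * 1024 ** 3
    resource.setrlimit(resource.RLIMIT_AS, (two_gib, two_gib))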
