Hello, I'm using the GitHub snapshot of PySpark and having trouble setting the worker memory correctly. I've set spark.executor.memory to 5g, but somewhere along the way -Xmx is getting capped at 512M. This was not happening with the same setup on 0.9.0. How many places do I need to configure the memory? Thank you!
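For reference, a minimal sketch of how I understand the configuration is supposed to work (assuming standalone mode and the SparkConf API from the post-0.9 snapshot): spark.executor.memory has to be set on the SparkConf *before* the SparkContext is created, otherwise the executor JVM falls back to the 512M default; there is also a separate spark.python.worker.memory setting that governs the Python-side aggregation buffers and defaults to 512m. The app name and memory values below are placeholders.

```python
from pyspark import SparkConf, SparkContext

# Memory settings must be in place before SparkContext is constructed;
# calling conf.set() after sc exists has no effect on executor -Xmx.
conf = (SparkConf()
        .setAppName("memory-test")                    # placeholder app name
        .set("spark.executor.memory", "5g")           # JVM executor heap (-Xmx)
        .set("spark.python.worker.memory", "512m"))   # Python worker spill threshold

sc = SparkContext(conf=conf)
```

Equivalently, the same values can go in conf/spark-defaults.conf so they apply to every application without code changes.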