Try checking spark-env.sh on the workers as well. Maybe something there is
overriding the spark.executor.memory setting.
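
For comparison, here is a minimal PySpark sketch of setting the executor memory
explicitly on the SparkConf before the SparkContext is created. The master URL
and app name below are placeholders, and 5g just mirrors the value from your
message:

    from pyspark import SparkConf, SparkContext

    # Placeholder standalone master URL; substitute your real one.
    conf = (SparkConf()
            .setMaster("spark://master:7077")
            .setAppName("memory-check")
            .set("spark.executor.memory", "5g"))  # executor JVM heap size

    sc = SparkContext(conf=conf)

    # Confirm what the driver thinks it is sending to the executors.
    print(conf.get("spark.executor.memory"))

If this still launches executors with -Xmx512m, that points back to something
on the worker side (e.g. spark-env.sh) taking precedence.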

Matei

On Mar 18, 2014, at 6:17 PM, Jim Blomo <jim.bl...@gmail.com> wrote:

> Hello, I'm using the GitHub snapshot of PySpark and having trouble setting 
> the worker memory correctly. I've set spark.executor.memory to 5g, but 
> somewhere along the way Xmx is getting capped to 512M. This was not occurring 
> with the same setup on 0.9.0. How many places do I need to configure the 
> memory? Thank you!
> 
