Hi, Laurent

You could set spark.executor.memory and the JVM heap size in the following ways:

1. In your conf/spark-env.sh:

    export SPARK_WORKER_MEMORY=38g
    export SPARK_JAVA_OPTS="-XX:-UseGCOverheadLimit -XX:+UseConcMarkSweepGC -Xmx2g -XX:MaxPermSize=256m"

2. You could also set the executor memory and Java opts through spark-submit parameters.
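
As a minimal sketch of option 2, something like the following should work; the application class, master URL, and jar name here are placeholders you would replace with your own:

```shell
# Hypothetical invocation: com.example.MyApp, the master URL, and myapp.jar
# are placeholders. --executor-memory sets spark.executor.memory; the --conf
# line passes extra JVM options to the executors.
spark-submit \
  --class com.example.MyApp \
  --master spark://master:7077 \
  --executor-memory 4g \
  --driver-memory 2g \
  --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC" \
  myapp.jar
```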

Check the Spark configuration and tuning docs; you will find full answers there.


Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.s...@gmail.com


On Thu, Jun 12, 2014 at 6:29 PM, Laurent T <laurent.thou...@ldmobile.net>
wrote:

> Hi,
>
> Can you give us a little more insight on how you used that file to solve
> your problem ?
> We're having the same OOM as you were and haven't been able to solve it
> yet.
>
> Thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-set-spark-executor-memory-and-heap-size-tp4719p7469.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
