Barak,

The SPARK_WORKER_MEMORY setting controls the total amount of memory the worker makes available to executors, not the worker JVM's own heap.

You can use SPARK_DAEMON_MEMORY to set the heap size of the worker JVM itself (the -Xms/-Xmx values you see in the ps output).
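For example, in conf/spark-env.sh you could set something like the following (the values are only illustrative):

    # Heap for the worker/master daemon JVMs themselves (shows up as -Xms/-Xmx in ps)
    export SPARK_DAEMON_MEMORY=2g

    # Total memory the worker is allowed to hand out to executors on this machine
    export SPARK_WORKER_MEMORY=16g

Executors still request their own memory through spark.executor.memory (or --executor-memory), and the worker simply caps the total at SPARK_WORKER_MEMORY.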

Mohammed

From: Barak Yaish [mailto:barak.ya...@gmail.com]
Sent: Wednesday, January 13, 2016 12:59 AM
To: user@spark.apache.org
Subject: Spark ignores SPARK_WORKER_MEMORY?

Hello,

Although I'm setting SPARK_WORKER_MEMORY in spark-env.sh, it looks like this
setting is ignored. I can't find any indication in the scripts under bin/sbin
that -Xms/-Xmx are set from it.

If I ps the worker pid, it looks like its memory is set to 1G:

[hadoop@sl-env1-hadoop1 spark-1.5.2-bin-hadoop2.6]$ ps -ef | grep 20232
hadoop   20232     1  0 02:01 ?        00:00:22 /usr/java/latest//bin/java -cp 
/workspace/3rd-party/spark/spark-1.5.2-bin-hadoop2.6/sbin/../conf/:/workspace/3rd-party/spark/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar:/workspace/3rd-party/spark/spark-1.5.2-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/workspace/3rd-party/spark/spark-1.5.2-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/workspace/3rd-party/spark/spark-1.5.2-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/workspace/3rd-party/hadoop/2.6.3//etc/hadoop/
 -Xms1g -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 
spark://10.52.39.92:7077

Am I missing something?

Thanks.
