GitHub user CodingCat opened a pull request:
https://github.com/apache/incubator-spark/pull/602
[SPARK-1092] remove SPARK_MEM usage in sparkcontext.scala
https://spark-project.atlassian.net/browse/SPARK-1092?jql=project%20%3D%20SPARK
Currently, users usually set SPARK_MEM to control the memory usage of
driver programs (in spark-class):

JAVA_OPTS="$OUR_JAVA_OPTS"
JAVA_OPTS="$JAVA_OPTS -Djava.library.path=$SPARK_LIBRARY_PATH"
JAVA_OPTS="$JAVA_OPTS -Xms$SPARK_MEM -Xmx$SPARK_MEM"
If they do not set spark.executor.memory, the value of this environment
variable also affects the memory usage of executors, because the following
lines in SparkContext fall back to SPARK_MEM:

private[spark] val executorMemory = conf.getOption("spark.executor.memory")
  .orElse(Option(System.getenv("SPARK_MEM")))
  .map(Utils.memoryStringToMb)
  .getOrElse(512)
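A minimal sketch of what removing the fallback might look like (an illustration of the intent, not the exact diff in the patch):

// SPARK_MEM fallback removed; executor memory comes only from
// spark.executor.memory, defaulting to 512 MB when unset.
private[spark] val executorMemory = conf.getOption("spark.executor.memory")
  .map(Utils.memoryStringToMb)
  .getOrElse(512)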
Since SPARK_MEM has been proposed for deprecation in SPARK-929
(https://spark-project.atlassian.net/browse/SPARK-929) and the corresponding PR
(https://github.com/apache/incubator-spark/pull/104),
we should remove this fallback line.
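For reference, a hedged example of how an application would set executor memory explicitly through SparkConf instead of relying on SPARK_MEM (the object name, app name, and "2g" value are illustrative, not taken from the patch):

import org.apache.spark.{SparkConf, SparkContext}

object MemoryConfExample {
  def main(args: Array[String]): Unit = {
    // Illustrative only: configure executor memory via the documented
    // spark.executor.memory setting rather than the SPARK_MEM env variable.
    val conf = new SparkConf()
      .setAppName("MemoryConfExample")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}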
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/apache/incubator-spark clean_spark_mem
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/incubator-spark/pull/602.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #602
----
commit 31ea1cfe4231c7ff14f2721fa2be99baa43c29d0
Author: CodingCat <[email protected]>
Date: 2014-02-14T23:31:03Z
remove SPARK_MEM usage in sparkcontext.scala
----