GitHub user aarondav opened a pull request:

    https://github.com/apache/incubator-spark/pull/615

    SPARK-929: Fully deprecate usage of SPARK_MEM

    This patch cements our deprecation of the SPARK_MEM environment variable by
    replacing it with three more specialized variables:
    SPARK_DAEMON_MEMORY, SPARK_EXECUTOR_MEM, and SPARK_DRIVER_MEM
    
    The creation of the latter two variables means that we can safely set
    driver/job memory without accidentally setting the executor memory.
    SPARK_EXECUTOR_MEM is not actually public -- it is only used by the Mesos
    scheduler (and set within SparkContext). The proper way of configuring
    executor memory is through the "spark.executor.memory" property.
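
    For reference, a minimal sketch of setting that property programmatically
    via SparkConf (illustrative only; the app name, local master, and the "2g"
    value are arbitrary choices for the sketch, not part of this patch):

        import org.apache.spark.{SparkConf, SparkContext}

        // Illustrative sketch: configure executor memory through the public
        // "spark.executor.memory" property rather than any SPARK_*_MEM variable.
        val conf = new SparkConf()
          .setAppName("MemoryConfigExample")   // arbitrary name for this sketch
          .setMaster("local[*]")               // only so the sketch runs standalone
          .set("spark.executor.memory", "2g")  // per-executor memory request
        val sc = new SparkContext(conf)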
    
    SPARK_DRIVER_MEM is a new public variable, which is needed because there is
    currently no public way of setting the memory of jobs launched by spark-class.
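
    As a rough illustration of the lookup order this implies for the driver-side
    setting -- written in Scala here purely for readability, since the real logic
    lives in the spark-class shell script -- one could imagine something like the
    following; the fallback to the deprecated SPARK_MEM and the 512m default are
    assumptions made for the sketch, not necessarily what the patch implements:

        // Hypothetical sketch of the precedence described above; not the actual
        // spark-class logic.
        def resolveDriverMemory(env: Map[String, String] = sys.env): String =
          env.get("SPARK_DRIVER_MEM")        // new public variable
            .orElse(env.get("SPARK_MEM"))    // deprecated legacy variable (assumed fallback)
            .getOrElse("512m")               // assumed default

    For example, resolveDriverMemory(Map("SPARK_DRIVER_MEM" -> "4g")) yields "4g".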
    
    Other memory considerations:
    - The repl's memory can be set through the "--drivermem" command-line
      option, which really just sets SPARK_DRIVER_MEM.
    - run-example doesn't use spark-class, so the only way to modify examples'
      memory is actually an unusual use of SPARK_JAVA_OPTS (which is normally
      overridden by spark-class).
    
    This patch also fixes a lurking bug where spark-shell misused spark-class
    (the first argument is supposed to be the main class name, not java options),
    as well as a bug in the Windows spark-class2.cmd. I have not yet tested this
    patch on either Windows or Mesos, however.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/aarondav/incubator-spark sparkmem

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-spark/pull/615.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #615
    
----
commit c6ff53251d49c6e4033d6b56268d2f3fb0166d88
Author: Aaron Davidson <aa...@databricks.com>
Date:   2014-02-17T23:09:51Z

    SPARK-929: Fully deprecate usage of SPARK_MEM
    
    This patch cements our deprecation of the SPARK_MEM environment variable
    by replacing it with case-specific variables:
    SPARK_DAEMON_MEMORY, SPARK_EXECUTOR_MEM, and SPARK_DRIVER_MEM
    
    The creation of the latter two variables means that we can safely
    set driver/job memory without accidentally setting the executor memory.
    SPARK_EXECUTOR_MEM is not actually public, though -- it is only used
    by the Mesos scheduler (and set within SparkContext). The proper way of
    configuring executor memory is through the "spark.executor.memory"
    property.
    
    SPARK_DRIVER_MEM is a new public variable, which is needed because
    there is currently no public way of setting the memory of jobs
    launched by spark-class.
    
    Other memory considerations:
    - The repl's memory can be set through the "--drivermem" command-line option,
      which really just sets SPARK_DRIVER_MEM.
    - run-example doesn't use spark-class, so the only way to modify examples'
      memory is actually an unusual use of SPARK_JAVA_OPTS (which is normally
      overridden by spark-class).
    
    This patch also fixes a lurking bug where spark-shell misused spark-class
    (the first argument is supposed to be the main class name, not java
    options).

----

