Thomas created SPARK-10375:
------------------------------

             Summary: Setting the driver memory with 
SparkConf().set("spark.driver.memory","1g") does not work
                 Key: SPARK-10375
                 URL: https://issues.apache.org/jira/browse/SPARK-10375
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 1.3.0
         Environment: Running with yarn
            Reporter: Thomas
            Priority: Minor


When running PySpark 1.3.0 on YARN, the following code has no effect:
pyspark.SparkConf().set("spark.driver.memory", "1g")

The Environment tab shows spark.driver.memory set to 1g; however, the
Executors tab shows only 512 MB (the default value) for the driver's
memory. The issue goes away when the driver memory is specified on the
command line instead (i.e. --driver-memory 1g).
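
For reference, a minimal reproduction sketch (PySpark 1.3-era API; the
app name and client-mode-on-YARN setup are assumptions). Presumably the
driver JVM is already running with its default heap by the time the
Python conf is applied, so the value is recorded but cannot take effect:

    from pyspark import SparkConf, SparkContext

    # Set driver memory programmatically; the value is recorded in the conf...
    conf = SparkConf().setAppName("driver-memory-repro")
    conf.set("spark.driver.memory", "1g")
    print(conf.get("spark.driver.memory"))   # prints "1g"

    # ...but the driver JVM backing this context launched before the Python
    # conf was applied, so its heap stays at the 512 MB default.
    sc = SparkContext(conf=conf)
    sc.stop()

    # Workaround from the report: pass the flag at launch instead, e.g.
    #   pyspark --driver-memory 1g
    # or
    #   spark-submit --driver-memory 1g my_app.py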



