You can't set the driver memory programmatically in client mode. In
that mode, the driver runs in the same JVM that creates the
SparkContext, so by the time your code applies the configuration the
driver's JVM is already running and its command line options (including
memory) can no longer be changed.

(And you can't really start cluster-mode apps that way, so the only
way to set this is through the command line or config files; see the
example below.)
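
A minimal sketch of the two usual ways to do that (the script name
"your_app.py" below is just a placeholder):

    # pass the flag when launching the application:
    ./bin/spark-submit --driver-memory 1G your_app.py

    # or set it once in conf/spark-defaults.conf:
    spark.driver.memory    1g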

On Wed, Oct 1, 2014 at 9:26 AM, jamborta <jambo...@gmail.com> wrote:
> Hi all,
>
> I cannot figure out why this command is not setting the driver memory (it is
> setting the executor memory):
>
>     conf = (SparkConf()
>                 .setMaster("yarn-client")
>                 .setAppName("test")
>                 .set("spark.driver.memory", "1G")
>                 .set("spark.executor.memory", "1G")
>                 .set("spark.executor.instances", 2)
>                 .set("spark.executor.cores", 4))
>     sc = SparkContext(conf=conf)
>
> whereas if I run the spark console:
> ./bin/pyspark --driver-memory 1G
>
> it sets the driver memory correctly. As far as I can tell, both approaches
> generate the same commands in the logs.
>
> thanks a lot,



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
