(Clarification: you'll need to pass in --driver-memory not just for local
mode, but for any application you're launching with "client" deploy mode)
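
For example (the master URL and host name below are just placeholders; the
class and jar follow the example in the quoted message), the same flag sizes
the driver JVM when you submit to a cluster in "client" deploy mode, because
the driver still runs on the submitting machine:

  bin/spark-submit --master spark://<master-host>:7077 --deploy-mode client \
    --driver-memory 2g --class your.class.here app.jar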


2014-08-05 9:24 GMT-07:00 Andrew Or <and...@databricks.com>:

> Hi Grzegorz,
>
> For local mode you only have one executor, and this executor is your
> driver, so you need to set the driver's memory instead. That said, in
> local mode, by the time you run spark-submit, a JVM has already been
> launched with the default memory settings, so setting "spark.driver.memory"
> in your conf won't actually do anything for you. Instead, you need to run
> spark-submit as follows:
>
> bin/spark-submit --driver-memory 2g --class your.class.here app.jar
>
> This will start the JVM with 2G instead of the default 512M.
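>
> As a rough sanity check (just a suggestion; the exact figure will come out a
> bit below 2048 because of JVM overhead), you can ask the driver JVM for its
> heap ceiling from the Spark shell or from your driver code:
>
>   // Maximum heap available to the driver JVM, reported in MB.
>   // With --driver-memory 2g this should be close to 2048 instead of roughly 490.
>   println(Runtime.getRuntime.maxMemory / (1024 * 1024))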
>
> -Andrew
>
>
> 2014-08-05 6:43 GMT-07:00 Grzegorz Białek <grzegorz.bia...@codilime.com>:
>
>> Hi,
>>
>> I wanted to make a simple Spark app running in local mode with 2g of
>> spark.executor.memory and 1g for caching. But the following code:
>>
>>   val conf = new SparkConf()
>>     .setMaster("local")
>>     .setAppName("app")
>>     .set("spark.executor.memory", "2g")
>>     .set("spark.storage.memoryFraction", "0.5")
>>   val sc = new SparkContext(conf)
>>
>> doesn't work. In the Spark UI these variables are set properly, but the
>> memory store is around 0.5 * 512MB (the default spark.executor.memory), not
>> 0.5 * 2GB:
>>
>> 14/08/05 15:34:00 INFO MemoryStore: MemoryStore started with capacity
>> 245.8 MB.
>>
>> I have neither spark-defaults.conf nor spark-env.sh in my
>> $SPARK_HOME/conf directory. I use Spark 1.0.0.
>> How can I set these values properly?
>>
>> Thanks,
>> Grzegorz
>>
>>
>
