Hi,

I wanted to build a simple Spark app running in local mode with 2g of
spark.executor.memory and 1g for caching. But the following code:

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setMaster("local")
    .setAppName("app")
    .set("spark.executor.memory", "2g")
    .set("spark.storage.memoryFraction", "0.5")
  val sc = new SparkContext(conf)
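For what it's worth, reading the values back from the live context shows
they were accepted (a quick sanity check, assuming Spark 1.0's sc.getConf
accessor):

  // Both print the values I set above, so the conf itself looks fine.
  println(sc.getConf.get("spark.executor.memory"))        // 2g
  println(sc.getConf.get("spark.storage.memoryFraction")) // 0.5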

doesn't work. In the Spark UI these variables are set properly, but the
memory store capacity is around 0.5 * 512MB (the default
spark.executor.memory), not 0.5 * 2GB:

14/08/05 15:34:00 INFO MemoryStore: MemoryStore started with capacity 245.8 MB.
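Since in local mode the executor runs inside the driver JVM, I also checked
the heap of the running process directly (plain JVM call, no Spark API):

  // Max heap of the current JVM; in local mode this is what the
  // in-process executor actually gets, whatever spark.executor.memory says.
  println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB")

It reports the default ~512MB heap, not 2GB.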

I have neither spark-defaults.conf nor spark-env.sh in my $SPARK_HOME/conf
directory. I'm using Spark 1.0.0.
How can I set these values properly?

Thanks,
Grzegorz
