Hi Matej,
I'm also using SparkR this way and I'm seeing the same behavior here: my
driver gets only 530 MB, which is the default value.
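
For what it's worth, here is how I checked what the driver JVM actually
got. It pokes at SparkR's non-exported JVM-bridge helpers (callJStatic /
callJMethod), so treat it as a rough sketch, not a supported API:

R> library(SparkR)
R> sc <- sparkR.init(master = "local[*]",
+                    sparkEnvir = list(spark.driver.memory = "5g"))
R> # Ask the driver JVM for its max heap, converted to MB (internal
R> # helpers, may change between releases):
R> rt <- SparkR:::callJStatic("java.lang.Runtime", "getRuntime")
R> SparkR:::callJMethod(rt, "maxMemory") / 1024^2

On my machine this reports roughly the default ~1 GB heap (the ~530 MB in
the UI is just the storage share of that heap), not the 5g I asked for.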

Maybe this is a bug.
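
My guess is that in local mode the driver JVM is already running by the
time sparkR.init() applies sparkEnvir, so the heap can only be set by
whatever launches the JVM. The flag you found works for me; I'd assume
conf/spark-defaults.conf works too, since it should be read at launch
time, but I have only verified the flag:

]$ spark-1.5.1-bin-hadoop2.6/bin/sparkR --driver-memory 5g

]$ # or persist it for every launch:
]$ echo "spark.driver.memory 5g" >> spark-1.5.1-bin-hadoop2.6/conf/spark-defaults.conf
]$ spark-1.5.1-bin-hadoop2.6/bin/sparkR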

2015-10-23 9:43 GMT-02:00 Matej Holec <hol...@gmail.com>:

> Hello!
>
> How do I adjust the driver memory properly for SparkR with
> master = "local[*]" in R?
>
>
> *When running from R -- SparkR ignores the memory setting :(*
>
> I use the following commands:
>
> R>  library(SparkR)
> R>  sc <- sparkR.init(master = "local[*]", sparkEnvir =
> list(spark.driver.memory = "5g"))
>
> Although the variable spark.driver.memory is set correctly (verified at
> http://node:4040/environment/), the driver is allocated only the default
> amount of memory (Storage Memory 530.3 MB).
>
> *But when running from spark-1.5.1-bin-hadoop2.6/bin/sparkR -- OK*
>
> The following command:
>
> ]$ spark-1.5.1-bin-hadoop2.6/bin/sparkR --driver-memory 5g
>
> creates a SparkR session with properly adjusted driver memory (Storage
> Memory 2.6 GB).
>
>
> Any suggestion?
>
> Thanks
> Matej
>
