Hi, Matej,
For the convenience of SparkR users who start SparkR without using
bin/sparkR (for example, in RStudio),
https://issues.apache.org/jira/browse/SPARK-11340 enables setting
"spark.driver.memory" (and other similar options, such as
spark.driver.extraClassPath).
As documented in
http://spark.apache.org/docs/latest/configuration.html#available-properties,
the note for "spark.driver.memory" reads:

"Note: In client mode, this config must not be set through the SparkConf
directly in your application, because the driver JVM has already started at
that point. Instead, please set this through the --driver-memory command
line option or in your default properties file."
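In practice, this is a minimal sketch (assuming a SparkR version where
SPARK-11340 is available, i.e. 1.6+) of passing the driver memory via
sparkEnvir when initializing the context from RStudio, so the setting is
applied when the backend JVM is launched:

```r
library(SparkR)

# Because spark.driver.memory is read before the driver JVM starts,
# passing it in sparkEnvir works even without bin/sparkR.
# "2g" here is just an illustrative value.
sc <- sparkR.init(master = "local[*]",
                  sparkEnvir = list(spark.driver.memory = "2g"))
```

Setting the same property later through the SparkConf would have no effect,
per the documentation note above, since the JVM is already running by then.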
Hi Matej,
I'm also using this setup and I'm seeing the same behavior here: my driver
has only ~530 MB, which is the default value.
Maybe this is a bug.
2015-10-23 9:43 GMT-02:00 Matej Holec:
> Hello!
>
> How do I adjust the memory settings properly for SparkR with
> master="local[*]"?
>