Looking at the script, I'm not sure whether --driver-memory is
supposed to work in standalone client mode. By the time the option is
parsed, the driver JVM is already running, so it's "too late" to
change its heap size. The script does special-case the value coming
from the environment or config, though, so this may be intentional.
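
For what it's worth, here is a rough sketch of the three ways the
value can be supplied (the 83971m figure is taken from your report,
com.example.App / app.jar are placeholders, and the exact launcher
behavior may vary by Spark version):

    # spark-env.sh: exported before the driver JVM launches, so it
    # takes effect in standalone client mode (this is what worked):
    SPARK_DRIVER_MEMORY=83971m

    # spark-defaults.conf: per your report, this also takes effect:
    spark.driver.memory    83971m

    # spark-submit flag: in client mode the driver JVM may already be
    # up by the time this is applied, so the heap may not change:
    spark-submit --driver-memory 83971m --class com.example.App app.jar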

On Thu, Feb 12, 2015 at 1:16 PM, poiuytrez <guilla...@databerries.com> wrote:
> Very interesting. It works.
>
> When I set SPARK_DRIVER_MEMORY=83971m in spark-env.sh (or
> spark.driver.memory in spark-defaults.conf), it works.
> However, when I set the --driver-memory option with spark-submit, the
> memory is not actually allocated to the driver: the web UI shows the
> correct value of spark.driver.memory (83971m), but the executor page
> of the web UI shows that the memory is not allocated.
>
> I am going to file an issue in the bug tracker.
>
> Thank you for your help.
