On Thu, Feb 18, 2016 at 10:26 AM, wgtmac <ust...@gmail.com> wrote:
> In the code, I did the following:
> val sc = new SparkContext(new SparkConf().setAppName("test").set("spark.driver.memory", "4g"))

You can't set the driver memory like this, in any deploy mode. When
that code runs, the driver is already running, so there's no way to
modify the JVM's command line options at that time.
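
Set it before the driver JVM starts instead, either on the spark-submit
command line or in spark-defaults.conf. Something like this should work
(the class and jar names below are just placeholders for your app):

    spark-submit \
      --class com.example.Test \
      --driver-memory 4g \
      target/test-app.jar

or, equivalently, in conf/spark-defaults.conf:

    spark.driver.memory  4g

With either of those in place, your code doesn't need to set
"spark.driver.memory" on the SparkConf at all.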

-- 
Marcelo

