Please make sure that you have enough free memory available on the driver node; if there is not, your application won't start.
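A quick way to verify this on a Linux driver node before submitting (assumes the procps `free` utility; the "available" column position can differ across versions, so treat this as a sketch):

free -m | awk '/^Mem:/ {print "Available MB:", $7}'   # $7 = "available" in recent procps

If the number printed is smaller than the driver memory you plan to request, the master cannot place the driver and the app will sit in SUBMITTED.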
Pankaj

From: vaquar khan <vaquar.k...@gmail.com>
Date: Saturday, June 10, 2017 at 5:02 PM
To: Abdulfattah Safa <fattah.s...@gmail.com>
Cc: User <user@spark.apache.org>
Subject: [E] Re: Spark Job is stuck at SUBMITTED when set Driver Memory > Executor Memory

You can add memory in your command; make sure the requested memory is actually available on your executors:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000

https://spark.apache.org/docs/1.1.0/submitting-applications.html

Also try to avoid functions that need a lot of memory, like collect().

Regards,
Vaquar khan

On Jun 4, 2017 5:46 AM, "Abdulfattah Safa" <fattah.s...@gmail.com> wrote:
I'm working on Spark in Standalone Cluster mode. I need to increase the driver memory because I got an OOM in the driver thread. I found that when setting the driver memory to more than the executor memory, the submitted job is stuck at SUBMITTED and the application never starts.
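For the original question, the relevant flag is --driver-memory. A hedged sketch of the same submission with the driver memory set explicitly (the 4G value and cluster address are illustrative; in standalone mode the node that hosts the driver must have at least this much free memory, or the app stays in SUBMITTED). This is a CLI fragment for reference, not meant to run outside a Spark installation:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --driver-memory 4G \          # must fit on the driver node (or a worker, with --deploy-mode cluster)
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000

The same setting can also be supplied as --conf spark.driver.memory=4G or in conf/spark-defaults.conf.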