You can add memory in your command; make sure the requested amount of memory is actually available on your
executors:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000


https://spark.apache.org/docs/1.1.0/submitting-applications.html
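Since the OOM is in the driver rather than the executors, you will likely also want the --driver-memory flag (a real spark-submit option); a sketch reusing the example class and master URL from above, with 4G as an illustrative value only:

```shell
# Same example as above, but also raising the driver's heap.
# In Standalone *cluster* mode the driver runs on a worker node,
# so --driver-memory must fit within what a worker can offer;
# if no worker can satisfy it, the driver stays in SUBMITTED.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --driver-memory 4G \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000
```

The same setting can alternatively be passed as --conf spark.driver.memory=4G; note that in client mode it must be set on the command line, not in the application, because the driver JVM has already started by the time your code runs.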

Also try to avoid functions that pull large amounts of data into the driver, such as collect().


Regards,
Vaquar khan


On Jun 4, 2017 5:46 AM, "Abdulfattah Safa" <fattah.s...@gmail.com> wrote:

I'm working on Spark in Standalone Cluster mode. I need to increase the
Driver Memory as I got an OOM in the driver thread. I found that when
setting the Driver Memory to > Executor Memory, the submitted job is stuck
at SUBMITTED in the driver and the application never starts.
