Hi all,

I've encountered strange behavior with the driver memory setting and was
wondering if any of you have experienced it as well, or know what the
problem is.
I want to start a Spark job in the background with spark-submit. If I have
the driver memory setting in my spark-defaults.conf:
spark.driver.memory   1g
the background process will be stopped after ~1 second. The log doesn't
contain a hint as to why this happens. Command used:
./spark-submit --class <MyClass> <path to jar> &> output.log &
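
To see what happens to the process, I check the job state right after
launching, roughly like this (the sleep and the ps invocation are just
my way of inspecting it; <MyClass> and <path to jar> are placeholders
for my real values):

./spark-submit --class <MyClass> <path to jar> &> output.log &
PID=$!
sleep 2
jobs -l                           # this is where the job shows up as stopped
ps -o pid,stat,command -p "$PID"  # STAT "T" means stopped by a job-control signal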

If I run the process in the foreground, it runs to completion. As soon as
I remove the driver memory setting from the defaults property file,
everything works without problems. I can then specify the driver memory
on the command line, like so:
./spark-submit --driver-memory 1g --class <MyClass> <path to jar> &> output.log &

The amount of memory I assign to the driver does not change the behavior.
Spark release used: Spark 1.2.0 built for Hadoop 2.4.0
This happens both on an Amazon EC2 cluster and on my local machine (a
Mac).

I couldn't find any similar reports on the web. Any hint as to why this
happens is much appreciated!
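
If more detail would help, I can rerun with spark-submit's --verbose
flag, which prints the properties file being used and the parsed
arguments before the job starts:

./spark-submit --verbose --class <MyClass> <path to jar> &> output.log &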

Thanks,
Andy
