Hi all,

I'm trying to figure out how to set the "spark.yarn.driver.memoryOverhead" option on Spark 1.2.0. I found this helpful overview, http://apache-spark-user-list.1001560.n3.nabble.com/Stable-spark-streaming-app-td14105.html#a14476, which suggests launching with --spark.yarn.driver.memoryOverhead 1024 added to the spark-submit command. However, when I do that I get this error:
Error: Unrecognized option '--spark.yarn.driver.memoryOverhead'.
Run with --help for usage help or --verbose for debug output
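
For reference, here's roughly the full command I'm running (the class name, jar, and memory sizes below are placeholders rather than my exact values):

    spark-submit \
      --class com.example.MyApp \
      --master yarn-cluster \
      --driver-memory 2g \
      --spark.yarn.driver.memoryOverhead 1024 \
      my-app-assembly.jar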
I have also tried calling sparkConf.set("spark.yarn.driver.memoryOverhead", "1024") on my SparkConf object, but when the application launches I still see "Will allocate AM container, with XXXX MB memory including 384 MB overhead", so the setting doesn't seem to take effect. I'm running in yarn-cluster mode.
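
In case the context helps, this is approximately what that code looks like (the app name and surrounding setup are placeholders, not my exact application):

    import org.apache.spark.{SparkConf, SparkContext}

    val sparkConf = new SparkConf()
      .setAppName("MyApp") // placeholder app name
      // the setting I expected to raise the 384 MB default overhead:
      .set("spark.yarn.driver.memoryOverhead", "1024")

    val sc = new SparkContext(sparkConf)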

Any help or tips would be appreciated.

Thanks,
David

--

David McWhorter
Software Engineer
Commonwealth Computer Research, Inc.
1422 Sachem Place, Unit #1
Charlottesville, VA 22901
mcwhor...@ccri.com | 434.299.0090x204
