I ran into problems building Spark 2.0. The build itself succeeded, but when I uploaded the build to the cluster and launched the Spark shell on YARN, it reported the following exception again and again:
16/06/17 03:32:00 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
Container marked as
Hi,
I'm using Spark 1.5.1, and I ran into a problem using SparkConf to set
spark.driver.memory in yarn-cluster mode.
Example 1:
In the code, I did the following:
val sc = new SparkContext(
  new SparkConf().setAppName("test").set("spark.driver.memory", "4g"))
and used the following command to submit
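For what it's worth, a likely explanation, assuming standard Spark behavior in yarn-cluster mode: the driver JVM is launched by YARN before the application's SparkConf is ever read, so spark.driver.memory set programmatically has no effect there. The usual workaround is to pass the memory setting to spark-submit itself (or put it in spark-defaults.conf). A sketch, where the class name and jar are placeholders:

```shell
# Set driver memory on the command line so it is applied before the
# driver JVM starts; com.example.MyApp and my-app.jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --class com.example.MyApp \
  my-app.jar
```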