Build Spark 2.0 succeeded but could not run it on YARN

2016-06-20 Thread wgtmac
I ran into problems building Spark 2.0. The build itself succeeded, but when I uploaded it to the cluster and launched the Spark shell on YARN, it reported the following exception again and again: 16/06/17 03:32:00 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Container marked as
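The launch step described above could look like the following minimal sketch. The exact flags are assumptions (the original message does not show the command used); the build and launch invocations below are the standard ones documented for Spark on YARN:

```shell
# Hypothetical reproduction of the scenario above, not the poster's actual commands.
# Build a runnable Spark 2.0 distribution with YARN support (Hadoop version is
# an assumption; it must match the cluster's Hadoop version).
./dev/make-distribution.sh --name custom --tgz -Pyarn -Phadoop-2.7

# After uploading and unpacking the distribution on a cluster gateway node,
# launch the interactive shell against YARN. HADOOP_CONF_DIR must point at the
# cluster's Hadoop configuration so Spark can locate the ResourceManager.
export HADOOP_CONF_DIR=/etc/hadoop/conf
./bin/spark-shell --master yarn --deploy-mode client
```

Repeated "Container marked as failed" warnings at this stage typically mean the YARN executor containers are dying on startup, so the YARN NodeManager logs for those containers are usually the next place to look.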

SparkConf does not work for spark.driver.memory

2016-02-18 Thread wgtmac
Hi, I'm using Spark 1.5.1, but I encountered a problem using SparkConf to set spark.driver.memory in yarn-cluster mode. Example 1: In the code, I did the following: val sc = new SparkContext(new SparkConf().setAppName("test").set("spark.driver.memory", "4g")) and used the following command to submit
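This behavior is expected: spark.driver.memory affects how the driver JVM is launched, and in yarn-cluster mode the driver JVM is already running by the time application code constructs the SparkConf, so setting it programmatically takes effect too late. A minimal sketch of the working alternatives (the application jar path and class name are placeholders, not from the original message):

```shell
# Hypothetical submit command; jar path and main class are placeholders.
# Option 1: pass driver memory on the command line so spark-submit can size
# the driver container before the driver JVM starts.
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --class com.example.TestApp \
  /path/to/app.jar

# Option 2: equivalently, set it via --conf at submit time.
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.memory=4g \
  --class com.example.TestApp \
  /path/to/app.jar
```

The same value can also be placed in conf/spark-defaults.conf; the key point is that it must be known to spark-submit, not set inside the already-running driver.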