Thanks for the reply, RK. With the first option, my application does not recognize spark.driver.extraJavaOptions.
With the second option, the issue remains the same:

2016-07-21 12:59:41 ERROR SparkContext:95 - Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.executor.extraJavaOptions and SPARK_JAVA_OPTS. Use only the former.

It looks like one of two issues:

1. Somewhere in my cluster SPARK_JAVA_OPTS is getting set, but I have done a detailed review of my cluster and nowhere am I exporting this value (a quick programmatic check is sketched at the end of this mail).
2. There is some issue with this specific version of CDH (CDH 5.7.1 + Spark 1.6.0).

-Sam
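For reference, here is a rough sketch of the check I mean for point 1 (not my actual job; the object name and app name are placeholders, and it assumes the extraJavaOptions settings are temporarily removed so the SparkContext can start and only the deprecation warning is logged):

    import org.apache.spark.{SparkConf, SparkContext}

    object CheckSparkJavaOpts {
      def main(args: Array[String]): Unit = {
        // Start a context WITHOUT any extraJavaOptions so the
        // "Found both ..." check does not fire.
        val sc = new SparkContext(new SparkConf().setAppName("check-spark-java-opts"))

        // Is SPARK_JAVA_OPTS present in the driver's environment?
        println("driver SPARK_JAVA_OPTS: " +
          sys.env.getOrElse("SPARK_JAVA_OPTS", "<not set>"))

        // Is it present in the environment of any executor node?
        val n = sc.defaultParallelism
        val executorVals = sc.parallelize(1 to n, n)
          .map(_ => sys.env.getOrElse("SPARK_JAVA_OPTS", "<not set>"))
          .distinct()
          .collect()

        executorVals.foreach(v => println("executor SPARK_JAVA_OPTS: " + v))

        sc.stop()
      }
    }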