Thanks, it looks like the config key has to start with "spark", a very interesting requirement.
I am using Spark 1.3.1, and I did not see this warning in the console.

Thanks for your help.
Yong
Date: Thu, 12 Nov 2015 23:03:12 +0530
Subject: Re: In Spark application, how to get the passed in configuration?
From: varunsharman...@gmail.com
To: java8...@hotmail.com
CC: user@spark.apache.org

You must be getting a warning at the start of the application, like: Warning: Ignoring non-spark config property: runtime.environment=passInValue

Configs in Spark must start with "spark." as a prefix, so try something like --conf spark.runtime.environment=passInValue instead.
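For example, a minimal sketch against your snippet (the spark.-prefixed key is the only real change; the submit line is shown as a comment):

import org.apache.spark.SparkContext

object myCode {
  def main(args: Array[String]): Unit = {
    // submit with:
    //   /opt/spark/bin/spark-submit --class myCode \
    //     --conf spark.runtime.environment=passInValue my.jar
    val sparkContext = new SparkContext()
    // only keys starting with "spark." are forwarded by spark-submit's --conf
    val runtimeEnvironment =
      sparkContext.getConf.get("spark.runtime.environment", "default")
    println("load properties from runtimeEnvironment: " + runtimeEnvironment)
    sparkContext.stop()
  }
}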
Regards,
Varun
On Thu, Nov 12, 2015 at 9:51 PM, java8964 <java8...@hotmail.com> wrote:



In my Spark application, I want to access the configuration passed in on the command line, but it doesn't work. How should I do that?
import org.apache.spark.{Logging, SparkContext}

object myCode extends Logging {
  // starting point of the application
  def main(args: Array[String]): Unit = {
    val sparkContext = new SparkContext()
    // read a config value submitted via --conf, falling back to "default"
    val runtimeEnvironment =
      sparkContext.getConf.get("runtime.environment", "default")
    Console.println("load properties from runtimeEnvironment: " + runtimeEnvironment)
    logInfo("load properties from runtimeEnvironment: " + runtimeEnvironment)
    sparkContext.stop()
  }
}
/opt/spark/bin/spark-submit --class myCode --conf runtime.environment=passInValue my.jar

load properties from runtimeEnvironment: default
It looks like I cannot access the dynamically passed-in value from the command line this way. In Hadoop, the Configuration object includes all the key/value pairs passed in on the command line. How do I achieve that in Spark?

Thanks
Yong
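P.S. For reference, the Hadoop pattern I mean looks roughly like this (a minimal sketch; MyTool and its layout are illustrative, not our actual code):

import org.apache.hadoop.conf.{Configuration, Configured}
import org.apache.hadoop.util.{Tool, ToolRunner}

object MyTool extends Configured with Tool {
  override def run(args: Array[String]): Int = {
    // ToolRunner has already parsed generic options such as
    //   -D runtime.environment=passInValue
    // into the Configuration; no key prefix is required
    println(getConf.get("runtime.environment", "default"))
    0
  }

  def main(args: Array[String]): Unit =
    System.exit(ToolRunner.run(new Configuration(), MyTool, args))
}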


-- 
VARUN SHARMA
Flipkart
Bangalore

