In my Spark application, I want to access the configuration passed in on the
command line, but it doesn't work. How should I do that?
object myCode extends Logging {
  // starting point of the application
  def main(args: Array[String]): Unit = {
    val sparkContext = new SparkContext()
    val runtimeEnvironment = sparkContext.getConf.get("runtime.environment", "default")
    Console.println("load properties from runtimeEnvironment: " + runtimeEnvironment)
    logInfo("load properties from runtimeEnvironment: " + runtimeEnvironment)
    sparkContext.stop()
  }
}
/opt/spark/bin/spark-submit --class myCode --conf runtime.environment=passInValue my.jar

load properties from runtimeEnvironment: default
It looks like I cannot access the dynamically passed-in value from the command
line this way. In Hadoop, the Configuration object includes all the key/value
pairs passed in to the application. How can I achieve that in Spark?

Thanks,
Yong
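For reference, here is the variant I would expect to work based on the Spark
configuration docs, which state that spark-submit only forwards --conf
properties whose names start with "spark." (other keys are ignored with a
warning). The key name spark.runtime.environment is just an illustration:

import org.apache.spark.SparkContext

object myCode {
  def main(args: Array[String]): Unit = {
    val sparkContext = new SparkContext()
    // Custom keys need the "spark." prefix to be forwarded by spark-submit --conf
    val runtimeEnvironment =
      sparkContext.getConf.get("spark.runtime.environment", "default")
    Console.println("load properties from runtimeEnvironment: " + runtimeEnvironment)
    sparkContext.stop()
  }
}

submitted as:

/opt/spark/bin/spark-submit --class myCode --conf spark.runtime.environment=passInValue my.jar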