I have a global config object in my Spark app.

    object Config {
      var lambda = 0.01
    }

and I set the value of lambda according to the user's input.

    object MyApp {
      def main(args: Array[String]): Unit = {
        Config.lambda = args(0).toDouble
        ...
        rdd.map(_ * Config.lambda)
      }
    }

But I found that the modification does not take effect in the executors: the
value of lambda is always 0.01. I guess a modification in the driver's JVM
does not affect the executors' JVMs.


Do you have another solution?

I found a similar question on Stack Overflow:

http://stackoverflow.com/questions/29685330/how-to-set-and-get-static-variables-from-spark

In @DanielL.'s answer, he gives three solutions:

1. Put the value inside a closure to be serialized to the executors to
perform a task. 

**But I wonder how to write the closure and how it gets serialized to the
executors. Could anyone give me a code example?**
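For what it's worth, here is what I imagine the closure approach looks like
(just a sketch of my understanding, not tested on a real cluster; the app name
and local master are made up for the example):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("closure-example").setMaster("local[*]"))

    // Copy the value into a local val. Local variables referenced inside
    // the function passed to map() are serialized as part of the task
    // closure and shipped to the executors along with the task.
    val lambda = args(0).toDouble

    val rdd = sc.parallelize(1 to 10).map(_.toDouble)
    val scaled = rdd.map(_ * lambda) // lambda travels with the closure

    println(scaled.collect().mkString(", "))
    sc.stop()
  }
}
```

Is that right? The key difference from my original code seems to be reading the
value into a local val on the driver instead of referencing the mutable field
on the global object from inside the task.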


2. If the values are fixed or the configuration is available on the executor
nodes (lives inside the jar, etc.), then you can have a lazy val,
guaranteeing initialization only once.

**What if I declare lambda as a lazy val? Will a modification in the driver
take effect in the executors? Could you give me a code example?**
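If I understand correctly, something like this? (A sketch; the property name
`myapp.lambda` is my own invention.) My understanding is that a lazy val is
initialized once per JVM on first access, so the driver and each executor
evaluate it independently, and a driver-side reassignment still would not
propagate — the source has to be visible on every node:

```scala
object Config {
  // Evaluated once per JVM, the first time Config.lambda is accessed.
  // This only helps if the source is available on every executor node,
  // e.g. a system property passed via spark.executor.extraJavaOptions:
  //   --conf "spark.executor.extraJavaOptions=-Dmyapp.lambda=0.5"
  lazy val lambda: Double = sys.props.getOrElse("myapp.lambda", "0.01").toDouble
}
```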


3. Create a broadcast variable with the data. I know this way, but it also
needs a local `Broadcast[_]` variable that wraps the Config object, right? For
example:

    val config = sc.broadcast(Config)

and then use `config.value.lambda` in the executors, right?
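Spelled out, this is what I have in mind (a sketch, untested). I broadcast the
plain value instead of the `Config` singleton, since my understanding is that
deserializing a Scala `object` on an executor just resolves back to that
executor's own singleton, so a mutated field would not carry over:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("broadcast-example").setMaster("local[*]"))

    // Broadcast the value once from the driver; each executor caches
    // a read-only copy locally instead of shipping it with every task.
    val lambdaBc = sc.broadcast(args(0).toDouble)

    val rdd = sc.parallelize(1 to 10).map(_.toDouble)
    // Read the broadcast data with .value inside the task.
    val scaled = rdd.map(_ * lambdaBc.value)

    println(scaled.collect().mkString(", "))
    sc.stop()
  }
}
```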





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-use-the-global-config-variables-in-executors-tp22846.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
