Hi,

I have the following case, which I am not sure how to resolve.

My code uses HadoopRDD and creates various RDDs on top of it
(MapPartitionsRDD, and so on).
After all the RDDs have been lazily created, my code "knows" some new
information, and I want the compute method of the HadoopRDD to be aware of
it at the point when compute is actually called.
What is a possible way to 'send' some additional information to the
compute method of the HadoopRDD after the RDD has been lazily created?
I tried to play with the configuration, e.g. calling set("test","111") in
the driver code and modifying the compute method of HadoopRDD to call
get("test") - but it does not work, since SparkContext holds only a clone
of the configuration, and that clone cannot be modified at run time.
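The cloning behavior described above can be illustrated in plain Scala, without Spark at all: once a configuration has been cloned, later set calls on the original are invisible to the clone. This is only a minimal sketch of the pitfall (a mutable map standing in for the Hadoop configuration; the names `conf` and `clonedConf` are illustrative, not Spark APIs):

```scala
import scala.collection.mutable

object CloneDemo {
  def main(args: Array[String]): Unit = {
    // Driver-side configuration (stand-in for the Hadoop configuration)
    val conf = mutable.Map("fs.default.name" -> "hdfs://localhost:9000")

    // The context keeps its own copy, analogous to the cloned
    // configuration handed to each HadoopRDD at creation time
    val clonedConf = conf.clone()

    // Later, the driver "learns" new information and sets it
    conf("test") = "111"

    // The clone does not see the update -- which is why get("test")
    // inside compute() comes back empty
    println(clonedConf.get("test")) // None
    println(conf.get("test"))       // Some(111)
  }
}
```

So any value set after the RDD was created never reaches the copy that compute sees, regardless of when compute runs.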

Any thoughts on how I can make this work?

Thanks
Gil.
