Github user steveloughran commented on the issue:

    https://github.com/apache/flink/pull/4926
  
    creating YarnConfiguration & HdfsConfiguration through some dynamic 
classloading is enough to force their XML resources & configs in underneath 
your own Configurations. You shouldn't be reading in all their values and 
sending them over the wire with your job: just send the direct values off your 
own config, which you can enumerate straight off the Configuration (which is 
Writable, BTW).
I have some Scala code to send it around [using Java 
serialization](https://github.com/hortonworks-spark/cloud-integration/blob/master/spark-cloud-integration/src/main/scala/com/hortonworks/spark/cloud/utils/ConfigSerDeser.scala)
 if that helps.
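
That linked ConfigSerDeser wraps a Configuration so Java serialization delegates to the Writable `write()`/`readFields()` pair; here's a sketch of the same round trip straight in Java (class name and example key are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;

import org.apache.hadoop.conf.Configuration;

public class ConfigRoundTrip {

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("example.key", "example.value");  // illustrative key only

    // Writable.write() serializes the full key/value set to a DataOutput.
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    conf.write(new DataOutputStream(bytes));

    // readFields() repopulates a fresh Configuration on the receiving side;
    // loadDefaults=false so only the transmitted values come through.
    Configuration copy = new Configuration(false);
    copy.readFields(
        new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));

    System.out.println(copy.get("example.key"));  // prints "example.value"
  }
}
```
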

