Github user gjhkael commented on the issue:

    https://github.com/apache/spark/pull/22887
  
    > can you explain more about why you make the change?
       Some Hadoop configuration is set in spark-defaults.conf because we want it 
to be global, but in some cases a user needs to override it and the override does 
not take effect, because the SparkContext's conf fills the Hadoop conf again at 
the end, just before that Hadoop conf is broadcast. 
    > Did you try `spark.SessionState.newHadoopConf()`?
       We hit this problem in Spark SQL, not when using the DataFrame API. 
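       To make the behaviour concrete, here is a minimal spark-shell sketch; the 
key `dfs.replication` and the values are only illustrative, and the merge 
behaviour of `spark.sessionState.newHadoopConf()` shown in the comments is my 
understanding of the existing code, not something this PR defines:

```scala
// Illustrative sketch only: key names and values are made up, and this assumes
// the usual behaviour where spark.hadoop.* entries from spark-defaults.conf are
// copied into sparkContext.hadoopConfiguration at startup.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("hadoop-conf-override").getOrCreate()

// Global value, e.g. set in spark-defaults.conf as:
//   spark.hadoop.dfs.replication   3
println(spark.sparkContext.hadoopConfiguration.get("dfs.replication"))  // "3"

// Per-session override attempted from SQL, which is the case described above:
spark.sql("SET dfs.replication=2")

// sessionState.newHadoopConf() layers the session's SQL settings on top of a
// copy of sparkContext.hadoopConfiguration, so code paths that call it should
// see the override...
println(spark.sessionState.newHadoopConf().get("dfs.replication"))      // "2"

// ...but code paths that broadcast sparkContext.hadoopConfiguration directly
// still see only the global value, so the user's override is lost.
println(spark.sparkContext.hadoopConfiguration.get("dfs.replication"))  // "3"
```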
    


