Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/19469
  
    There's a similar PR, #19427. I was wondering if we can provide a general 
solution for such issues, e.g. using a configuration to specify all the confs 
that need to be reloaded, like spark.streaming.confsToReload = 
spark.yarn.jars,spark.xx.xx, so that we don't need to fix related issues again 
and again. What do you think?
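    A minimal sketch of the idea (not actual Spark code; 
`spark.streaming.confsToReload` is the proposed, hypothetical config key): 
when restoring from a checkpoint, start from the checkpointed conf but 
override every key listed in the reload config with its current value.

```scala
// Hypothetical sketch of the proposed confsToReload mechanism.
// Confs are modeled as plain Maps; in real Spark this would operate
// on SparkConf during checkpoint recovery.
object ConfReloadSketch {
  // Parse the comma-separated list of keys to re-read from the
  // current environment (hypothetical key, per the proposal above).
  def keysToReload(conf: Map[String, String]): Set[String] =
    conf.getOrElse("spark.streaming.confsToReload", "")
      .split(",").map(_.trim).filter(_.nonEmpty).toSet

  // Start from the checkpointed conf; for every key listed in
  // confsToReload, take the value from the freshly created conf instead.
  def restore(checkpointed: Map[String, String],
              current: Map[String, String]): Map[String, String] = {
    val reload = keysToReload(current)
    checkpointed ++ current.filter { case (k, _) => reload.contains(k) }
  }
}
```

    This would cover cases like `spark.yarn.jars` changing between the run 
that wrote the checkpoint and the run that restores it, without a dedicated 
fix per conf key.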


---
