GitHub user lvdongr commented on the issue:
https://github.com/apache/spark/pull/18987
ok. Thank you all the same for your review @srowen @jerryshao @ajbozarth .
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
GitHub user srowen commented on the issue:
https://github.com/apache/spark/pull/18987
We won't merge this; it's too much overhead for little gain. You can close
this.
GitHub user lvdongr commented on the issue:
https://github.com/apache/spark/pull/18987
The log level setting is a very useful feature. Our team is building a Spark
application, and whenever we want to see the debug log we have to restart the
application. So we developed this feature.
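For context, the change under discussion would let a running application switch its log level without a restart. As a minimal illustration of the idea (using Python's standard `logging` module as a stand-in; Spark itself logs through Log4j, where the mechanism differs):

```python
import logging

# Logger starts at INFO, so DEBUG output is suppressed.
logger = logging.getLogger("my_app")
logger.setLevel(logging.INFO)
assert not logger.isEnabledFor(logging.DEBUG)

# Raise verbosity at runtime -- no restart needed.
logger.setLevel(logging.DEBUG)
assert logger.isEnabledFor(logging.DEBUG)
```

The PR's question is essentially where this runtime switch should be exposed (the proposal used the web UI).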
GitHub user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18987
This seems like a super specific requirement. AFAIK no Hadoop-related
project supports dynamically changing the log level. Do we really need this
feature in Spark? Do we have another workaround?
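One workaround that avoids any UI change is to toggle the level in-process, for example from a POSIX signal handler. This is a hypothetical sketch using Python's standard `logging` module (in Spark, the equivalent would be done through Log4j configuration rather than this API):

```python
import logging
import signal

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)

def toggle_debug(signum=None, frame=None):
    """Flip the logger between INFO and DEBUG without restarting."""
    new_level = logging.DEBUG if logger.level == logging.INFO else logging.INFO
    logger.setLevel(new_level)

# Sending SIGUSR1 to the running process toggles debug logging.
# (SIGUSR1 is POSIX-only; this registration fails on Windows.)
signal.signal(signal.SIGUSR1, toggle_debug)
```

An operator can then enable or disable debug output with `kill -USR1 <pid>`, with no application restart.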
GitHub user ajbozarth commented on the issue:
https://github.com/apache/spark/pull/18987
I agree with @srowen, and I'm not a fan of the UI either
GitHub user srowen commented on the issue:
https://github.com/apache/spark/pull/18987
This seems like a ton of complexity. I don't think this functionality is
worth nearly this much.
GitHub user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18987
Can one of the admins verify this patch?