Github user gjhkael commented on the issue:
https://github.com/apache/spark/pull/22887
@vanzin @cloud-fan
The simplest description: a 'spark.hadoop.xxx' value that the user sets through SetCommand
will not override the same configuration set in the spark-defaults.conf file.
I don't know
Github user gjhkael commented on the issue:
https://github.com/apache/spark/pull/22887
@vanzin Thanks for your review. I added a new commit to let the user's "set"
command take effect. Let me know if you have an easier way.
Github user gjhkael commented on the issue:
https://github.com/apache/spark/pull/22887
> can you explain more about why you make the change?
Some Hadoop configurations are set in spark-defaults.conf because we want them to be
global, but in some cases the user needs to override them.
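The scenario described above might look like the following sketch. The property name `spark.hadoop.dfs.replication` is an illustrative assumption, not taken from the thread:

```
# spark-defaults.conf -- cluster-wide default, intended to be global:
#   spark.hadoop.dfs.replication  3

-- In a Spark SQL session, a user tries to override it for one job:
SET spark.hadoop.dfs.replication=2;

-- Per the report, the value from spark-defaults.conf still wins,
-- so the user's SET has no effect on the Hadoop configuration.
```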
Github user gjhkael commented on the issue:
https://github.com/apache/spark/pull/22887
test this please
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
GitHub user gjhkael opened a pull request:
https://github.com/apache/spark/pull/22887
User-set hadoop conf should not be overwritten by SparkContext's conf
## What changes were proposed in this pull request?
A Hadoop conf which is set by the user using Spark SQL's SET command
Github user gjhkael closed the pull request at:
https://github.com/apache/spark/pull/22886
GitHub user gjhkael opened a pull request:
https://github.com/apache/spark/pull/22886
Hadoop config should be overwritten by the user's conf
## What changes were proposed in this pull request?
A Hadoop conf which is set by the user using Spark SQL's SET command should
not be overwritten by SparkContext's conf.