GitHub user jerryshao opened a pull request: https://github.com/apache/spark/pull/18962
[SPARK-21714][CORE][YARN] Avoid re-uploading remote resources in yarn client

## What changes were proposed in this pull request?

With SPARK-10643, Spark supports downloading resources from remote storage in client deploy mode. However, the implementation overrides the variables that represent the added resources (like `args.jars`, `args.pyFiles`) with the downloaded local paths, and the YARN client then uses these local paths to re-upload the resources to the distributed cache. This unnecessarily breaks the semantics of putting resources on a shared FS, so this PR proposes to fix it.

## How was this patch tested?

Manually verified with jars and pyFiles in both local and remote storage, in both client and cluster mode.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/apache-spark SPARK-21714

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/18962.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #18962

----

commit c2cb5f7d816c87cdc3e906131d1c1c3bb9f80b04
Author: jerryshao <ss...@hortonworks.com>
Date: 2017-08-16T08:46:29Z

    Avoid re-uploading resources

    Change-Id: I2cb667aedd53b228e6dbfed5725cd8b268a498e9

commit 6e3093186f838972dc5b08d1a52991968885dfb7
Author: jerryshao <ss...@hortonworks.com>
Date: 2017-08-16T13:01:41Z

    Fix some potential issues when fetching remote resources from an http(s) server in yarn mode

    Change-Id: I6317a464c4fd526a8057c578a05a60420d975a47
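The idea behind the fix can be sketched as a small scheme-based resolution step: resources on a filesystem the cluster can read directly (e.g. `hdfs://`) keep their remote URIs, so the YARN client has no local path to re-upload, while schemes the driver must fetch itself (http/https/ftp) are still downloaded. The object and method names below are illustrative, not Spark's actual API:

```scala
import java.net.URI

// Hypothetical helper illustrating the fix's intent: only force a local
// download for schemes the cluster cannot read from the distributed cache.
object ResourceResolver {
  // Assumed set of schemes that require a client-side download.
  private val forceDownloadSchemes = Set("http", "https", "ftp")

  /** True if the resource must be downloaded to the client's local disk. */
  def shouldDownload(uriStr: String): Boolean = {
    // A path with no scheme is treated as a local file.
    val scheme = Option(new URI(uriStr).getScheme).getOrElse("file")
    forceDownloadSchemes.contains(scheme)
  }

  /** Resolve a comma-separated resource list (like `args.jars`):
   *  keep shared-FS paths as-is, download only what the driver needs. */
  def resolve(paths: String, download: String => String): String =
    paths.split(",").map { p =>
      if (shouldDownload(p)) download(p) else p
    }.mkString(",")
}
```

With this approach, an `hdfs://` jar passed through `resolve` is left untouched, whereas an `https://` jar is replaced by whatever local path the supplied `download` function returns.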