GitHub user JoshRosen opened a pull request: https://github.com/apache/spark/pull/2586
[SPARK-3734] DriverRunner shouldn't read SPARK_HOME from submitter's environment

When using spark-submit in `cluster` mode to submit a job to a Spark Standalone cluster, if the JAVA_HOME environment variable was set on the submitting machine, DriverRunner would attempt to use the submitter's JAVA_HOME to launch the driver process (instead of the worker's JAVA_HOME), causing the driver to fail unless the submitter and worker had the same Java location. This commit fixes this by reading JAVA_HOME from sys.env instead of command.environment.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/JoshRosen/spark SPARK-3734

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2586.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2586

----

commit e9513d97eca953c623b82259bb64cae3e461b720
Author: Josh Rosen <joshro...@apache.org>
Date:   2014-09-29T23:22:28Z

    [SPARK-3734] DriverRunner should not read SPARK_HOME from submitter's environment.

    When using spark-submit in `cluster` mode to submit a job to a Spark Standalone cluster, if the JAVA_HOME environment variable was set on the submitting machine, DriverRunner would attempt to use the submitter's JAVA_HOME to launch the driver process (instead of the worker's JAVA_HOME), causing the driver to fail unless the submitter and worker had the same Java location. This commit fixes this by reading JAVA_HOME from sys.env instead of command.environment.
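The essence of the fix described above is a one-line change of environment source: build the driver's java command from the worker's own environment rather than from the environment map forwarded with the submitted command. A minimal Java sketch of that logic (Spark itself is Scala; the class and parameter names here are hypothetical stand-ins, with `submitterEnv` playing the role of `command.environment` and `workerEnv` the role of `sys.env`):

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch, not the actual Spark code.
public class DriverLauncherSketch {
    // Resolve the java binary the driver process should be launched with.
    // submitterEnv: environment captured on the submitting machine
    //               (analogous to command.environment).
    // workerEnv:    the worker's own local environment
    //               (analogous to sys.env).
    static String javaBinary(Map<String, String> submitterEnv,
                             Map<String, String> workerEnv) {
        // Buggy behavior: submitterEnv.get("JAVA_HOME") -- the submitter's
        // Java location, which may not exist on the worker machine.
        // Fixed behavior: consult the worker's local environment instead.
        return Optional.ofNullable(workerEnv.get("JAVA_HOME"))
                .map(home -> home + "/bin/java")
                .orElse("java"); // fall back to java on the PATH
    }

    public static void main(String[] args) {
        Map<String, String> submitterEnv = Map.of("JAVA_HOME", "/submitter/jdk");
        Map<String, String> workerEnv = Map.of("JAVA_HOME", "/worker/jdk");
        // The submitter's value is ignored; the worker's JAVA_HOME wins.
        System.out.println(javaBinary(submitterEnv, workerEnv));
    }
}
```

Under these assumptions, even if the submitter exported JAVA_HOME=/submitter/jdk, the driver is launched with the worker's /worker/jdk/bin/java, so the failure mode in the description (driver crashing because the submitter's Java path does not exist on the worker) cannot occur.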