spark git commit: [SPARK-16781][PYSPARK] java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-24 Thread srowen
Repository: spark
Updated Branches:
  refs/heads/branch-2.0 29091d7cd -> 9f924a01b


[SPARK-16781][PYSPARK] java launched by PySpark as gateway may not be the same java used in the spark environment

## What changes were proposed in this pull request?

Update to py4j 0.10.3 to enable JAVA_HOME support
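
For context, py4j 0.10.3 lets the gateway launcher honor JAVA_HOME, so the JVM that PySpark starts matches the one used by the rest of the Spark environment. A minimal sketch of that resolution order (illustrative only; the function name is not py4j's actual internal API):

```python
import os

def resolve_java_command():
    # Prefer the JVM under JAVA_HOME so the gateway uses the same Java
    # as the surrounding Spark environment; otherwise fall back to PATH.
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        return os.path.join(java_home, "bin", "java")
    return "java"

print(resolve_java_command())
```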

## How was this patch tested?

Pyspark tests

Author: Sean Owen 

Closes #14748 from srowen/SPARK-16781.

(cherry picked from commit 0b3a4be92ca6b38eef32ea5ca240d9f91f68aa65)
Signed-off-by: Sean Owen 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9f924a01
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/9f924a01
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/9f924a01

Branch: refs/heads/branch-2.0
Commit: 9f924a01b27ebba56080c9ad01b84fff026d5dcd
Parents: 29091d7
Author: Sean Owen 
Authored: Wed Aug 24 20:04:09 2016 +0100
Committer: Sean Owen 
Committed: Wed Aug 24 20:04:20 2016 +0100

--
 LICENSE                                            |   2 +-
 bin/pyspark                                        |   2 +-
 bin/pyspark2.cmd                                   |   2 +-
 core/pom.xml                                       |   2 +-
 .../org/apache/spark/api/python/PythonUtils.scala  |   2 +-
 dev/deps/spark-deps-hadoop-2.2                     |   2 +-
 dev/deps/spark-deps-hadoop-2.3                     |   2 +-
 dev/deps/spark-deps-hadoop-2.4                     |   2 +-
 dev/deps/spark-deps-hadoop-2.6                     |   2 +-
 dev/deps/spark-deps-hadoop-2.7                     |   2 +-
 python/docs/Makefile                               |   2 +-
 python/lib/py4j-0.10.1-src.zip                     | Bin 61356 -> 0 bytes
 python/lib/py4j-0.10.3-src.zip                     | Bin 0 -> 91275 bytes
 sbin/spark-config.sh                               |   2 +-
 .../org/apache/spark/deploy/yarn/Client.scala      |   6 +++---
 .../spark/deploy/yarn/YarnClusterSuite.scala       |   2 +-
 16 files changed, 16 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/9f924a01/LICENSE
--
diff --git a/LICENSE b/LICENSE
index 94fd46f..d68609c 100644
--- a/LICENSE
+++ b/LICENSE
@@ -263,7 +263,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
 (New BSD license) Protocol Buffer Java API (org.spark-project.protobuf:protobuf-java:2.4.1-shaded - http://code.google.com/p/protobuf)
 (The BSD License) Fortran to Java ARPACK (net.sourceforge.f2j:arpack_combined_all:0.1 - http://f2j.sourceforge.net)
 (The BSD License) xmlenc Library (xmlenc:xmlenc:0.52 - http://xmlenc.sourceforge.net)
- (The New BSD License) Py4J (net.sf.py4j:py4j:0.10.1 - http://py4j.sourceforge.net/)
+ (The New BSD License) Py4J (net.sf.py4j:py4j:0.10.3 - http://py4j.sourceforge.net/)
 (Two-clause BSD-style license) JUnit-Interface (com.novocode:junit-interface:0.10 - http://github.com/szeiger/junit-interface/)
 (BSD licence) sbt and sbt-launch-lib.bash
 (BSD 3 Clause) d3.min.js (https://github.com/mbostock/d3/blob/master/LICENSE)

http://git-wip-us.apache.org/repos/asf/spark/blob/9f924a01/bin/pyspark
--
diff --git a/bin/pyspark b/bin/pyspark
index ac8aa04..037645d 100755
--- a/bin/pyspark
+++ b/bin/pyspark
@@ -65,7 +65,7 @@ export PYSPARK_PYTHON
 
 # Add the PySpark classes to the Python path:
 export PYTHONPATH="${SPARK_HOME}/python/:$PYTHONPATH"
-export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.10.1-src.zip:$PYTHONPATH"
+export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.10.3-src.zip:$PYTHONPATH"
 
 # Load the PySpark shell.py script when ./pyspark is used interactively:
 export OLD_PYTHONSTARTUP="$PYTHONSTARTUP"

http://git-wip-us.apache.org/repos/asf/spark/blob/9f924a01/bin/pyspark2.cmd
--
diff --git a/bin/pyspark2.cmd b/bin/pyspark2.cmd
index 3e2ff10..1217a4f 100644
--- a/bin/pyspark2.cmd
+++ b/bin/pyspark2.cmd
@@ -30,7 +30,7 @@ if "x%PYSPARK_DRIVER_PYTHON%"=="x" (
 )
 
 set PYTHONPATH=%SPARK_HOME%\python;%PYTHONPATH%
-set PYTHONPATH=%SPARK_HOME%\python\lib\py4j-0.10.1-src.zip;%PYTHONPATH%
+set PYTHONPATH=%SPARK_HOME%\python\lib\py4j-0.10.3-src.zip;%PYTHONPATH%
 
 set OLD_PYTHONSTARTUP=%PYTHONSTARTUP%
 set PYTHONSTARTUP=%SPARK_HOME%\python\pyspark\shell.py

http://git-wip-us.apache.org/repos/asf/spark/blob/9f924a01/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index bb27ec9..208659b 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -327,7 +327,7 @@
     <dependency>
       <groupId>net.sf.py4j</groupId>
  

spark git commit: [SPARK-16781][PYSPARK] java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-24 Thread srowen
Repository: spark
Updated Branches:
  refs/heads/master 2fbdb6063 -> 0b3a4be92


[SPARK-16781][PYSPARK] java launched by PySpark as gateway may not be the same java used in the spark environment

## What changes were proposed in this pull request?

Update to py4j 0.10.3 to enable JAVA_HOME support
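
A quick local sanity check of the upgrade (a sketch; assumes the Spark python/ tree and the bundled py4j are on PYTHONPATH, and the expected version string follows from this change):

```python
import os
import subprocess

# Confirm which py4j version is importable from the Spark python/ tree.
from py4j.version import __version__ as py4j_version
print(py4j_version)  # expected: 0.10.3 after this change

# Confirm which java the gateway would launch when JAVA_HOME is set.
java_home = os.environ.get("JAVA_HOME")
java_cmd = os.path.join(java_home, "bin", "java") if java_home else "java"
subprocess.call([java_cmd, "-version"])
```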

## How was this patch tested?

Pyspark tests

Author: Sean Owen 

Closes #14748 from srowen/SPARK-16781.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0b3a4be9
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0b3a4be9
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0b3a4be9

Branch: refs/heads/master
Commit: 0b3a4be92ca6b38eef32ea5ca240d9f91f68aa65
Parents: 2fbdb60
Author: Sean Owen 
Authored: Wed Aug 24 20:04:09 2016 +0100
Committer: Sean Owen 
Committed: Wed Aug 24 20:04:09 2016 +0100

--
 LICENSE                                            |   2 +-
 bin/pyspark                                        |   2 +-
 bin/pyspark2.cmd                                   |   2 +-
 core/pom.xml                                       |   2 +-
 .../org/apache/spark/api/python/PythonUtils.scala  |   2 +-
 dev/deps/spark-deps-hadoop-2.2                     |   2 +-
 dev/deps/spark-deps-hadoop-2.3                     |   2 +-
 dev/deps/spark-deps-hadoop-2.4                     |   2 +-
 dev/deps/spark-deps-hadoop-2.6                     |   2 +-
 dev/deps/spark-deps-hadoop-2.7                     |   2 +-
 python/docs/Makefile                               |   2 +-
 python/lib/py4j-0.10.1-src.zip                     | Bin 61356 -> 0 bytes
 python/lib/py4j-0.10.3-src.zip                     | Bin 0 -> 91275 bytes
 sbin/spark-config.sh                               |   2 +-
 .../org/apache/spark/deploy/yarn/Client.scala      |   6 +++---
 .../spark/deploy/yarn/YarnClusterSuite.scala       |   2 +-
 16 files changed, 16 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/0b3a4be9/LICENSE
--
diff --git a/LICENSE b/LICENSE
index 94fd46f..d68609c 100644
--- a/LICENSE
+++ b/LICENSE
@@ -263,7 +263,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
 (New BSD license) Protocol Buffer Java API (org.spark-project.protobuf:protobuf-java:2.4.1-shaded - http://code.google.com/p/protobuf)
 (The BSD License) Fortran to Java ARPACK (net.sourceforge.f2j:arpack_combined_all:0.1 - http://f2j.sourceforge.net)
 (The BSD License) xmlenc Library (xmlenc:xmlenc:0.52 - http://xmlenc.sourceforge.net)
- (The New BSD License) Py4J (net.sf.py4j:py4j:0.10.1 - http://py4j.sourceforge.net/)
+ (The New BSD License) Py4J (net.sf.py4j:py4j:0.10.3 - http://py4j.sourceforge.net/)
 (Two-clause BSD-style license) JUnit-Interface (com.novocode:junit-interface:0.10 - http://github.com/szeiger/junit-interface/)
 (BSD licence) sbt and sbt-launch-lib.bash
 (BSD 3 Clause) d3.min.js (https://github.com/mbostock/d3/blob/master/LICENSE)

http://git-wip-us.apache.org/repos/asf/spark/blob/0b3a4be9/bin/pyspark
--
diff --git a/bin/pyspark b/bin/pyspark
index a0d7e22..7590309 100755
--- a/bin/pyspark
+++ b/bin/pyspark
@@ -57,7 +57,7 @@ export PYSPARK_PYTHON
 
 # Add the PySpark classes to the Python path:
 export PYTHONPATH="${SPARK_HOME}/python/:$PYTHONPATH"
-export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.10.1-src.zip:$PYTHONPATH"
+export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.10.3-src.zip:$PYTHONPATH"
 
 # Load the PySpark shell.py script when ./pyspark is used interactively:
 export OLD_PYTHONSTARTUP="$PYTHONSTARTUP"

http://git-wip-us.apache.org/repos/asf/spark/blob/0b3a4be9/bin/pyspark2.cmd
--
diff --git a/bin/pyspark2.cmd b/bin/pyspark2.cmd
index 3e2ff10..1217a4f 100644
--- a/bin/pyspark2.cmd
+++ b/bin/pyspark2.cmd
@@ -30,7 +30,7 @@ if "x%PYSPARK_DRIVER_PYTHON%"=="x" (
 )
 
 set PYTHONPATH=%SPARK_HOME%\python;%PYTHONPATH%
-set PYTHONPATH=%SPARK_HOME%\python\lib\py4j-0.10.1-src.zip;%PYTHONPATH%
+set PYTHONPATH=%SPARK_HOME%\python\lib\py4j-0.10.3-src.zip;%PYTHONPATH%
 
 set OLD_PYTHONSTARTUP=%PYTHONSTARTUP%
 set PYTHONSTARTUP=%SPARK_HOME%\python\pyspark\shell.py

http://git-wip-us.apache.org/repos/asf/spark/blob/0b3a4be9/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index 04b94a2..ab6c3ce 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -326,7 +326,7 @@
     <dependency>
       <groupId>net.sf.py4j</groupId>
       <artifactId>py4j</artifactId>
-      <version>0.10.1</version>
+      <version>0.10.3</version>
     </dependency>
     <dependency>
       <groupId>org.apache.spark</groupId>