[jira] [Commented] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-22, Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15430341#comment-15430341 ]

Apache Spark commented on SPARK-16781:
--

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/14748

> java launched by PySpark as gateway may not be the same java used in the 
> spark environment
> --
>
> Key: SPARK-16781
> URL: https://issues.apache.org/jira/browse/SPARK-16781
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.6.2
> Reporter: Michael Berman
>
> When launching Spark on a system with multiple Javas installed, there are
> a few options for choosing which JRE to use, setting `JAVA_HOME` being the
> most straightforward.
> However, when PySpark's internal py4j launches its JavaGateway, it always
> invokes `java` directly, without qualification. This means you get
> whichever `java` is first on your PATH, which is not necessarily the one
> Spark's `JAVA_HOME` points to.
> This could be seen as a py4j issue, but from their point of view the fix
> is easy: make sure the java you want is first on your PATH. I can't figure
> out a way to make that happen reliably through the PySpark executor launch
> path, and it seems like something that should ideally happen
> automatically. If I set `JAVA_HOME` when launching Spark, I would expect
> it to be the only java used throughout the stack.
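
For illustration, here is a minimal sketch of the lookup order the report asks for: prefer the JRE under JAVA_HOME when it is set, and only fall back to whatever `java` the PATH resolves. This is not PySpark's actual launch code, just the requested behavior.

{code:python}
import os

def resolve_java():
    """Return the java executable to launch.

    Sketch of the resolution order the report asks for, not PySpark's
    real code: prefer $JAVA_HOME/bin/java when JAVA_HOME is set,
    otherwise fall back to whichever `java` is first on the PATH.
    """
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        return os.path.join(java_home, "bin", "java")
    return "java"  # resolved by the shell from the PATH
{code}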





[jira] [Commented] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-15, Michael Berman (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420995#comment-15420995 ]

Michael Berman commented on SPARK-16781:


In 0.10.3, py4j introduced an option to use the java from JAVA_HOME instead of 
just launching a bare {{java}} command. So one thing PySpark could do to help 
with this situation would be to update to that version, and then pass 
{{java_path=None}} when launching the gateway.
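
For illustration, a sketch of what that could look like once PySpark depends on py4j 0.10.3 or later. Per the comment above, passing {{java_path=None}} makes py4j derive the executable from JAVA_HOME instead of invoking a bare {{java}}; the other arguments here are illustrative only.

{code:python}
from py4j.java_gateway import launch_gateway

# Per the comment above: on py4j 0.10.3+, java_path=None tells py4j to
# use $JAVA_HOME/bin/java when JAVA_HOME is set, rather than whichever
# bare `java` the PATH happens to resolve.
gateway_port = launch_gateway(java_path=None, die_on_exit=True)
{code}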




[jira] [Commented] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-15, Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420747#comment-15420747 ]

Sean Owen commented on SPARK-16781:
---

Yeah, I think this is something that's up to the execution environment, and 
thus an issue with py4j, or YARN, or your OS, or whatever. I don't see what 
Spark can do differently here.




[jira] [Commented] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

2016-08-14, Jeff Zhang (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420547#comment-15420547 ]

Jeff Zhang commented on SPARK-16781:


JAVA_HOME will be set by YARN; not sure about other cluster managers.
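
For the YARN case, the documented {{spark.yarn.appMasterEnv.*}} and {{spark.executorEnv.*}} settings let you pin JAVA_HOME for the containers; a short sketch follows (the JVM path is a hypothetical example). Note this covers only the YARN containers, not the driver-side py4j gateway this issue is about.

{code:python}
from pyspark import SparkConf

# Hypothetical JVM path; adjust to the JRE installed on the cluster.
JVM = "/usr/lib/jvm/java-8-openjdk"

conf = (SparkConf()
        .set("spark.yarn.appMasterEnv.JAVA_HOME", JVM)  # application master
        .set("spark.executorEnv.JAVA_HOME", JVM))       # executors
{code}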




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org