[ 
https://issues.apache.org/jira/browse/SPARK-46910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon reassigned SPARK-46910:
------------------------------------

    Assignee: Amanda Liu

> Eliminate JDK Requirement in PySpark Installation
> -------------------------------------------------
>
>                 Key: SPARK-46910
>                 URL: https://issues.apache.org/jira/browse/SPARK-46910
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.5.0
>            Reporter: Amanda Liu
>            Assignee: Amanda Liu
>            Priority: Minor
>              Labels: pull-request-available
>
> PySpark requires users to have a compatible JDK installed locally (JDK 8+ for 
> Spark < 4; JDK 17+ for Spark >= 4).
> We can make the Spark installation script install the JDK itself, so users don’t 
> need to do this step manually.
> h1. Details
>  # When the entry point for a Spark class is invoked, the spark-class script 
> checks whether Java is installed in the user's environment.
>  # If Java is not installed, the user is prompted to choose whether to install 
> JDK 17.
>  # If the user selects yes, JDK 17 is installed (using the [install-jdk 
> library|https://pypi.org/project/install-jdk/]) and the JAVA_HOME and RUNNER 
> variables are set appropriately, so Spark can now run (see the sketch after 
> this list).
>  # If the user selects no, we provide a brief description of how to install the 
> JDK manually.


