[ 
https://issues.apache.org/jira/browse/SPARK-16265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15365998#comment-15365998
 ] 

Andrew Duffy commented on SPARK-16265:
--------------------------------------

Hi Sean, yeah I can see where you're coming from, but I feel this change is 
simple and targeted enough (it's meant to be used with the {{SparkLauncher}} 
API) that it can actually be useful without adding much (if any) maintenance 
load. If anything, I'd argue it at least deserves consideration as an 
experimental feature. Otherwise, users who write programs that use 
SparkLauncher have to split Java versions between the code that launches and 
interacts with the Spark app and the Spark app itself, e.g. when an 
application is written for one environment and then deployed in another, 
uncontrolled customer environment where the cluster does not have Java 8 
installed.

> Add option to SparkSubmit to ship driver JRE to YARN
> ----------------------------------------------------
>
>                 Key: SPARK-16265
>                 URL: https://issues.apache.org/jira/browse/SPARK-16265
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 1.6.2
>            Reporter: Andrew Duffy
>
> Add an option to {{SparkSubmit}} that allows the driver to package up its 
> version of the JRE and ship it to a YARN cluster. This allows deploying 
> Spark applications to a YARN cluster whose installed Java versions need not 
> match the version the application requires. This is useful in situations 
> where the Spark application developer does not have administrative access 
> to the YARN cluster (e.g. a school or corporate environment) but still 
> wants to use certain language features in their code.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
