[ 
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16700909#comment-16700909
 ] 

Gyanendra Dwivedi edited comment on SPARK-26190 at 11/27/18 7:51 PM:
---------------------------------------------------------------------

[~vanzin] If it were so easy to execute any custom script using {{Runtime.getRuntime().exec()}}, then why does SparkLauncher exist to execute the built-in spark-submit script?

Creating fake symlinks is not a viable solution for production servers, where the installation location may change with each new version. An ad-hoc workaround like creating a symlink should not be a reason for closing this feature request.

I don't know why this was never made configurable. Hard-coding paths, assuming a specific environment/use-case setup, and forcing developers to look for workarounds should not be encouraged.
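
For illustration only, a minimal sketch of what a configurable submitter script could look like. The {{buildCommand}} helper and its {{customScript}} parameter are hypothetical, not an existing Spark API; the fallback branch mirrors what {{createBuilder()}} hard-codes today:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CustomSubmitLauncher {

    // Hypothetical helper: build the submitter command line, falling back to
    // the stock SPARK_HOME/bin/spark-submit when no custom script is set.
    static List<String> buildCommand(String sparkHome, String customScript,
                                     List<String> submitArgs) {
        List<String> cmd = new ArrayList<>();
        if (customScript != null) {
            // Developer-supplied script, at any path.
            cmd.add(customScript);
        } else {
            // Default behaviour, as SparkLauncher hard-codes today.
            cmd.add(String.join(File.separator, sparkHome, "bin", "spark-submit"));
        }
        cmd.addAll(submitArgs);
        return cmd;
    }

    public static void main(String[] args) {
        List<String> submitArgs = Arrays.asList("--class", "com.example.App", "app.jar");
        // With a custom script configured:
        System.out.println(buildCommand("/opt/spark", "/opt/company/bin/my-submit", submitArgs));
        // Without one, the default is used:
        System.out.println(buildCommand("/opt/spark", null, submitArgs));
    }
}
```

The point is only that a single optional setting would cover both cases, with the current behaviour preserved as the default.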


was (Author: gm_dwivedi):
[~vanzin] If it was so easy to run or any custom script

using {{Runtime.getRuntime().exec(); then why does sparkLauncher exist to 
execute builtin spark-submit script?}}

Creating fake symlinks etc is not a viable solution for production servers 
where installation location just may change with new version etc. Creating a 
symlink like adhoc solution should not be a reason for closing this feature 
request.

Don't know why it was never thought to keep things configurable. Hard coding, 
assuming a specific environment/use case setup and forcing developers to look 
for work around should not be encouraged.

> SparkLauncher: Allow users to set their own submitter script instead of 
> hardcoded spark-submit
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26190
>                 URL: https://issues.apache.org/jira/browse/SPARK-26190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Java API, Spark Core, Spark Submit
>    Affects Versions: 2.1.0
>         Environment: Apache Spark 2.0.1 on yarn cluster (MapR distribution)
>            Reporter: Gyanendra Dwivedi
>            Priority: Major
>
> Currently the script name is hard-coded in the 'createBuilder()' method of 
> the org.apache.spark.launcher.SparkLauncher class:
> {code:java}
> private ProcessBuilder createBuilder() {
>     List<String> cmd = new ArrayList<>();
>     String script = CommandBuilderUtils.isWindows() ? "spark-submit.cmd" : "spark-submit";
>     cmd.add(CommandBuilderUtils.join(File.separator, new String[]{this.builder.getSparkHome(), "bin", script}));
>     cmd.addAll(this.builder.buildSparkSubmitArgs());
> ......
> ......
> }{code}
>  
>  
> It has the following issues, which prevent its usage in certain scenarios:
> 1) Developers may not use their own custom script with a different name; they 
> are forced to use the one shipped with the installation. Overwriting it may 
> not be an option when altering the original installation is not allowed.
> 2) The code expects the script to be present in the "SPARK_HOME/bin" folder.
> 3) The 'createBuilder()' method is private, so extending 
> 'org.apache.spark.launcher.SparkLauncher' is not an option.
>  
> Proposed solution:
> 1) Developers should be given an optional parameter to set their own custom 
> script, which may be located at any path.
> 2) Only when the parameter is not set should the default spark-submit script 
> be taken from the SPARK_HOME/bin folder.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
