[ https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16700937#comment-16700937 ]

Marcelo Vanzin commented on SPARK-26190:
----------------------------------------

If you need to run things before spark-submit runs, consider writing your own
spark-env.sh that does what you need. That's a supported feature of Spark.
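
For example, a minimal sketch of that approach (untested; the paths and the
class name below are placeholders, and the custom spark-env.sh is assumed to
live in /opt/myapp/spark-conf):

{code:java}
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithCustomEnv {
  public static void main(String[] args) throws Exception {
    // Point SPARK_CONF_DIR at a directory containing your own spark-env.sh;
    // spark-submit sources spark-env.sh from SPARK_CONF_DIR before launching anything.
    Map<String, String> env = new HashMap<>();
    env.put("SPARK_CONF_DIR", "/opt/myapp/spark-conf");   // placeholder path

    SparkAppHandle handle = new SparkLauncher(env)
        .setSparkHome("/opt/spark")                        // placeholder path
        .setAppResource("/opt/myapp/lib/myapp.jar")        // placeholder path
        .setMainClass("com.example.MyApp")                 // placeholder class
        .setMaster("yarn")
        .startApplication();

    // Monitor handle.getState() / handle.getAppId() as usual.
  }
}
{code}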

Sorry, but I still don't think it's a good idea to provide the functionality
you're asking for.

> SparkLauncher: Allow users to set their own submitter script instead of 
> hardcoded spark-submit
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26190
>                 URL: https://issues.apache.org/jira/browse/SPARK-26190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Java API, Spark Core, Spark Submit
>    Affects Versions: 2.1.0
>         Environment: Apache Spark 2.0.1 on yarn cluster (MapR distribution)
>            Reporter: Gyanendra Dwivedi
>            Priority: Major
>
> The request is for an improvement in the SparkLauncher class, which is
> responsible for executing the built-in spark-submit script via the Java API.
> In my use case, a custom wrapper script helps integrate security features
> when submitting a Spark job with the built-in spark-submit.
> Currently, the script name is hard-coded in the 'createBuilder()' method of
> the org.apache.spark.launcher.SparkLauncher class:
> {code:java}
> // Excerpt from SparkLauncher.createBuilder(); the script name and location are hard-coded.
> private ProcessBuilder createBuilder() {
>     List<String> cmd = new ArrayList<>();
>     String script = CommandBuilderUtils.isWindows() ? "spark-submit.cmd" : "spark-submit";
>     // Always resolves the script relative to SPARK_HOME/bin.
>     cmd.add(CommandBuilderUtils.join(File.separator, this.builder.getSparkHome(), "bin", script));
>     cmd.addAll(this.builder.buildSparkSubmitArgs());
> ......
> ......
> }{code}
>  
>  
> It has the following issues, which prevent its use in certain scenarios:
> 1) Developers may not use their own custom script with a different name; they
> are forced to use the one shipped with the installation. Overwriting it may
> not be an option when altering the original installation is not allowed.
> 2) The code expects the script to be present in the "SPARK_HOME/bin" folder.
> 3) The 'createBuilder()' method is private, so extending
> 'org.apache.spark.launcher.SparkLauncher' is not an option.
>  
> Proposed solution:
> 1) Developers should be given an optional parameter to set their own custom
> script, which may be located at any path (see the sketch after this list).
> 2) Only when the parameter is not set should the default spark-submit script
> be taken from the SPARK_HOME/bin folder.
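>
> For illustration, the change could look roughly like the sketch below (the
> 'submitterScript' field and the 'setSubmitterScript()' setter are hypothetical
> and not part of the current API):
> {code:java}
> // Hypothetical sketch only -- this field and setter do not exist in SparkLauncher today.
> private String submitterScript;   // null means "use the bundled spark-submit"
>
> public SparkLauncher setSubmitterScript(String path) {
>     this.submitterScript = path;
>     return this;
> }
>
> private ProcessBuilder createBuilder() {
>     List<String> cmd = new ArrayList<>();
>     if (submitterScript != null) {
>         // User-supplied wrapper script; may be located at any path.
>         cmd.add(submitterScript);
>     } else {
>         // Current behaviour: bundled script under SPARK_HOME/bin.
>         String script = CommandBuilderUtils.isWindows() ? "spark-submit.cmd" : "spark-submit";
>         cmd.add(CommandBuilderUtils.join(File.separator, this.builder.getSparkHome(), "bin", script));
>     }
>     cmd.addAll(this.builder.buildSparkSubmitArgs());
>     ......
> }{code}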


