[ https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16701136#comment-16701136 ]

Gyanendra Dwivedi commented on SPARK-26190:
-------------------------------------------

[~vanzin] I am sorry, but I am not able to explain the "real" enterprise-level 
limitations developers face here. I cannot disclose more about why a custom 
script is the only option.

Can you give me one good reason why the script name must be hard-coded in 
SparkLauncher? Why should SparkLauncher expect a script named "spark-submit" 
(on non-Windows OS) to exist only under SPARK_HOME/bin?

I am not willing to invest more time justifying this; feel free to close it if 
you want. This was a Spark improvement request coming from a real challenge a 
developer faces.

Anyway, by the time a fix comes back to me (if someone fixes it), it will be 
too late; I will have missed my bus.

I will just patch it and move on, as I do with most poorly written APIs.

> SparkLauncher: Allow users to set their own submitter script instead of 
> hardcoded spark-submit
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26190
>                 URL: https://issues.apache.org/jira/browse/SPARK-26190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Java API, Spark Core, Spark Submit
>    Affects Versions: 2.1.0
>         Environment: Apache Spark 2.0.1 on yarn cluster (MapR distribution)
>            Reporter: Gyanendra Dwivedi
>            Priority: Major
>
> This improvement request concerns the SparkLauncher class, which is 
> responsible for executing the built-in spark-submit script through the Java API.
> In my use case, a custom wrapper script integrates security features while 
> submitting the Spark job via the built-in spark-submit.
> Currently the script name is hard-coded in the 'createBuilder()' method of the 
> org.apache.spark.launcher.SparkLauncher class:
> {code:java}
> private ProcessBuilder createBuilder() {
>     List<String> cmd = new ArrayList<>();
>     String script = CommandBuilderUtils.isWindows() ? "spark-submit.cmd" : "spark-submit";
>     cmd.add(CommandBuilderUtils.join(File.separator,
>         new String[]{this.builder.getSparkHome(), "bin", script}));
>     cmd.addAll(this.builder.buildSparkSubmitArgs());
>     ......
> }{code}
>  
>  
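> For context, a minimal caller-side sketch showing that the public API only 
> lets you point at a SPARK_HOME; the script name and its bin/ location cannot 
> be changed (the SPARK_HOME value, jar path and main class below are made-up 
> placeholders):
> {code:java}
> import org.apache.spark.launcher.SparkAppHandle;
> import org.apache.spark.launcher.SparkLauncher;
>
> public class LauncherExample {
>   public static void main(String[] args) throws Exception {
>     SparkAppHandle handle = new SparkLauncher()
>         .setSparkHome("/opt/spark")           // only controls where bin/spark-submit is looked up
>         .setAppResource("/apps/my-app.jar")   // placeholder application jar
>         .setMainClass("com.example.MyApp")    // placeholder main class
>         .setMaster("yarn")
>         .startApplication();                  // always runs SPARK_HOME/bin/spark-submit[.cmd]
>     System.out.println("State: " + handle.getState());
>   }
> }
> {code}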
> This has the following issues, which prevent its use in certain scenarios:
> 1) Developers may not use their own custom script with a different name; they 
> are forced to use the one shipped with the installation. Overwriting that 
> script may not be an option when altering the original installation is not 
> allowed.
> 2) The code expects the script to be present in the SPARK_HOME/bin folder.
> 3) The 'createBuilder()' method is private, so extending 
> 'org.apache.spark.launcher.SparkLauncher' is not an option.
>  
> Proposed solution:
> 1) Developers should be given an optional parameter to set their own custom 
> script, which may be located at any path.
> 2) Only when that parameter is not set should the default spark-submit script 
> be taken from the SPARK_HOME/bin folder, as in the sketch below.
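> A minimal sketch of the idea, assuming a new optional setter (the name 
> 'setSubmitScript' and the 'submitScript' field are hypothetical and not part 
> of the current API):
> {code:java}
> // Hypothetical addition to org.apache.spark.launcher.SparkLauncher -- not the actual implementation.
> private String submitScript;  // null => fall back to the default SPARK_HOME/bin/spark-submit
>
> public SparkLauncher setSubmitScript(String script) {
>   this.submitScript = script;  // absolute path to a custom submitter/wrapper script
>   return this;
> }
>
> private ProcessBuilder createBuilder() {
>   List<String> cmd = new ArrayList<>();
>   if (submitScript != null) {
>     cmd.add(submitScript);
>   } else {
>     String script = CommandBuilderUtils.isWindows() ? "spark-submit.cmd" : "spark-submit";
>     cmd.add(CommandBuilderUtils.join(File.separator, this.builder.getSparkHome(), "bin", script));
>   }
>   cmd.addAll(this.builder.buildSparkSubmitArgs());
>   // ... rest unchanged
> }
> {code}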


