[ 
https://issues.apache.org/jira/browse/SPARK-20016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15935005#comment-15935005
 ] 

Marcelo Vanzin commented on SPARK-20016:
----------------------------------------

This was a long time ago and mostly trial & error, since Windows batch files 
make no sense. Since I no longer have a Windows test environment, I'd 
appreciate it if someone who does have one could try things out.

> SparkLauncher submit job failed after setConf with special characters under 
> Windows
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-20016
>                 URL: https://issues.apache.org/jira/browse/SPARK-20016
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 2.0.0
>         Environment: windows 7, 8, 10, 2008, 2008R2, etc.
>            Reporter: Vincent Sun
>
> I am using the SparkLauncher Java API to submit a job to a remote Spark 
> cluster master. The code looks like the following:
> /*
>  * Launch job
>  */
> public static void launch() throws Exception {
>     SparkLauncher spark = new SparkLauncher();
>     spark.setAppName("sparkdemo")
>          .setAppResource("hdfs://10.250.1.121:9000/application.jar")
>          .setMainClass("test.Application");
>     spark.setMaster("spark://10.250.1.120:6066");
>     spark.setDeployMode("cluster");
>     spark.setConf("spark.executor.cores", "2");
>     spark.setConf("spark.executor.memory", "8G");
>     spark.startApplication(new MyAppListener("sparkdemo"));
> }
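> MyAppListener itself is not included in this report. For context, a minimal 
> sketch of such a listener, assuming it implements the launcher's 
> SparkAppHandle.Listener interface and only logs progress (the class body 
> below is an illustration, not the reporter's actual code):
> import org.apache.spark.launcher.SparkAppHandle;
>
> // Hypothetical sketch of the MyAppListener referenced above; it simply logs
> // the state transitions and application id reported by the launcher handle.
> public class MyAppListener implements SparkAppHandle.Listener {
>   private final String appName;
>
>   public MyAppListener(String appName) {
>     this.appName = appName;
>   }
>
>   @Override
>   public void stateChanged(SparkAppHandle handle) {
>     System.out.println(appName + " state: " + handle.getState());
>   }
>
>   @Override
>   public void infoChanged(SparkAppHandle handle) {
>     System.out.println(appName + " app id: " + handle.getAppId());
>   }
> }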
> It works fine under Linux/CentOS, but it fails on my own desktop, which runs 
> Windows 8. It throws this error:
> [launcher-proc-1] The filename, directory name, or volume label syntax is 
> incorrect.
> The final command I captured is this:
> spark-submit.cmd --master spark://10.250.1.120:6066 --deploy-mode cluster 
> --name sparkdemo --conf "spark.executor.memory=8G" --conf 
> "spark.executor.cores=2" --class test.Application 
> hdfs://10.250.1.121:9000/application.jar
> The quotes around spark.executor.memory=8G and spark.executor.cores=2 cause 
> the error.
> After debugging into the source code, I found the cause in the 
> quoteForBatchScript method of the CommandBuilderUtils class. 
> On Windows it adds quotes whenever the argument contains '=' or certain 
> other special characters. Here is the source code:
> static String quoteForBatchScript(String arg) {
>     boolean needsQuotes = false;
>     for (int i = 0; i < arg.length(); i++) {
>       int c = arg.codePointAt(i);
>       if (Character.isWhitespace(c) || c == '"' || c == '=' || c == ',' || c == ';') {
>         needsQuotes = true;
>         break;
>       }
>     }
>     if (!needsQuotes) {
>       return arg;
>     }
>     StringBuilder quoted = new StringBuilder();
>     quoted.append("\"");
>     for (int i = 0; i < arg.length(); i++) {
>       int cp = arg.codePointAt(i);
>       switch (cp) {
>       case '"':
>         quoted.append('"');
>         break;
>       default:
>         break;
>       }
>       quoted.appendCodePoint(cp);
>     }
>     if (arg.codePointAt(arg.length() - 1) == '\\') {
>       quoted.append("\\");
>     }
>     quoted.append("\"");
>     return quoted.toString();
>   }
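> To see what this produces for the failing arguments, here is a small 
> standalone sketch (the method above copied into a throwaway class with a 
> main(), for illustration only; the class name QuoteDemo is not from Spark):
> // Standalone sketch: a copy of quoteForBatchScript wrapped in a main(),
> // showing how the launcher quotes the two --conf values from the report.
> public class QuoteDemo {
>
>   static String quoteForBatchScript(String arg) {
>     boolean needsQuotes = false;
>     for (int i = 0; i < arg.length(); i++) {
>       int c = arg.codePointAt(i);
>       if (Character.isWhitespace(c) || c == '"' || c == '=' || c == ',' || c == ';') {
>         needsQuotes = true;
>         break;
>       }
>     }
>     if (!needsQuotes) {
>       return arg;
>     }
>     StringBuilder quoted = new StringBuilder();
>     quoted.append("\"");
>     for (int i = 0; i < arg.length(); i++) {
>       int cp = arg.codePointAt(i);
>       if (cp == '"') {
>         quoted.append('"');  // double any embedded quote
>       }
>       quoted.appendCodePoint(cp);
>     }
>     if (arg.codePointAt(arg.length() - 1) == '\\') {
>       quoted.append("\\");
>     }
>     quoted.append("\"");
>     return quoted.toString();
>   }
>
>   public static void main(String[] args) {
>     // Both values contain '=', so both come back wrapped in double quotes.
>     System.out.println(quoteForBatchScript("spark.executor.memory=8G"));
>     System.out.println(quoteForBatchScript("spark.executor.cores=2"));
>   }
> }
> Running it prints "spark.executor.memory=8G" and "spark.executor.cores=2" 
> with the surrounding double quotes included, which is the quoted form that 
> appears in the captured spark-submit.cmd command above and that the reporter 
> says triggers the error.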



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
