When running something like this:

    spark-shell --jars foo.jar,bar.jar

the tail of the jars list keeps getting dropped. Digging into the
launch scripts, I found that cmd.exe treats the comma as an argument
delimiter, so the list arrives as separate parameters. To keep the list
together, I tried

    spark-shell --jars "foo.jar, bar.jar"

But this still failed: the quotes carried over into some of the string
comparisons and caused invalid-character errors. So I am curious whether
anybody sees a problem with a PR that changes the script from

...
    if "x%2"=="x" (
      echo "%1" requires an argument. >&2
      exit /b 1
    )
    set SUBMISSION_OPTS=%SUBMISSION_OPTS% %1 %2
...

TO

...
    if "x%~2"=="x" (
      echo "%1" requires an argument. >&2
      exit /b 1
    )
    set SUBMISSION_OPTS=%SUBMISSION_OPTS% %1 %~2
...

The only difference is the tilde (%~2), which strips surrounding quotes
from the argument if any are present.
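
For anyone unfamiliar with the tilde modifier, here is a minimal
standalone sketch (the script name tilde_demo.cmd is just for
illustration) comparing raw and quote-stripped expansion of the first
argument:

    @echo off
    rem tilde_demo.cmd - compare raw and quote-stripped expansion
    rem   %1  expands the first argument exactly as passed, quotes and all
    rem   %~1 expands it with any surrounding quotes removed
    echo raw:      %1
    echo stripped: %~1

Running `tilde_demo.cmd "foo.jar,bar.jar"` prints the quoted string on
the first line and the bare string on the second, which is exactly the
behavior the proposed change relies on.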

I figured I would ask here first to vet any unforeseen bugs this might
cause elsewhere. As far as I can tell the change is harmless and only
makes comma-separated lists work on Windows.
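
To make the delimiter behavior visible, a small test script
(hypothetical name show_args.cmd) can print each argument cmd.exe
parses; unquoted commas, semicolons, and equals signs all act as
separators, just like spaces:

    @echo off
    rem show_args.cmd - print each parsed argument on its own line
    :loop
    if "x%~1"=="x" goto :eof
    echo arg: %~1
    shift
    goto loop

Calling `show_args.cmd foo.jar,bar.jar` prints two separate arguments,
which is why the tail of the --jars list was being dropped in the first
place.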



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Windows-DOS-bug-in-windows-utils-cmd-tp22946.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
