[ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14383597#comment-14383597 ]

Masayoshi TSUZUKI commented on SPARK-6435:
------------------------------------------

The cause is the following code in spark-class2.cmd:
{code}
for /f "tokens=*" %%i in ('cmd /C ""%RUNNER%" -cp %LAUNCHER_CP% 
org.apache.spark.launcher.Main %*"') do (
  set SPARK_CMD=%%i
)
%SPARK_CMD%
{code}

"for" clause is a little bit complex, but we can get this when we expand its 
variables between the single quotations.
{code}
'cmd /C ""C:\Program Files\Java\jdk1.7.0_67\bin\java" -cp 
C:\Users\tsudukim\Documents\workspace\spark-dev\bin\..\launcher\target\scala-2.10\classes
 org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --jars "C:\jar1.jar,C:\jar2.jar""'
{code}
When this is executed, the Java launcher (launcher\src\main\java\org\apache\spark\launcher\Main.java) receives the following arguments as "argsArray":
{code}
[0] = {java.lang.String@410}"org.apache.spark.deploy.SparkSubmit"
[1] = {java.lang.String@417}"--class"
[2] = {java.lang.String@418}"org.apache.spark.repl.Main"
[3] = {java.lang.String@419}"--jars"
[4] = {java.lang.String@420}"C:\jar1.jar C:\jar2.jar"
{code}
The comma between C:\jar1.jar and C:\jar2.jar disappeared here.

Handling double quotes in Windows batch is quite difficult.
I am not sure, but the command line was probably parsed twice.
In bash the only argument separator is the space, but in Windows batch the space, the semicolon, and the comma all act as separators.
So the comma was converted to a space during the first parse.
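
As a quick illustration of this separator behavior, here is a tiny batch file (the name demo.cmd is just a hypothetical example, not part of Spark) that echoes its first two arguments:
{code}
@echo off
rem demo.cmd - hypothetical example: print the first two arguments cmd passes in
echo arg1=%1
echo arg2=%2
{code}
Running "demo.cmd C:\jar1.jar,C:\jar2.jar" prints arg1=C:\jar1.jar and arg2=C:\jar2.jar, so cmd treats the comma just like a space between arguments. That is consistent with the comma having turned into a space by the time the launcher sees the --jars value.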

To avoid this problem, I think it is better to parse the command line only once, not twice.
(Escaping double quotes in Windows batch is quite painful.)

> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>         Environment: Win64
>            Reporter: vijay
>
> Not all jars supplied via the --jars option will be added to the driver (and 
> presumably executor) classpath.  The first jar(s) will be added, but not all.
> To reproduce this, just add a few jars (I tested 5) to the --jars option, and 
> then try to import a class from the last jar.  This fails.  A simple 
> reproducer: 
> Create a bunch of dummy jars:
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> Start the spark-shell with the dummy jars and guava at the end:
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}


