[ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-6435:
-----------------------------
    Component/s: Windows

Great debugging! [~tsudukim] do you have thoughts on this? I think this bit was part of your change in https://github.com/apache/spark/commit/8d932475e6759e869c16ce6cac203a2e56558716#diff-7ac5881d6bad553b23f5225775c8fde3

So it sounds like you do need to quote the comma-separated arg, but then quoting doesn't work as expected? The {{"x%2"=="x"}} idiom is used in several places in the Windows scripts. Is the square-bracket syntax definitely preferred?

> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>       Environment: Win64
>          Reporter: vijay
>
> Not all jars supplied via the --jars option will be added to the driver (and
> presumably executor) classpath. The first jar(s) will be added, but not all.
> To reproduce this, just add a few jars (I tested 5) to the --jars option, and
> then try to import a class from the last jar. This fails.
> A simple reproducer:
>
> Create a bunch of dummy jars:
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
>
> Start the spark-shell with the dummy jars and guava at the end:
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
>
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>                                      ^
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
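For context on the quoting question in the comment above: cmd.exe treats commas (along with spaces, semicolons, and equals signs) as argument delimiters, so an unquoted comma-separated {{--jars}} value can be split into several arguments before the batch script ever sees it, which would explain why only the first jar(s) land on the classpath. The following is a minimal Python sketch of that delimiter behavior, not Spark or cmd.exe code; {{cmd_split}} is a deliberately rough approximation of cmd.exe's tokenizer, for illustration only:

```python
import re

def cmd_split(line):
    """Rough approximation of cmd.exe argument splitting (illustrative only).

    cmd.exe treats space, comma, semicolon, and '=' as delimiters;
    a double-quoted value is kept as a single argument.
    """
    tokens = []
    for match in re.finditer(r'"[^"]*"|[^ ,;=]+', line):
        tokens.append(match.group(0).strip('"'))
    return tokens

unquoted = 'spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar'
quoted   = 'spark-shell --master local --jars "jar1.jar,jar2.jar,jar3.jar"'

# Unquoted: the comma-separated list is broken apart, so the argument
# following --jars is just jar1.jar; the rest become stray arguments.
print(cmd_split(unquoted))

# Quoted: the whole jar list survives as one argument.
print(cmd_split(quoted))
```

This is why quoting the arg matters; the separate wrinkle discussed above is that once the quoted value is substituted into an {{if "x%2"=="x"}} test, the embedded quotes can break the comparison, which is one reason the {{[%2]==[]}} square-bracket idiom gets suggested.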