[ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14375164#comment-14375164 ]

Sean Owen commented on SPARK-6435:
----------------------------------

I tried a simplified version of this with {{spark-shell}}:

{code}
spark-shell --master local --jars android-core.jar,core.jar,javase.jar
{code}

and it worked as expected: I was able to access code in all of the JARs. The 
same worked on YARN.
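
For reference, one quick way to check that each JAR is actually usable from the 
shell is to import a class from each one. The class names below are an 
assumption (the three JARs look like ZXing artifacts, which the report doesn't 
confirm); substitute classes from the real JARs:

{code}
// Run inside spark-shell; one class per JAR, names are illustrative only
import com.google.zxing.BarcodeFormat                    // core.jar
import com.google.zxing.client.j2se.MatrixToImageWriter  // javase.jar
{code}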

The JARs were all added, per the driver log:

{code}
15/03/22 13:57:53 INFO SparkContext: Added JAR 
file:/home/srowen/android-core.jar at 
http://10.16.180.26:49005/jars/android-core.jar with timestamp 1427057873313
15/03/22 13:57:53 INFO SparkContext: Added JAR file:/home/srowen/core.jar at 
http://10.16.180.26:49005/jars/core.jar with timestamp 1427057873315
15/03/22 13:57:53 INFO SparkContext: Added JAR file:/home/srowen/javase.jar at 
http://10.16.180.26:49005/jars/javase.jar with timestamp 1427057873315
{code}

This wasn't on Windows, though. It's possible something is wrong with how the 
classpath is reassembled on Windows, or this may have been fixed along the way.
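
If someone can reproduce this on Windows, it would help to dump what the REPL / 
driver class loader actually holds, to see whether the later {{--jars}} entries 
are missing from it. A rough sketch, assuming the loaders are URLClassLoaders 
(the usual case on the Java 7/8 JVMs Spark 1.3 runs on):

{code}
// Run inside spark-shell: walk the class loader chain and print the JAR URLs
// it actually contains, to compare against the --jars list.
import java.net.URLClassLoader

def listJars(cl: ClassLoader): Unit = cl match {
  case u: URLClassLoader => u.getURLs.foreach(println); listJars(u.getParent)
  case null              => ()
  case other             => listJars(other.getParent)
}

listJars(getClass.getClassLoader)
{code}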

> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>         Environment: Win64
>            Reporter: vijay
>
> Not all jars supplied via the --jars option will be added to the driver (and 
> presumably executor) classpath.  The first jar(s) will be added, but not all.
> To reproduce this, just add a few jars (I tested 5) to the --jars option, and 
> then try to import a class from the last jar.  This fails.  A simple 
> reproducer: 
> Create a bunch of dummy jars:
> {code}
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> {code}
> Start the spark-shell with the dummy jars and guava at the end:
> {code}
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> {code}
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package 
> com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}


