vijay created SPARK-6435:
----------------------------

             Summary: spark-shell --jars option does not add all jars to classpath
                 Key: SPARK-6435
                 URL: https://issues.apache.org/jira/browse/SPARK-6435
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
    Affects Versions: 1.3.0
         Environment: Win64
            Reporter: vijay


Not all jars supplied via the --jars option are added to the driver (and presumably executor) classpath.  The first jar(s) in the list are added, but the remaining ones are not.

To reproduce this, pass a few jars (I tested 5) to the --jars option, then try to import a class from the last jar in the list; the import fails.  A simple reproducer:

Create a bunch of dummy jars:
{code}
jar cfM jar1.jar log.txt
jar cfM jar2.jar log.txt
jar cfM jar3.jar log.txt
jar cfM jar4.jar log.txt
{code}

Start the spark-shell with the dummy jars and guava at the end:
{code}
%SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
{code}
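As a quick sanity check inside the shell, you can compare the jars the context was asked to load against what is actually on the driver's classpath. A minimal sketch, assuming the stock 1.3 shell where sc is the preconfigured SparkContext and its jars field lists the jars passed via --jars:
{code}
// Jars the SparkContext was told about via --jars.
sc.jars.foreach(println)

// Jar entries actually on the driver JVM's classpath
// (';'-separated on Windows).
System.getProperty("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(_.endsWith(".jar"))
  .foreach(println)
{code}
If guava-14.0.1.jar appears in the first listing but not the second, the jar was accepted by --jars but never appended to the driver classpath.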

In the shell, try importing from guava; you'll get an error:
{code}
scala> import com.google.common.base.Strings
<console>:19: error: object Strings is not a member of package com.google.common.base
       import com.google.common.base.Strings
              ^
{code}
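A possible interim workaround, offered only as an untested sketch: bypass --jars on the driver side and put the jars directly on the driver classpath with --driver-class-path. Note this only affects the driver; in a non-local deployment the jars would still need to reach the executors some other way.
{code}
rem Hypothetical workaround: --driver-class-path takes the platform path
rem separator (';' on Windows), not the comma-separated list --jars uses.
%SPARK_HOME%\bin\spark-shell --master local --driver-class-path jar1.jar;jar2.jar;jar3.jar;jar4.jar;c:\code\lib\guava-14.0.1.jar
{code}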
