[ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14393856#comment-14393856 ]

Marcelo Vanzin commented on SPARK-6435:
---------------------------------------

I haven't tested this on 1.3, so I can't comment on that. But are you guys sure 
this is an issue on master? See the session below:

{noformat}
C:\cygwin64\home\admin\work\spark>bin\spark-shell --jars ..\jar1.jar,..\jar2.jar,..\jar3.jar
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/02 18:02:56 INFO SecurityManager: Changing view acls to: admin
15/04/02 18:02:56 INFO SecurityManager: Changing modify acls to: admin
15/04/02 18:02:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(admin); users with modify permissions: Set(admin)
15/04/02 18:02:56 INFO HttpServer: Starting HTTP Server
15/04/02 18:02:57 INFO Utils: Successfully started service 'HTTP class server' on port 49289.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)
Type in expressions to have them evaluated.
Type :help for more information.
15/04/02 18:03:44 INFO SparkContext: Running Spark version 1.4.0-SNAPSHOT
15/04/02 18:03:45 INFO SecurityManager: Changing view acls to: admin
15/04/02 18:03:45 INFO SecurityManager: Changing modify acls to: admin
15/04/02 18:03:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(admin); users with modify permissions: Set(admin)
15/04/02 18:03:53 INFO Slf4jLogger: Slf4jLogger started
15/04/02 18:03:54 INFO Remoting: Starting remoting
15/04/02 18:04:00 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@vanzin-win7:49302]
15/04/02 18:04:00 INFO Utils: Successfully started service 'sparkDriver' on port 49302.
15/04/02 18:04:00 INFO SparkEnv: Registering MapOutputTracker
15/04/02 18:04:01 INFO SparkEnv: Registering BlockManagerMaster
15/04/02 18:04:01 INFO DiskBlockManager: Created local directory at C:\Users\admin\AppData\Local\Temp\spark-6f29266b-d302-4917-9e7a-cbbc77d87faa\blockmgr-398614d1-f50d-450f-becb-9230aaf5200b
15/04/02 18:04:01 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/04/02 18:04:02 INFO HttpFileServer: HTTP File server directory is C:\Users\admin\AppData\Local\Temp\spark-6f29266b-d302-4917-9e7a-cbbc77d87faa\httpd-f4510339-e54b-44cb-a201-22a23d32c8d6
15/04/02 18:04:02 INFO HttpServer: Starting HTTP Server
15/04/02 18:04:02 INFO Utils: Successfully started service 'HTTP file server' on port 49303.
15/04/02 18:04:03 INFO SparkEnv: Registering OutputCommitCoordinator
15/04/02 18:04:04 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/04/02 18:04:04 INFO SparkUI: Started SparkUI at http://vanzin-win7:4040
15/04/02 18:04:05 INFO SparkContext: Added JAR file:/C:/cygwin64/home/admin/work/spark/../jar1.jar at http://192.168.56.101:49303/jars/jar1.jar with timestamp 1428023045788
15/04/02 18:04:05 INFO SparkContext: Added JAR file:/C:/cygwin64/home/admin/work/spark/../jar2.jar at http://192.168.56.101:49303/jars/jar2.jar with timestamp 1428023045804
15/04/02 18:04:05 INFO SparkContext: Added JAR file:/C:/cygwin64/home/admin/work/spark/../jar3.jar at http://192.168.56.101:49303/jars/jar3.jar with timestamp 1428023045819
15/04/02 18:04:07 INFO Executor: Starting executor ID <driver> on host localhost
15/04/02 18:04:07 INFO Executor: Using REPL class URI: http://192.168.56.101:49289
15/04/02 18:04:07 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@vanzin-win7:49302/user/HeartbeatReceiver
15/04/02 18:04:10 INFO NettyBlockTransferService: Server created on 49310
15/04/02 18:04:10 INFO BlockManagerMaster: Trying to register BlockManager
15/04/02 18:04:10 INFO BlockManagerMasterActor: Registering block manager localhost:49310 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 49310)
15/04/02 18:04:10 INFO BlockManagerMaster: Registered BlockManager
15/04/02 18:04:15 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/04/02 18:04:18 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> getClass().getResource("/test.txt")
res0: java.net.URL = jar:file:/C:/cygwin64/home/admin/work/spark/../jar1.jar!/test.txt

scala> getClass().getResource("/test2.txt")
res2: java.net.URL = jar:file:/C:/cygwin64/home/admin/work/spark/../jar2.jar!/test2.txt

scala>
{noformat}
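
If anyone wants to repeat that check against a longer list of jars, here's a 
minimal sketch along the same lines to paste into the shell. The resource names 
are placeholders; substitute an entry that is unique to each jar you passed to 
--jars:

{code}
// Probe one known resource per jar and report which ones resolve on the
// driver classpath. "/test.txt" etc. are placeholders; use an entry that
// exists only in the corresponding jar.
val probes = Seq("/test.txt", "/test2.txt", "/test3.txt")
probes.foreach { p =>
  Option(getClass.getResource(p)) match {
    case Some(url) => println(s"$p -> $url")
    case None      => println(s"$p -> NOT FOUND on driver classpath")
  }
}
{code}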

The current PR is against master, and I'd like to avoid changing that code 
unless it's reeeeeally needed. Those batch scripts are already cryptic enough.

> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>         Environment: Win64
>            Reporter: vijay
>
> Not all jars supplied via the --jars option are added to the driver (and 
> presumably executor) classpath. Only the first jar(s) are added; the rest are 
> not. To reproduce this, add a few jars (I tested 5) to the --jars option, and 
> then try to import a class from the last jar. The import fails. A simple 
> reproducer: 
> Create a bunch of dummy jars:
> {code}
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> {code}
> Start the spark-shell with the dummy jars and guava at the end:
> {code}
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> {code}
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}
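
A side note on verifying this reproducer: since every dummy jar above is built 
from the same log.txt, ClassLoader.getResources (plural) should return one URL 
per dummy jar that is actually visible, which shows at a glance how many of 
them made it onto the driver classpath. A minimal sketch, assuming the jars 
were created as shown:

{code}
// Enumerate every classpath entry containing log.txt; each dummy jar built
// from log.txt contributes one URL, so the count reveals how many of the
// supplied dummy jars are actually on the driver classpath.
import scala.collection.JavaConverters._

val urls = getClass.getClassLoader.getResources("log.txt").asScala.toList
urls.foreach(println)
println(s"${urls.size} of the dummy jars are visible on the classpath")
{code}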


