[ https://issues.apache.org/jira/browse/SPARK-18648?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michel Lemay closed SPARK-18648.
--------------------------------
    Resolution: Not A Bug

> spark-shell --jars option does not add jars to classpath on windows
> -------------------------------------------------------------------
>
>                 Key: SPARK-18648
>                 URL: https://issues.apache.org/jira/browse/SPARK-18648
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 2.0.2
>         Environment: Windows 7 x64
>            Reporter: Michel Lemay
>              Labels: windows
>
> I can't import symbols from jars passed on the command line when in the shell.
> Adding jars via --jars:
> {code}
> spark-shell --master local[*] --jars path\to\deeplearning4j-core-0.7.0.jar
> {code}
> Same result if I add it through Maven coordinates:
> {code}
> spark-shell --master local[*] --packages org.deeplearning4j:deeplearning4j-core:0.7.0
> {code}
> I end up with:
> {code}
> scala> import org.deeplearning4j
> <console>:23: error: object deeplearning4j is not a member of package org
>        import org.deeplearning4j
> {code}
> NOTE: It works as expected when running on Linux.
> Sample output with --verbose:
> {code}
> Using properties file: null
> Parsed arguments:
>   master                  local[*]
>   deployMode              null
>   executorMemory          null
>   executorCores           null
>   totalExecutorCores      null
>   propertiesFile          null
>   driverMemory            null
>   driverCores             null
>   driverExtraClassPath    null
>   driverExtraLibraryPath  null
>   driverExtraJavaOptions  null
>   supervise               false
>   queue                   null
>   numExecutors            null
>   files                   null
>   pyFiles                 null
>   archives                null
>   mainClass               org.apache.spark.repl.Main
>   primaryResource         spark-shell
>   name                    Spark shell
>   childArgs               []
>   jars                    file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
>   packages                null
>   packagesExclusions      null
>   repositories            null
>   verbose                 true
>
> Spark properties used, including those specified through --conf and those from the properties file null:
>
> Main class:
> org.apache.spark.repl.Main
> Arguments:
>
> System properties:
> SPARK_SUBMIT -> true
> spark.app.name -> Spark shell
> spark.jars -> file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
> spark.submit.deployMode -> client
> spark.master -> local[*]
> Classpath elements:
> file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.7.0.jar
>
> 16/11/30 08:30:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 16/11/30 08:30:51 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
> Spark context Web UI available at http://192.168.70.164:4040
> Spark context available as 'sc' (master = local[*], app id = local-1480512651325).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
>       /_/
>
> Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> import org.deeplearning4j
> <console>:23: error: object deeplearning4j is not a member of package org
>        import org.deeplearning4j
>                   ^
>
> scala>
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
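For anyone hitting the same symptom: a quick diagnostic from inside the shell is {{sc.listJars()}} (available in Spark 2.x), which prints the jars the driver actually registered; and on Windows it is worth retrying the jar path as a forward-slash {{file:///}} URI rather than a backslashed relative path. Note the path below is illustrative only, and this workaround is an assumption, not the confirmed resolution of this issue:

{code}
REM Hypothetical invocation: absolute file:/// URI instead of a backslashed path
C:\> spark-shell --master local[*] --jars file:///C:/Apps/jars/deeplearning4j-core-0.7.0.jar

scala> sc.listJars()   // shows which jars Spark registered with the driver
{code}

If {{sc.listJars()}} shows the jar but the import still fails, the jar reached Spark's executor distribution list but not the REPL's compiler classpath, which narrows the problem down considerably.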