[ https://issues.apache.org/jira/browse/SPARK-17126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15432707#comment-15432707 ]

Ozioma Ihekwoaba commented on SPARK-17126:
------------------------------------------

Hi Sean,
Thanks for the update.
What I meant was that the driver and executor classpath entries themselves do 
make it into the Web UI: the values I set for spark.driver.extraClassPath and 
spark.executor.extraClassPath are read by Spark during startup. However, the 
jars those two paths point to never appear in the Web UI's classpath entries 
list, and they are not loaded by Spark during startup. For example, the Spark 
CSV jar and its associated jars are not loaded.
On Linux, the driver-path and executor-path jars are successfully added to the 
Spark classpath, in addition to being listed in the Web UI's Environment tab. 
On Windows, the jars in those folders do not get listed in the Web UI at all.
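
For anyone checking the same thing, a quick way to see what actually landed on 
the driver classpath is to inspect java.class.path from spark-shell (Scala; 
the filter strings are just examples matching the jars above):

sys.props("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(p => p.contains("mysql") || p.contains("csv"))
  .foreach(println)

On Linux this prints the extra jars for me; if nothing prints, the jars were 
never added.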

I finally found a solution to this on Windows: simply setting SPARK_CLASSPATH. 
That was it.
In summary, this worked when set in spark-env.cmd:
set SPARK_CLASSPATH=C://hadoop//spark//v162//lib//*
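
As a sanity check that the driver jar is now visible, this line in spark-shell 
should return the class rather than throw a ClassNotFoundException (class name 
taken from the exception in the original report):

Class.forName("com.mysql.jdbc.Driver")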

But none of the following worked when set in spark-defaults.conf:
spark.driver.extraClassPath  C:\\hadoop\\spark\\v162\\lib\\*
spark.driver.extraClassPath  C://hadoop//spark//v162//lib//*
spark.driver.extraClassPath  C:\\hadoop\\spark\\v162\\lib\\mysql-connector-java-5.1.25-bin.jar;
spark.driver.extraClassPath  C:\\hadoop\\spark\\v162\\lib\\mysql-connector-java-5.1.25-bin.jar
spark.driver.extraClassPath  file:/C:/hadoop/spark/v162/lib/*jar;
spark.driver.extraClassPath  file:///C:/hadoop/spark/v162/lib/mysql-connector-java-5.1.25-bin.jar;
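
One thing worth checking when porting entries like these from Linux: the JVM 
path separator is platform-specific (";" on Windows, ":" on Linux), which 
matters when chaining several jars in a single entry. It can be confirmed from 
spark-shell:

println(java.io.File.pathSeparator)  // prints ";" on Windows, ":" on Linux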

What I needed was to add all the necessary jars to the classpath during 
startup; I found the command-line syntax for adding packages and driver jars 
too cumbersome (example below).
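
For reference, the command-line form I found cumbersome looks roughly like 
this on Windows (paths and package coordinates are illustrative, matching the 
jars mentioned above):

bin\spark-shell.cmd --driver-class-path C:\hadoop\spark\v162\lib\mysql-connector-java-5.1.25-bin.jar --packages com.databricks:spark-csv_2.11:1.4.0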

I am still wondering why simply dropping jars into the lib folder (as in 
pre-2.0 versions) does not suffice as a default location for resolving jars.

Thanks,
Ozzy

> Errors setting driver classpath in spark-defaults.conf on Windows 7
> -------------------------------------------------------------------
>
>                 Key: SPARK-17126
>                 URL: https://issues.apache.org/jira/browse/SPARK-17126
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Shell, SQL
>    Affects Versions: 1.6.1
>         Environment: Windows 7
>            Reporter: Ozioma Ihekwoaba
>
> I am having issues starting up Spark shell with a local hive-site.xml on 
> Windows 7.
> I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
> The Hive instance is working fine.
> I copied over the hive-site.xml to my local instance of Spark 1.6.1 conf 
> folder and also copied over mysql-connector-java-5.1.25-bin.jar to the lib 
> folder.
> I was expecting Spark to pick up jar files in the lib folder automatically, 
> but found out Spark expects spark.driver.extraClassPath and 
> spark.executor.extraClassPath settings to resolve jars.
> The thing is, this has failed for me on Windows with a 
> DataStoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be 
> found.
> Here are some of the different file paths I've tried:
> C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
> ".;C:\hadoop\spark\v161\lib\*"
> ...none of these has worked so far.
> Please, what is the correct way to set driver classpaths on Windows?
> Also, what is the correct file path format on Windows?
> I have it working fine on Linux but my current engagement requires me to run 
> Spark on a Windows box.
> Is there a way for Spark to automatically resolve jars from the lib folder in 
> all modes?
> Thanks.
> Ozzy


