Hello,
This approach unfortunately doesn't work for job submission in my case. It works in
the shell, but not when a job is submitted.
I ensured that the (only) worker node has the desired directory.

Neither specifying all the jars individually, as you suggested, nor using /path/to/jarfiles/*
works.
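
For completeness, here is roughly what I tried in spark-defaults.conf (paths and jar names below are placeholders; the real files exist on the worker node):

    # variant 1: listing every jar explicitly, as suggested
    spark.driver.extraClassPath      /opt/ext-jars/dep1.jar:/opt/ext-jars/dep2.jar
    spark.executor.extraClassPath    /opt/ext-jars/dep1.jar:/opt/ext-jars/dep2.jar

    # variant 2: the Java classpath wildcard
    spark.driver.extraClassPath      /opt/ext-jars/*
    spark.executor.extraClassPath    /opt/ext-jars/*

Both variants behave the same for me: fine in the shell, ClassNotFoundException on submit.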

Could you please verify that, with these settings, you are able to submit jobs with
the corresponding dependencies?

From: Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
Sent: Tuesday, November 1, 2016 2:18 PM
To: Vinod Mangipudi <vinod...@gmail.com>
Cc: user <user@spark.apache.org>
Subject: Re: Add jar files on classpath when submitting tasks to Spark

You can do that as long as every node has the referenced directory.

For example

spark.driver.extraClassPath      /home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar
spark.executor.extraClassPath    /home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar

This will work as long as all nodes have that directory.
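
If you prefer not to touch spark-defaults.conf, the same settings can be passed per job on the command line; a sketch (class and application names are illustrative):

    spark-submit \
      --conf spark.driver.extraClassPath=/home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar \
      --conf spark.executor.extraClassPath=/home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar \
      --class com.example.MyApp myapp.jar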

The other alternative is to mount a shared directory as an NFS mount across all
the nodes, so that every node can read from that shared directory.
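
Something along these lines on every node would do (server name and export path are illustrative):

    # mount the shared jar directory from the NFS server on each node
    sudo mount -t nfs nfs-server:/export/jars /home/hduser/jars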

HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed. The author 
will in no case be liable for any monetary damages arising from such loss, 
damage or destruction.



On 1 November 2016 at 13:04, Vinod Mangipudi <vinod...@gmail.com> wrote:
unsubscribe

On Tue, Nov 1, 2016 at 8:56 AM, Jan Botorek <jan.boto...@infor.com> wrote:
Thank you for the reply.
I am aware of the parameters used when submitting the tasks (--jars is working 
for us).

But isn't there any way to specify a location (directory) for jars globally,
i.e. in spark-defaults.conf?


From: ayan guha [mailto:guha.a...@gmail.com]
Sent: Tuesday, November 1, 2016 1:49 PM
To: Jan Botorek <jan.boto...@infor.com>
Cc: user <user@spark.apache.org>
Subject: Re: Add jar files on classpath when submitting tasks to Spark


There are options for specifying external jars, such as --jars and
--driver-class-path, depending on the Spark version and cluster manager. Please
see the configuration section of the Spark documentation, and/or run
spark-submit --help to see the available options.
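
A minimal sketch (jar, class, and application names are illustrative):

    # --jars takes a comma-separated list and ships the jars to the executors;
    # --driver-class-path takes a colon-separated list for the driver JVM
    spark-submit \
      --jars /path/to/dep1.jar,/path/to/dep2.jar \
      --driver-class-path /path/to/dep1.jar:/path/to/dep2.jar \
      --class com.example.MyJob myjob.jar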
On 1 Nov 2016 23:13, "Jan Botorek" <jan.boto...@infor.com> wrote:
Hello,
I have a problem making jar files available on the classpath when submitting
tasks to Spark.

In my spark-defaults.conf file I have this configuration:

spark.driver.extraClassPath = path/to/folder/with/jars

With this setting, all jars in the folder are available in SPARK-SHELL.

The problem is that the jars are not on the classpath for SPARK-MASTER; more
precisely, when I submit any job that uses a jar from the external folder,
a java.lang.ClassNotFoundException is thrown.
Moving all the external jars into Spark's jars folder solves the problem, but we
need to keep the external files separate.
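
For illustration, roughly what I run (class and application names are placeholders):

    # works: classes from the external jars resolve in the interactive shell
    spark-shell

    # fails: java.lang.ClassNotFoundException for classes in the external jars
    spark-submit --class com.example.MyJob myjob.jar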

Thank you for any help.
Best regards,
Jan

