There are options to specify external jars, such as --jars and --driver-class-path, depending on the Spark version and cluster manager. Please see the configuration section of the Spark documentation, and/or run spark-submit --help to see the available options. On 1 Nov 2016 23:13, "Jan Botorek" <jan.boto...@infor.com> wrote:
> Hello,
>
> I have a problem trying to add jar files to be available on the classpath when
> submitting a task to Spark.
>
> In my spark-defaults.conf file I have the configuration:
>
> *spark.driver.extraClassPath = path/to/folder/with/jars*
>
> All jars in the folder are available in SPARK-SHELL.
>
> The problem is that the jars are not on the classpath for SPARK-MASTER; more
> precisely, when I submit any job that utilizes any jar from the external
> folder, a *java.lang.ClassNotFoundException* is thrown.
>
> Moving all external jars into the *jars* folder solves the situation, but
> we need to keep the external files separately.
>
> Thank you for any help.
>
> Best regards,
>
> Jan
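
As a sketch of the --jars approach mentioned above (the class name, master URL, and jar paths below are placeholders, not from the original thread):

```shell
# Hypothetical example: application class, master URL, and paths are placeholders.
# --jars ships the listed jars to both the driver and the executors, so classes
# in them are visible cluster-wide, unlike spark.driver.extraClassPath alone.
spark-submit \
  --class com.example.MyApp \
  --master spark://master-host:7077 \
  --jars /path/to/folder/with/jars/dep1.jar,/path/to/folder/with/jars/dep2.jar \
  /path/to/my-app.jar
```

Alternatively, if the external folder must stay in spark-defaults.conf, setting both spark.driver.extraClassPath and spark.executor.extraClassPath to a classpath wildcard such as path/to/folder/with/jars/* should make the jars visible on both sides, assuming the folder exists at that path on every node.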