On Tue, Nov 1, 2016 at 8:56 AM, Jan Botorek <jan.boto...@infor.com> wrote:

> Thank you for the reply.
>
> I am aware of the parameters used when submitting the tasks (--jars is
> working for us).
>
>
>
> But isn’t there any way to specify a jar location (directory) globally,
> i.e., in spark-defaults.conf?
>
>
>
>
>
> *From:* ayan guha [mailto:guha.a...@gmail.com]
> *Sent:* Tuesday, November 1, 2016 1:49 PM
> *To:* Jan Botorek <jan.boto...@infor.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: Add jar files on classpath when submitting tasks to Spark
>
>
>
> There are options to specify external jars, such as --jars and
> --driver-class-path, depending on the Spark version and cluster manager.
> Please see the configuration section of the Spark documentation and/or
> run spark-submit --help to see the available options.
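>
> For example, a minimal spark-submit invocation along those lines might
> look like this (the class name, jar names, and paths are placeholders):
>
>   spark-submit \
>     --class com.example.MyApp \
>     --jars /path/to/jars/dep1.jar,/path/to/jars/dep2.jar \
>     --driver-class-path "/path/to/jars/*" \
>     my-app.jar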
>
> On 1 Nov 2016 23:13, "Jan Botorek" <jan.boto...@infor.com> wrote:
>
> Hello,
>
> I have a problem trying to make jar files available on the classpath
> when submitting tasks to Spark.
>
>
>
> In my spark-defaults.conf file I have this configuration:
>
> *spark.driver.extraClassPath = path/to/folder/with/jars*
>
> All jars in that folder are then available in spark-shell.
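>
> For illustration, a sketch of the relevant spark-defaults.conf entries
> (the /* wildcard and the executor line are additions beyond what we have
> now; on a standard JVM classpath a bare directory matches .class files
> but not the jars inside it, so the wildcard is needed to pick up jars):
>
>   spark.driver.extraClassPath    /path/to/folder/with/jars/*
>   spark.executor.extraClassPath  /path/to/folder/with/jars/*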
>
>
>
> The problem is that the jars are not on the classpath for the Spark
> master; more precisely, when I submit any job that uses a jar from the
> external folder, a *java.lang.ClassNotFoundException* is thrown.
>
> Moving all the external jars into the *jars* folder solves the problem,
> but we need to keep the external files separate.
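>
> A possible workaround sketch, assuming a POSIX shell and placeholder
> names: build the --jars list from the external folder at submit time
> instead of copying the files into the jars folder:
>
>   JARS=$(ls /path/to/folder/with/jars/*.jar | paste -sd, -)
>   spark-submit --jars "$JARS" --class com.example.MyApp my-app.jar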
>
>
>
> Thank you for any help
>
> Best regards,
>
> Jan
>
>
