Hi

With spark-submit we can start a new Spark job, but it cannot add new
jar files to an already running job.
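For example, a concrete invocation attaching extra dependency jars at
launch time might look like the following (the class name, master URL,
and all paths are illustrative placeholders, not values from this thread):

```shell
# Launch a new job with extra dependency jars supplied up front.
# --jars takes a comma-separated list; the listed jars are distributed
# to the driver and executors when the job starts.
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://host:7077 \
  --deploy-mode cluster \
  --jars /opt/libs/dep1.jar,/opt/libs/dep2.jar \
  /opt/apps/my-app.jar arg1 arg2
```

Note that the jars must be listed when the application is submitted;
there is no spark-submit option to inject them into a job that is
already running.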

~Sushil

On Wed, May 23, 2018, 17:28 kedarsdixit <kedarnath_di...@persistent.com>
wrote:

> Hi,
>
> You can add dependencies in spark-submit as below:
>
> ./bin/spark-submit \
>   --class <main-class> \
>   --master <master-url> \
>   --deploy-mode <deploy-mode> \
>   --conf <key>=<value> \
>   --jars <comma-separated list of jar paths> \
>   ... # other options
>   <application-jar> \
>   [application-arguments]
>
> Hope this helps.
>
> Regards,
>
> Kedar Dixit
> Data Science at Persistent Systems Ltd
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
>
