By any chance, are you using Docker to execute?
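
In case it helps, here is a rough sketch of how I would normally pass extra jars
to a Mesos-backed job; the master URL, HDFS port, class name, and paths below
are only placeholders, not taken from your setup:

    # placeholder hosts/ports/paths; app jar and dependency jar both staged on HDFS
    spark-submit \
      --master mesos://mesos-master:5050 \
      --class com.example.MyApp \
      --jars hdfs://namenode:8020/user/libs/third-party.jar \
      hdfs://namenode:8020/user/apps/my-app.jar

As far as I know, the URI given to --jars has to be reachable from every Mesos
slave, since the executors fetch the jars themselves and add them to their
classpath.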
On 11 May 2016 21:16, "Raghavendra Pandey" <raghavendra.pan...@gmail.com>
wrote:

> On 11 May 2016 02:13, "gpatcham" <gpatc...@gmail.com> wrote:
>
> >
>
> > Hi All,
> >
> > I'm using the --jars option in spark-submit to ship 3rd-party jars, but I
> > don't see them actually being passed to the Mesos slaves. I'm getting
> > "class not found" exceptions.
> >
> > This is how I'm using the --jars option:
> >
> > --jars hdfs://namenode:8082/user/path/to/jar
> >
> > Am I missing something here, or what's the correct way to do this?
> >
> > Thanks
> >
> >
> >
