You have kept the 3rd-party jars on HDFS. I don't think executors, as of today,
can download jars from HDFS. Can you try a shared directory instead?
The application jar is downloaded by the executors through an HTTP server.
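
A rough sketch of the shared-directory idea, assuming the jar has already been
copied to the same path on every Mesos slave (the master URL, paths, and class
name below are just placeholders):

  # the jar must exist at this path on every slave, e.g. via an NFS/shared mount
  spark-submit \
    --master mesos://mesos-master:5050 \
    --class com.example.Main \
    --jars file:///opt/shared/libs/third-party.jar \
    /opt/shared/apps/application.jar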

-Raghav
On 12 May 2016 00:04, "Giri P" <gpatc...@gmail.com> wrote:

> Yes, they are reachable. The application jar that I pass as an argument is at
> the same location as the third-party jar, and the application jar does get
> uploaded.
>
> On Wed, May 11, 2016 at 10:51 AM, lalit sharma <lalitkishor...@gmail.com>
> wrote:
>
>> A point to note, per the docs as well:
>>
>> "Note that jars or python files that are passed to spark-submit should be
>> URIs reachable by Mesos slaves, as the Spark driver doesn't automatically
>> upload local jars."
>> http://spark.apache.org/docs/latest/running-on-mesos.html
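>>
>> One rough way to satisfy that (just a sketch; the host, port, and file names
>> are placeholders) is to serve the jar's directory over HTTP so every slave
>> can fetch it, and then pass the http:// URI to --jars:
>>
>>   # serve the directory that holds the third-party jar (Python 2-era syntax)
>>   cd /opt/shared/libs && python -m SimpleHTTPServer 8000 &
>>
>>   # then point --jars at the served URI
>>   spark-submit \
>>     --master mesos://mesos-master:5050 \
>>     --jars http://driver-host:8000/third-party.jar \
>>     /path/to/application.jar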
>>
>> On Wed, May 11, 2016 at 10:05 PM, Giri P <gpatc...@gmail.com> wrote:
>>
>>> I'm not using Docker.
>>>
>>> On Wed, May 11, 2016 at 8:47 AM, Raghavendra Pandey <
>>> raghavendra.pan...@gmail.com> wrote:
>>>
>>>> By any chance, are you using Docker to execute?
>>>> On 11 May 2016 21:16, "Raghavendra Pandey" <
>>>> raghavendra.pan...@gmail.com> wrote:
>>>>
>>>>> On 11 May 2016 02:13, "gpatcham" <gpatc...@gmail.com> wrote:
>>>>>
>>>>> >
>>>>>
>>>>> > Hi All,
>>>>> >
>>>>> > I'm using the --jars option in spark-submit to send 3rd-party jars, but
>>>>> > I don't see them actually being passed to the Mesos slaves, and I'm
>>>>> > getting class-not-found exceptions.
>>>>> >
>>>>> > This is how I'm using the --jars option:
>>>>> >
>>>>> > --jars hdfs://namenode:8082/user/path/to/jar
>>>>> >
>>>>> > Am I missing something here, or what's the correct way to do this?
>>>>> >
>>>>> > Thanks
>>>>> >
>>>>>
>>>>
>>>
>>
>
