Hi Hyung,

Thanks for the response. I have tried this, but it did not work.

regards
Bala

On 25 January 2016 at 11:27, Hyung Sung Shim <hss...@nflabs.com> wrote:

> Hello Balachandar,
> For the third option you tried, the %dep paragraph must be the first one
> executed in the notebook.
> Could you restart Zeppelin and run the "%dep z.load()" paragraph first?
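>
> For example (the jar path and class name below are only placeholders, not
> your actual ones), the paragraph order would be:
>
> %dep
> z.reset()
> z.load("file:///path/to/your.jar")
>
> %spark
> import com.example.YourClass   // run only after the %dep paragraph above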
>
>
> 2016-01-25 14:39 GMT+09:00 Balachandar R.A. <balachandar...@gmail.com>:
>
>> Hi
>>
>> Any help would be greatly appreciated :-)
>>
>>
>> ---------- Forwarded message ----------
>> From: Balachandar R.A. <balachandar...@gmail.com>
>> Date: 21 January 2016 at 14:11
>> Subject: Providing third party jar files to spark
>> To: users@zeppelin.incubator.apache.org
>>
>>
>> Hello
>>
>> My Spark-based map tasks need to access third-party jar files. I found the
>> options below for submitting third-party jar files to the Spark interpreter
>> (rough examples after the list):
>>
>> 1. export SPARK_SUBMIT_OPTIONS=<all the jar files, comma separated> in
>> conf/zeppelin-env.sh
>>
>> 2. add the line spark.jars <all the jar files, comma separated> to
>> <spark>/conf/spark-defaults.conf
>>
>> 3. use z.load("the location of the jar file in the local filesystem") in the
>> Zeppelin notebook
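>>
>> For the first two options, with placeholder jar paths, the entries look
>> roughly like this:
>>
>> # conf/zeppelin-env.sh: jars are passed through spark-submit
>> export SPARK_SUBMIT_OPTIONS="--jars /path/to/first.jar,/path/to/second.jar"
>>
>> # <spark>/conf/spark-defaults.conf: jars for the driver and executors
>> spark.jars  /path/to/first.jar,/path/to/second.jar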
>>
>> I could test the first two and they both work fine. The third one does not
>> work. Here is the snippet I use:
>>
>> %dep
>> z.reset()
>>
>> z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")
>>
>>
>> Further, importing a class that belongs to the above jar file works when I
>> use the statement import com..... in the Zeppelin notebook. However, I get a
>> ClassNotFoundException in the executor for the same class.
>>
>> Any clue here would help greatly
>>
>>
>> regards
>> Bala
>>
>>
>>
>>
>
