1) Does your local program use the native library before submitting
the job to the cluster?

Here is an example of using native code in MR:
https://github.com/brockn/hadoop-thumbnail
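
A minimal sketch of that pattern (hypothetical class and library
names; assumes the .so was shipped with -files so it gets symlinked
into the task's working directory, and uses System.load with an
absolute path to avoid depending on java.library.path):

    // Load a native library from the task's working directory;
    // -files places a symlink to the shipped file there.
    public class NativeLoader {
        static {
            java.io.File lib = new java.io.File("mylibrary.so");
            System.load(lib.getAbsolutePath());
        }
    }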

2) I thought -libjars would work for the local classpath as well as
the remote one. However, to add the jar to your local classpath
explicitly you can run:

env HADOOP_CLASSPATH=my.jar hadoop jar ...
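
For example (hypothetical jar, class, and path names), to make the same
jar visible both locally and on the cluster:

  env HADOOP_CLASSPATH=mylib.jar hadoop jar myapp.jar \
      com.example.MyDriver -libjars mylib.jar input output

HADOOP_CLASSPATH covers the local JVM that runs your driver, while
-libjars ships the jar to the tasks on the cluster.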

Brock


On Thu, Oct 25, 2012 at 7:11 PM, Dipesh Khakhkhar
<dipeshsoftw...@gmail.com> wrote:
> Thanks for answering my query.
>
> 1. I have tried -files path_to_my_library.so while invoking my MR
> application, but I still get UnsatisfiedLinkError: no mylibrary in
> java.library.path.
>
> 2. I have removed the path to my jar from HADOOP_CLASSPATH in
> hadoop-env.sh and provided -libjars path_to_myfile.jar when running my
> MR application (bin/hadoop jar ...), but it failed to load classes from
> the jar file given in the -libjars path. I'm using classes from this
> jar before launching my M/R jobs.
>
> Unfortunately, the above methods didn't work for me.
>
> Thanks.
>
>
> On Thu, Oct 25, 2012 at 4:50 PM, Brock Noland <br...@cloudera.com> wrote:
>>
>> Hi,
>>
>> That should be:
>>
>> -files path_to_my_library.so
>>
>> and to include jars for your MR jobs, you would do:
>>
>> -libjars path_to_my1.jar,path_to_my2.jar
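>>
>> For example, a full invocation might look like this (hypothetical
>> file names; note your driver needs to go through ToolRunner so the
>> generic options are parsed):
>>
>>   hadoop jar myapp.jar com.example.MyDriver \
>>       -files path_to_my_library.so \
>>       -libjars path_to_my1.jar,path_to_my2.jar \
>>       input output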
>>
>> Brock
>>
>> On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
>> <dipeshsoftw...@gmail.com> wrote:
>> > Hi,
>> >
>> > I am a new Hadoop user and have a few very basic questions (they
>> > might sound very stupid to many people, so please bear with me).
>> >
>> > I am running an MR task, and my launcher program needs to load a
>> > library using System.loadLibrary(somelibrary). This works fine if I
>> > put the library in lib/native/Linux-amd64-64. I tried the following:
>> >
>> > 1. provided -files=/path_to_directory_containing_my_library
>> > 2. provided the following in mapred-site.xml (didn't try it in
>> > core-site.xml or hdfs-site.xml):
>> >
>> > -Djava.library.path=/path_to_directory_containing_my_library
>> >
>> > I'm using Hadoop 1.0.3, and this is a single-node cluster for
>> > testing purposes.
>> >
>> > I have a production environment where I'm running 4 data nodes, and
>> > currently I'm copying this file into the lib/native/Linux-amd64-64
>> > folder of each node's Hadoop installation.
>> >
>> > A related question regarding the jars required for running the whole
>> > M/R application: currently I have edited the HADOOP_CLASSPATH
>> > variable in hadoop-env.sh. On the cluster, if I provide the -libjars
>> > option, will that work without editing the classpath? I require this
>> > jar's classes before launching the M/R jobs.
>> >
>> > Also, how can I provide my application jar (i.e. bin/hadoop jar
>> > myjar com.x.x.ProgramName) to the data nodes? Currently I'm copying
>> > it into the lib directory of the Hadoop installation.
>> >
>> > Thanks in advance for answering my queries.
>> >
>> > Thanks.
>>
>>
>>
>> --
>> Apache MRUnit - Unit testing MapReduce -
>> http://incubator.apache.org/mrunit/
>
>



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/
