Re: Loading native libraries

2009-02-11 Thread Rasit OZDAS
I also have the same problem.
It would be wonderful if someone had some info about this.

Rasit

2009/2/10 Mimi Sun m...@rapleaf.com:
 I see UnsatisfiedLinkError. Also, I'm calling
  System.getProperty("java.library.path") in the reducer and logging it. The
 only thing that prints out is
 ...hadoop-0.18.2/bin/../lib/native/Mac_OS_X-i386-32
 I'm using Cascading, not sure if that affects anything.

 - Mimi

 On Feb 10, 2009, at 11:40 AM, Arun C Murthy wrote:


 On Feb 10, 2009, at 11:06 AM, Mimi Sun wrote:

 Hi,

 I'm new to Hadoop and I'm wondering what the recommended method is for
 using native libraries in mapred jobs.
 I've tried the following separately:
 1. set LD_LIBRARY_PATH in .bashrc
 2. set LD_LIBRARY_PATH and  JAVA_LIBRARY_PATH in hadoop-env.sh
 3. set -Djava.library.path=... for mapred.child.java.opts

 For what you are trying (i.e. given that the JNI libs are present on all
 machines at a constant path) setting -Djava.library.path for the child task
 via mapred.child.java.opts should work. What are you seeing?

 Arun


 4. change bin/hadoop to include $LD_LIBRARY_PATH in addition to the path
 it generates: HADOOP_OPTS=$HADOOP_OPTS -Djava.library.path=$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH
 5. drop the .so files I need into hadoop/lib/native/...

 1~3 didn't work; 4 and 5 did, but they seem like hacks. I also read that I can
 do this using DistributedCache, but that seems like extra work for loading
 libraries that are already present on each machine. (I'm using the JNI libs
 for Berkeley DB.)
 It seems that there should be a way to configure java.library.path for
 the mapred jobs.  Perhaps bin/hadoop should make use of LD_LIBRARY_PATH?

 Thanks,
 - Mimi

-- 
M. Raşit ÖZDAŞ


Re: Loading native libraries

2009-02-11 Thread Arun C Murthy


On Feb 10, 2009, at 12:24 PM, Mimi Sun wrote:

I see UnsatisfiedLinkError. Also, I'm calling
System.getProperty("java.library.path") in the reducer and logging it.
The only thing that prints out is:
...hadoop-0.18.2/bin/../lib/native/Mac_OS_X-i386-32

I'm using Cascading, not sure if that affects anything.



Hmm... that's odd. The framework does try to pass the user-provided
java.library.path down to the launched JVM. I assume your
mapred.child.java.opts looks something like

-Xmx<heapsize> -Djava.library.path=<path> ?
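In a config file that would be something like this (the heap size and
library directory below are placeholders, not values from this thread):

```xml
<!-- hadoop-site.xml: JVM options passed to each spawned map/reduce task -->
<property>
  <name>mapred.child.java.opts</name>
  <!-- placeholder values: adjust the heap and the JNI library directory -->
  <value>-Xmx512m -Djava.library.path=/usr/local/lib/jni</value>
</property>
```

Each task JVM then starts with those flags, so logging
System.getProperty("java.library.path") inside a task should show the
configured directory.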

Arun



Loading native libraries

2009-02-10 Thread Mimi Sun

Hi,

I'm new to Hadoop and I'm wondering what the recommended method is for  
using native libraries in mapred jobs.

I've tried the following separately:
1. set LD_LIBRARY_PATH in .bashrc
2. set LD_LIBRARY_PATH and  JAVA_LIBRARY_PATH in hadoop-env.sh
3. set -Djava.library.path=... for mapred.child.java.opts
4. change bin/hadoop to include $LD_LIBRARY_PATH in addition to the path
it generates: HADOOP_OPTS=$HADOOP_OPTS -Djava.library.path=$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH

5. drop the .so files I need into hadoop/lib/native/...
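For reference, the change in (4) amounts to appending something like this
in bin/hadoop (a sketch only; in the real script JAVA_LIBRARY_PATH is the
platform-specific native dir it computes, which is stubbed out here):

```shell
# Sketch of the bin/hadoop tweak: fold LD_LIBRARY_PATH into the
# -Djava.library.path value the script already builds up.
JAVA_LIBRARY_PATH="${JAVA_LIBRARY_PATH:-}"   # computed earlier in the real script; empty here
if [ -n "$LD_LIBRARY_PATH" ]; then
  JAVA_LIBRARY_PATH="$JAVA_LIBRARY_PATH:$LD_LIBRARY_PATH"
fi
HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
echo "$HADOOP_OPTS"
```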

1~3 didn't work; 4 and 5 did, but they seem like hacks. I also read that I
can do this using DistributedCache, but that seems like extra work
for loading libraries that are already present on each machine. (I'm
using the JNI libs for Berkeley DB.)
It seems that there should be a way to configure java.library.path for  
the mapred jobs.  Perhaps bin/hadoop should make use of LD_LIBRARY_PATH?
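For what it's worth, the reason all of these revolve around
java.library.path: a JNI library is resolved by System.loadLibrary, which
searches only that path. A minimal, self-contained illustration ("db_java"
is a hypothetical library name, not one from my setup):

```java
// Prints the JVM's native search path, then attempts to load a JNI
// library by name; the lookup consults java.library.path only.
public class LoadCheck {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.library.path"));
        try {
            System.loadLibrary("db_java"); // hypothetical JNI library name
            System.out.println("loaded");
        } catch (UnsatisfiedLinkError e) {
            // This is the error you see when the path lacks the .so
            System.out.println("UnsatisfiedLinkError: " + e.getMessage());
        }
    }
}
```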


Thanks,
- Mimi


Re: Loading native libraries

2009-02-10 Thread Arun C Murthy


On Feb 10, 2009, at 11:06 AM, Mimi Sun wrote:


Hi,

I'm new to Hadoop and I'm wondering what the recommended method is  
for using native libraries in mapred jobs.

I've tried the following separately:
1. set LD_LIBRARY_PATH in .bashrc
2. set LD_LIBRARY_PATH and  JAVA_LIBRARY_PATH in hadoop-env.sh
3. set -Djava.library.path=... for mapred.child.java.opts


For what you are trying (i.e. given that the JNI libs are present on  
all machines at a constant path) setting -Djava.library.path for the  
child task via mapred.child.java.opts should work. What are you seeing?


Arun



4. change bin/hadoop to include $LD_LIBRARY_PATH in addition to the path
it generates: HADOOP_OPTS=$HADOOP_OPTS -Djava.library.path=$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH

5. drop the .so files I need into hadoop/lib/native/...

1~3 didn't work; 4 and 5 did, but they seem like hacks. I also read that
I can do this using DistributedCache, but that seems like extra
work for loading libraries that are already present on each machine.
(I'm using the JNI libs for Berkeley DB.)
It seems that there should be a way to configure java.library.path  
for the mapred jobs.  Perhaps bin/hadoop should make use of  
LD_LIBRARY_PATH?


Thanks,
- Mimi




Re: Loading native libraries

2009-02-10 Thread Mimi Sun
I see UnsatisfiedLinkError. Also, I'm calling
System.getProperty("java.library.path") in the reducer and logging it.
The only thing that prints out is:
...hadoop-0.18.2/bin/../lib/native/Mac_OS_X-i386-32

I'm using Cascading, not sure if that affects anything.

- Mimi

On Feb 10, 2009, at 11:40 AM, Arun C Murthy wrote:



On Feb 10, 2009, at 11:06 AM, Mimi Sun wrote:


Hi,

I'm new to Hadoop and I'm wondering what the recommended method is  
for using native libraries in mapred jobs.

I've tried the following separately:
1. set LD_LIBRARY_PATH in .bashrc
2. set LD_LIBRARY_PATH and  JAVA_LIBRARY_PATH in hadoop-env.sh
3. set -Djava.library.path=... for mapred.child.java.opts


For what you are trying (i.e. given that the JNI libs are present on  
all machines at a constant path) setting -Djava.library.path for the  
child task via mapred.child.java.opts should work. What are you  
seeing?


Arun



4. change bin/hadoop to include $LD_LIBRARY_PATH in addition to
the path it generates: HADOOP_OPTS=$HADOOP_OPTS -Djava.library.path=$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH

5. drop the .so files I need into hadoop/lib/native/...

1~3 didn't work; 4 and 5 did, but they seem like hacks. I also read that
I can do this using DistributedCache, but that seems like extra
work for loading libraries that are already present on each
machine. (I'm using the JNI libs for Berkeley DB.)
It seems that there should be a way to configure java.library.path  
for the mapred jobs.  Perhaps bin/hadoop should make use of  
LD_LIBRARY_PATH?


Thanks,
- Mimi