On Mar 8, 2011, at 1:21 PM, Ratner, Alan S (IS) wrote:
We had tried putting all the libraries directly in HDFS with a pointer in
mapred-site.xml:
<property>
  <name>mapred.child.env</name>
  <value>LD_LIBRARY_PATH=/user/ngc/lib</value>
</property>
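One thing worth noting (my assumption, not something the thread confirms): the dynamic linker resolves LD_LIBRARY_PATH entries against the local filesystem of each worker, so an HDFS path like /user/ngc/lib is invisible to it unless the same directory also exists on local disk. A quick sanity check, as a sketch:

```shell
# Sketch: flag LD_LIBRARY_PATH entries that do not exist on the local
# filesystem; an HDFS-only path such as /user/ngc/lib would show up here.
check_lib_path() {
  echo "$1" | tr ':' '\n' | while read -r d; do
    [ -d "$d" ] || echo "not a local directory: $d"
  done
}
check_lib_path "/user/ngc/lib"
```

Run this on a worker node; any path it flags would also be invisible to the task JVM's dlopen.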
-----Original Message-----
From: Ratner, Alan S (IS) [mailto:alan.rat...@ngc.com]
Sent: Friday, March 04, 2011 3:53 PM
To: common-user@hadoop.apache.org
Subject: EXT :Problem running a Hadoop program with external libraries
From: Aaron Kimball [mailto:akimbal...@gmail.com]
Sent: Friday, March 04, 2011 4:30 PM
To: common-user@hadoop.apache.org
Cc: Ratner, Alan S (IS)
Subject: EXT :Re: Problem running a Hadoop program with external
libraries
Actually, I just misread your email and missed the difference between
your 2nd and 3rd attempts.
We are having difficulties running a Hadoop program making calls to external
libraries, but this occurs only when we run the program on our cluster and not
from within Eclipse, where we are apparently running in Hadoop's standalone
mode. This program invokes the Open Computer Vision libraries in support of
500 image processing algorithms.
I'm only guessing here and might be grossly wrong about my hunch.
Are you reusing your JVMs across tasks? Could you see if this goes
away without reuse?
It would be good if you can monitor your launched tasks
(JConsole/VisualVM/etc.) to affirm whether there's a code-based
memory leak or some other cause.
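As a coarser complement to JConsole/VisualVM (my sketch, not from the thread): watch the resident set size of the task JVM over time. RSS growing while the Java heap stays flat points at a native (JNI) leak rather than Java objects.

```shell
# Sketch: snapshot resident (RSS) and virtual (VSZ) size, in KB, of a task
# JVM. TASK_PID is a placeholder -- find the real child JVM pid with jps.
TASK_PID=$$   # stand-in pid so the sketch is runnable as-is
ps -o pid,rss,vsz -p "$TASK_PID"
```

Repeat the snapshot every few seconds and compare against the heap numbers JConsole reports.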
I don't know if putting native-code .so files inside a jar works. A
native-code .so is not classloaded in the same way .class files are.
So the correct .so files probably need to exist in some physical directory
on the worker machines. You may want to double-check that the correct
directory on the worker machines is included in LD_LIBRARY_PATH.
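Building on that point, a sketch with /usr/local/ngc/lib and libopencv_core.so as purely hypothetical names: once the .so files are on each worker's disk, ldd can confirm that the library and its transitive dependencies resolve there.

```shell
# Sketch: verify a native library is physically present on this worker and
# that its own dependencies resolve; any "not found" line from ldd means
# the task JVM's dlopen would fail too. Path and filename are hypothetical.
lib=/usr/local/ngc/lib/libopencv_core.so
if [ -f "$lib" ]; then
  ldd "$lib" | grep "not found" || echo "all dependencies resolved"
else
  echo "missing on this worker: $lib"
fi
```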
Actually, I just misread your email and missed the difference between your
2nd and 3rd attempts.
Are you enforcing min/max JVM heap sizes on your tasks? Are you enforcing a
ulimit (either through your shell configuration, or through Hadoop itself)?
I don't know where these "cannot allocate memory" errors are coming from.
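On the ulimit question, a sketch of what to check on a worker (the specific limits are my guesses at likely culprits, not from the thread): native allocations made by JNI libraries live outside the JVM heap, so a low virtual-memory or process limit can produce allocation failures even when -Xmx looks generous.

```shell
# Sketch: limits the task JVM inherits from its launch environment.
ulimit -v   # max virtual memory (KB); a low value can fail native malloc
ulimit -u   # max user processes; also caps threads the JVM may create
ulimit -n   # max open file descriptors
```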