The problem is resolved in the next release of Hadoop (2.0.3-alpha; cf.
MAPREDUCE-1700).

For Hadoop 1.x-based releases/distributions, put
-Dmapreduce.user.classpath.first=true on the hadoop command line and/or in
the client config.
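
For example, here is a minimal driver sketch (the class, jar, and path names
are placeholders, and the property name is the Hadoop 1.x one); running it
through ToolRunner is what makes the -D and -libjars flags take effect:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MyDriver extends Configured implements Tool {

      @Override
      public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Ask the task classloader to put user jars (e.g. Jackson 1.9.11
        // shipped via -libjars or the job jar's lib/ dir) ahead of the jars
        // in $HADOOP_HOME/lib.
        conf.setBoolean("mapreduce.user.classpath.first", true);

        Job job = new Job(conf, "jackson-conflict-example");
        job.setJarByClass(MyDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
      }

      public static void main(String[] args) throws Exception {
        // ToolRunner parses the generic options, so the same thing can be
        // done from the command line without touching the code, e.g.:
        //   hadoop jar my-job.jar MyDriver \
        //     -Dmapreduce.user.classpath.first=true \
        //     -libjars jackson-core-asl-1.9.11.jar,jackson-mapper-asl-1.9.11.jar \
        //     /input /output
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
      }
    }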


On Tue, Mar 12, 2013 at 6:49 AM, Jane Wayne <jane.wayne2...@gmail.com> wrote:

> hi,
>
> i need to know how to resolve conflicts with jar dependencies.
>
> * first, my job requires Jackson JSON-processor v1.9.11.
> * second, the hadoop cluster has Jackson JSON-processor v1.5.2. the
> jars are installed in $HADOOP_HOME/lib.
>
> according to this link,
> http://blog.cloudera.com/blog/2011/01/how-to-include-third-party-libraries-in-your-map-reduce-job/,
> there are 3 ways to include 3rd party libraries in a map/reduce (mr)
> job.
> * use the -libjars flag
> * include the dependent libraries in the executing jar file's /lib
> directory
> * put the jars in the $HADOOP_HOME/lib directory
>
> i can report that using -libjars and including the libraries in my
> jar's /lib directory "do not work" (in my case of jar conflicts). i
> still get a NoSuchMethodException. the only way to get my job to run
> is the last option, placing the newer jars in $HADOOP_HOME/lib. the
> last option is fine on a sandbox or development instance, but there
> are some political difficulties (not only technical) in modifying our
> production environment.
>
> my questions/concerns are:
> 1. how come the -libjars and /lib options do not work? how does class
> loading work in mr tasks?
> 2. is there another option available that i am not aware of to get the
> job's dependent jars to "overwrite" what's in $HADOOP_HOME/lib at
> runtime of the mr tasks?
>
>
> any help is appreciated. thank you all.
>
