Hi Pankaj

AFAIK you can do that. Just provide the properties like the mapper class,
reducer class, input format, output format, etc. using the -D option at run time.
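
Something along these lines should work (just a rough, untested sketch; GenericDriver
and the example class/jar names are only placeholders, and the property names below are
the new-API ones used on 0.23). ToolRunner feeds the -D properties into the job's
Configuration, and Job.setJar() points the framework at the jar that holds the Mapper
and Reducer, so the driver never needs those classes at compile time:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Generic driver: the mapper, reducer and formats come in as -D properties and
// the job jar is passed as a plain path, so nothing is a compile-time dependency.
//
// Example invocation (all class names are placeholders):
//   hadoop jar driver.jar GenericDriver \
//     -D mapreduce.job.map.class=com.example.MyMapper \
//     -D mapreduce.job.reduce.class=com.example.MyReducer \
//     -D mapreduce.job.output.key.class=org.apache.hadoop.io.Text \
//     -D mapreduce.job.output.value.class=org.apache.hadoop.io.LongWritable \
//     /local/path/my-mr-job.jar /input/path /output/path
public class GenericDriver extends Configured implements Tool {

  public int run(String[] args) throws Exception {
    // getConf() already holds every -D property thanks to ToolRunner.
    Job job = Job.getInstance(getConf(), "generic-job");

    // Point the framework at the jar containing the Mapper/Reducer classes
    // instead of calling setJarByClass(SomeMapper.class).
    job.setJar(args[0]);

    FileInputFormat.addInputPath(job, new Path(args[1]));
    FileOutputFormat.setOutputPath(job, new Path(args[2]));

    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new GenericDriver(), args));
  }
}

The jar passed to setJar() just needs to be on the client's local filesystem; the
framework ships it to the cluster with the job as usual.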



Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Pankaj Gupta <pan...@brightroll.com>
Date: Tue, 20 Nov 2012 20:49:29 
To: user@hadoop.apache.org
Reply-To: user@hadoop.apache.org
Subject: Supplying a jar for a map-reduce job

Hi,

I am running map-reduce jobs on a Hadoop 0.23 cluster. Right now I supply the jar
to use for running the map-reduce job via the setJarByClass function on
org.apache.hadoop.mapreduce.Job. This makes my code depend on a class in the MR
job at compile time. What I want is to be able to run an MR job without being
dependent on it at compile time. It would be great if I could use a jar that
contains the Mapper and Reducer classes and just pass it in to run the map-reduce
job. That would make it easy to choose an MR job to run at runtime. Is that
possible?


Thanks in Advance,
Pankaj
