Thanks Steve. I was able to solve the problem by moving the set() calls before job creation, as Amogh suggested. However, I will also try your solution.
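For anyone hitting the same issue: the fix works because the Job constructor takes a copy of the Configuration, so set() calls made afterwards never reach the tasks. The sketch below illustrates that copy-at-construction behavior with a small plain-Java stand-in (FakeJob is hypothetical and only mimics what new Job(conf) does; Hadoop is not required to run it):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for org.apache.hadoop.mapreduce.Job: like Job,
// it snapshots (copies) the configuration at construction time.
class FakeJob {
    private final Map<String, String> conf;

    FakeJob(Map<String, String> conf) {
        this.conf = new HashMap<>(conf); // copy, as new Job(conf) does
    }

    String get(String key) {
        return conf.get(key);
    }
}

public class ConfigOrderDemo {
    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();

        conf.put("tag1", "string_value"); // set BEFORE job creation: visible to the job
        FakeJob job = new FakeJob(conf);
        conf.put("tag2", "42");           // set AFTER job creation: never seen

        System.out.println(job.get("tag1")); // string_value
        System.out.println(job.get("tag2")); // null
    }
}
```

The same ordering rule applies to the real API: call config.set()/setInt() first, then construct the Job from that Configuration.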
On Tue, Jan 5, 2010 at 1:24 PM, Steve Kuo <kuosen...@gmail.com> wrote:

> There seems to have been a change between the 0.19 and 0.20 APIs in that 0.20 no
> longer sets "map.input.file". config.set(), as far as I can tell, should work. I,
> however, use the following to pass the parameters:
>
>     String[] params = new String[] { "-D", "tag1=string_value", ... };
>
>     ToolRunner.run(new Configuration(), new SomeJob(), params);
>
> On Mon, Jan 4, 2010 at 9:52 AM, Farhan Husain <farhan.hus...@csebuet.org> wrote:
>
>> Hello all,
>>
>> I am using hadoop-0.20.1. I need to know the input file name in my map
>> tasks and to pass an integer and a string to my reduce tasks. I used
>> the following method calls for that:
>>
>>     config.set("tag1", "string_value");
>>     config.setInt("tag2", int_value);
>>
>> In the setup() method of the mapper:
>>
>>     String filename = context.getConfiguration().get("map.input.file"); // returns null
>>
>> In the setup() method of the reducer:
>>
>>     String val = context.getConfiguration().get("tag1");        // returns null
>>     int n = context.getConfiguration().getInt("tag2", def_val); // returns def_val
>>
>> Can anyone please tell me what may be wrong with this code or anything
>> related to it? Is it a bug in this version of Hadoop? Is there any
>> alternative way to accomplish the same objective? I have been stuck on this
>> problem for about a week. I would appreciate it if someone would shed some
>> light on it.
>>
>> Thanks,
>> Farhan