Hi Arko,

    What is the value of 'no_of_reduce_tasks'?

If the number of reduce tasks is 0, the map tasks write their output
directly into the job output path, and the reduce phase never runs.
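
For example (a minimal sketch reusing the job variable from your driver
below; the comment describes the framework's documented behaviour):

// 0 reducers turns this into a map-only job: the shuffle and reduce
// phases are skipped and each map task writes its records straight
// into Output_Path, even though setReducerClass() was called.
job_first.setNumReduceTasks(0);

So if 'no_of_reduce_tasks' happens to be 0 at submission time, this is
exactly the behaviour you would see.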

Thanks
Devaraj

________________________________________
From: Arko Provo Mukherjee [arkoprovomukher...@gmail.com]
Sent: Tuesday, April 17, 2012 10:32 AM
To: mapreduce-user@hadoop.apache.org
Subject: Reducer not firing

Dear All,

I am porting code from the old API to the new API (Context objects)
and running it on Hadoop 0.20.203. The driver looks like this:

Job job_first = new Job();

job_first.setJarByClass(My.class);
job_first.setNumReduceTasks(no_of_reduce_tasks);
job_first.setJobName("My_Job");

FileInputFormat.addInputPath(job_first, new Path(Input_Path));
FileOutputFormat.setOutputPath(job_first, new Path(Output_Path));

job_first.setMapperClass(Map_First.class);
job_first.setReducerClass(Reduce_First.class);

// intermediate (map output) types
job_first.setMapOutputKeyClass(IntWritable.class);
job_first.setMapOutputValueClass(Text.class);

// final (reduce output) types
job_first.setOutputKeyClass(NullWritable.class);
job_first.setOutputValueClass(Text.class);

job_first.waitForCompletion(true);

The problem I am facing is that, instead of emitting values to the
reducers, the mappers are writing their output directly into the
Output_Path, and the reducers are not processing anything.

Following the online materials that are available, both my map and
reduce methods use context.write to emit their values.
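
For reference, the two classes are shaped roughly like this (a sketch
with placeholder bodies; the input key/value types assume the default
TextInputFormat, and the imports are the usual org.apache.hadoop.io and
org.apache.hadoop.mapreduce ones):

public static class Map_First
    extends Mapper<LongWritable, Text, IntWritable, Text> {

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // placeholder logic: key each input line by its length
    context.write(new IntWritable(value.getLength()), value);
  }
}

public static class Reduce_First
    extends Reducer<IntWritable, Text, NullWritable, Text> {

  @Override
  protected void reduce(IntWritable key, Iterable<Text> values,
      Context context) throws IOException, InterruptedException {
    // placeholder logic: pass every value through
    for (Text v : values) {
      context.write(NullWritable.get(), v);
    }
  }
}

I kept the @Override annotations because a reduce method whose signature
does not match the new API (for example, taking an Iterator instead of
an Iterable) is never called, and the default identity implementation
runs instead.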

Please help. Thanks a lot in advance!!

Warm regards
Arko
