Hi,

Please note that you are referring to a very old version of Hadoop. The
current stable release is Hadoop 1.x, and the API has changed in 1.x. Take a
look at the WordCount example here:
http://hadoop.apache.org/docs/r1.0.4/mapred_tutorial.html#Example%3A+WordCount+v2.0


That said, in principle your method should work. I wrote it in a similar
fashion using the new API and it worked fine. Can you show the code of your
driver program (i.e. where you have main)?
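
If it helps, here is a minimal sketch of roughly what I mean, written
against the new (org.apache.hadoop.mapreduce) API. The class names and the
"basecount" property just mirror your example, so treat it as an
illustration rather than drop-in code:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Reducer;

public class OffsetWordCount {

  // Reducer that starts every count from a configured base value.
  public static class OffsetReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private int baseSum;

    @Override
    protected void setup(Context context) {
      // In the new API, setup() replaces configure(); the job
      // configuration is reached through the context.
      baseSum = context.getConfiguration().getInt("basecount", 0);
    }

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      int sum = baseSum;
      for (IntWritable value : values) {
        sum += value.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Set the property *before* the job is submitted, so it travels
    // with the job configuration to the reduce tasks.
    conf.setInt("basecount", 200000);

    Job job = new Job(conf, "offset wordcount");
    job.setJarByClass(OffsetWordCount.class);
    job.setReducerClass(OffsetReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Mapper and input/output path setup omitted for brevity.
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The important part is that the value is set on the configuration the job is
actually submitted with, which is why seeing your driver would help.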

Thanks
hemanth



On Tue, Jan 22, 2013 at 5:22 AM, jamal sasha <jamalsha...@gmail.com> wrote:

> Hi,
>   Let's say I have the standard hello-world (WordCount) program:
>
> http://hadoop.apache.org/docs/r0.17.0/mapred_tutorial.html#Example%3A+WordCount+v2.0
>
> Now, let's say I want to start the counting not from zero but from 200000.
> So my base value is 200000.
>
> I modified the Reduce code as follows:
>   public static class Reduce extends MapReduceBase
>       implements Reducer<Text, IntWritable, Text, IntWritable> {
>
>     private static int baseSum;
>
>     public void configure(JobConf job) {
>       baseSum = Integer.parseInt(job.get("basecount"));
>     }
>
>     public void reduce(Text key, Iterator<IntWritable> values,
>         OutputCollector<Text, IntWritable> output, Reporter reporter)
>         throws IOException {
>       int sum = baseSum;
>       while (values.hasNext()) {
>         sum += values.next().get();
>       }
>       output.collect(key, new IntWritable(sum));
>     }
>   }
>
>
> And in main I added:
>
>   conf.setInt("basecount", 200000);
>
>
>
> So my hope was that this would do the trick..
> But it's not working; the code runs normally, as if nothing had changed. :(
> How do I resolve this?
> Thanks
>
