Try putting @Override before your reduce method to make sure you're overriding
the method properly. You'll get a compile-time error if not.
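For instance (sketched with stand-in classes since I don't have the Hadoop jars in front of me, but the same mechanism applies to Reducer.reduce):

```java
import java.util.Arrays;
import java.util.Iterator;

// Stand-in for a framework base class such as Hadoop's Reducer.
class BaseReducer {
    // The framework calls this; subclasses are expected to override it.
    public String reduce(Iterable<String> values) {
        return "default";
    }
}

// Typo: Iterator instead of Iterable. Without @Override this compiles
// silently as an unrelated overload, so the framework keeps calling the
// default reduce. Uncommenting @Override makes it a compile-time error.
class BrokenReducer extends BaseReducer {
    // @Override   // <- would fail to compile: method does not override
    public String reduce(Iterator<String> values) {
        return "custom";
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        BaseReducer r = new BrokenReducer();
        // Dispatches on the Iterable overload: the subclass method is skipped.
        System.out.println(r.reduce(Arrays.asList("a", "b")));  // prints "default"
    }
}
```

The point is that the subclass method is silently treated as an overload, which is exactly the failure mode @Override exists to catch.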
-Steven Willis
From: Bejoy KS [mailto:bejoy.had...@gmail.com]
Sent: Tuesday, April 17, 2012 10:03 AM
To: mapreduce-user@hadoop.apache.org
> There are still some shadows over API deprecation, however. We do want
> to support both APIs for the longer term, per our earlier discussions on
> this list. However, this is a developer nightmare again, because the docs
> would go inconsistent and questions would arise on what to pick.
>
> We'd have to pick one soon.

If we'd have to pick one soon, shouldn't at least the javadoc examples be
updated to use only the new APIs, so that developers have some indication of
where they should be going with their code?
-Steven Willis
You said we're better off avoiding it if we can, as it will drastically impact
performance. Would it be more efficient to simply copy the byte array returned
from getBytes() to do the comparison?
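To make sure I understand the hazard: Text reuses its backing buffer across records, so a reference returned by getBytes() can be overwritten by the next record unless you copy the valid range first. A stand-in sketch (the class and method names here are mine, chosen to mirror the Text API, since I can't paste the Hadoop source):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Stand-in mimicking how Hadoop's Text reuses its backing buffer across
// records. ReusableText and its methods are illustrative, not the real class.
class ReusableText {
    private byte[] buf = new byte[0];
    private int len = 0;

    void set(String s) {
        byte[] b = s.getBytes(StandardCharsets.UTF_8);
        if (b.length > buf.length) buf = new byte[b.length]; // grow, never shrink
        System.arraycopy(b, 0, buf, 0, b.length);
        len = b.length;
    }

    byte[] getBytes() { return buf; }  // backing array: valid bytes are [0, len)
    int getLength()   { return len; }
}

public class CopyDemo {
    public static void main(String[] args) {
        ReusableText t = new ReusableText();
        t.set("first-value");

        byte[] aliased = t.getBytes();                               // shares the buffer
        byte[] copied  = Arrays.copyOf(t.getBytes(), t.getLength()); // defensive copy

        t.set("second!!!!!");  // same length: buffer is overwritten in place

        // The aliased array now shows the new record; only the copy survives.
        System.out.println(new String(aliased, StandardCharsets.UTF_8)); // second!!!!!
        System.out.println(new String(copied, StandardCharsets.UTF_8));  // first-value
    }
}
```

Note also that getBytes() can return an array longer than getLength(), so copying only up to getLength() matters even without reuse.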
-Steven Willis
-----Original Message-----
From: Harsh J [mailto:ha...@cloudera.com]
Sent: Friday, March 16, 2
;
}
}
}
}
/*/
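Incidentally, the de-duplication itself is simple enough in plain Java; here's a sketch with LinkedHashSet and String standing in for the Hadoop types (the helper name dedupe is mine):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class DedupSketch {
    // Core of a duplicate-eliminating reduce: collect values into a
    // LinkedHashSet so repeats are dropped while first-seen order is kept.
    // In the real Reducer these would be Text values and context.write calls.
    static List<String> dedupe(Iterable<String> values) {
        LinkedHashSet<String> seen = new LinkedHashSet<>();
        for (String v : values) {
            seen.add(v);  // no-op if v was already seen
        }
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) {
        System.out.println(dedupe(List.of("a", "b", "a", "c", "b")));  // [a, b, c]
    }
}
```

One caveat when translating this back to Hadoop: the framework reuses the object handed out by the values iterable, so each value needs to be copied (e.g. with Text's copy constructor) before going into the set, which ties back to the getBytes() discussion above.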
My description of the problem stands.
-Steven Willis
I'm trying to write a Reducer which will eliminate duplicates from the list of
values before writing them out. I have the following code for my Reducer:
/*/
public class ClickStreamIndexerReducer extends Reducer<Text, Text, Text, Text> {
    @Override
    public void reduce(Text dirName, Iterable<Text> values,