Hi Parkirat,

I don't think HBase is causing the problems. You might already know this,
but you need to add the reducer class to the job just as you add the
mapper. Also, if you want to read from an HBase table in a MapReduce job,
your mapper needs to extend TableMapper, and if you want to write to a
file on HDFS instead of an HBase table, your reducer needs to extend the
generic Reducer class.
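
For example, here's a rough sketch of the wiring I mean. Treat it as a
minimal sketch, not your job: the class names, table name, column, and
output path are all placeholders, and it assumes the 0.94-era
org.apache.hadoop.hbase.mapreduce API.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class HBaseToFileJob {

  // Reading from HBase: the mapper extends TableMapper, which fixes the
  // input types to (ImmutableBytesWritable, Result).
  static class MyMapper extends TableMapper<Text, Text> {
    @Override
    protected void map(ImmutableBytesWritable row, Result result, Context context)
        throws IOException, InterruptedException {
      // "cf"/"qual" are placeholders for whatever column you actually read.
      byte[] v = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("qual"));
      if (v != null) {
        context.write(
            new Text(Bytes.toString(row.get(), row.getOffset(), row.getLength())),
            new Text(Bytes.toString(v)));
      }
    }
  }

  // Writing to a file on HDFS: the reducer extends the plain Reducer.
  static class MyReducer extends Reducer<Text, Text, Text, NullWritable> {
    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
        throws IOException, InterruptedException {
      // All values for a key arrive in one call, so writing the key once
      // here emits it exactly once, even if it appeared many times.
      context.write(key, NullWritable.get());
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "hbase-to-file");
    job.setJarByClass(HBaseToFileJob.class);

    Scan scan = new Scan();
    scan.setCaching(500);        // sensible defaults for MR scans
    scan.setCacheBlocks(false);

    // Wires up the table, the scan, the mapper, and the mapper output types.
    TableMapReduceUtil.initTableMapperJob("my_table", scan,
        MyMapper.class, Text.class, Text.class, job);

    // This is the step that is easy to miss: without it the identity
    // reducer runs and your reducer class is silently ignored.
    job.setReducerClass(MyReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(NullWritable.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    FileOutputFormat.setOutputPath(job, new Path("/tmp/hbase-export"));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Note that a reducer like the one above would also address your
duplicate-key issue: the framework groups all values for a key into a
single reduce() call, so writing the key once per call emits it only once.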

But again, if you can show everyone the code, people will be able to help
you better.

Cheers,
Arun

Sent from a mobile device. Please don't mind the typos.
On Jul 31, 2014 1:07 PM, "Nick Dimiduk" <ndimi...@gmail.com> wrote:

> Hi Parkirat,
>
> I don't follow the reducer problem you're having. Can you post your code
> that configures the job? I assume you're using TableMapReduceUtil
> someplace.
>
> Your reducer is removing duplicate values? Sounds like you need to update
> its logic to only emit a value once. Pastebin-ing your reducer code may be
> helpful as well.
>
> -n
>
>
> On Thu, Jul 31, 2014 at 8:20 AM, Parkirat <parkiratbigd...@gmail.com>
> wrote:
>
> > Hi All,
> >
> > I am using the MapReduce API to read an HBase table, based on a scan
> > operation in the mapper, and to write the data to a file in the reducer.
> > I am using HBase version 0.94.5.23.
> >
> > *Problem:*
> > Now in my job, my mapper outputs a Text key and a Text value, and my
> > reducer outputs a Text key and a NullWritable value. However, it seems
> > the *HBase MapReduce API does not consider the reducer*, and the job
> > outputs both key and value as Text.
> >
> > Moreover, if the same key comes twice, it goes to the file twice, even
> > though my reducer is meant to emit it only once.
> >
> > Could anybody help me with this problem?
> >
> > Regards,
> > Parkirat Singh Bagga.
> >
>