If you're calling HFileOutputFormat.configureIncrementalLoad, that should
be setting up the Serialization for you.
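
(If memory serves, in 0.98 that method appends the HBase serializations
to the job conf, along the lines of:

    conf.setStrings("io.serializations", conf.get("io.serializations"),
        MutationSerialization.class.getName(),
        ResultSerialization.class.getName(),
        KeyValueSerialization.class.getName());

so Puts emitted by the mapper can be serialized between stages.)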

Can you look at the job configuration and see what's present for the key
"io.serializations"?

-Sean

On Sun, Nov 2, 2014 at 3:53 PM, Serega Sheypak <serega.shey...@gmail.com>
wrote:

> I use it to prepare HFiles using my custom mapper emitting Puts, plus
>
>   HFileOutputFormat.configureIncrementalLoad(job, createHTable())
>   // connection to the target table
>
> and then bulk load the data into the table using LoadIncrementalHFiles.
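>
> Roughly like this in the driver (a sketch; conf is the job's
> Configuration and hfileOutputDir is a placeholder for the job's output
> path):
>
>     LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
>     loader.doBulkLoad(new Path(hfileOutputDir), createHTable());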
>
> P.S.
> HFileOutputFormat is also deprecated... so many changes... (((
>
>
> 2014-11-03 0:41 GMT+03:00 Sean Busbey <bus...@cloudera.com>:
>
> > In the 0.94.x API, Put implemented Writable[1]. This meant that MR code,
> > like yours, could use it as a Key or Value between Mapper and Reducer.
> >
> > In 0.96 and later APIs, Put no longer directly implements Writable[2].
> > Instead, HBase now includes a Hadoop Serialization implementation.
> > Normally, this would be configured via the TableMapReduceUtil class for
> > either a TableMapper or TableReducer.
> >
> > Presuming that the intention of your MR job is to have all the Puts
> > write to some HBase table, you should be able to follow the "write to
> > HBase" part of the examples for reading and writing HBase via
> > mapreduce in the reference guide[3].
> >
> > Specifically, you should have your Driver call one of the
> > initTableReducerJob methods on TableMapReduceUtil, where it currently
> > sets the Mapper class for your application[4].
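> >
> > For example (a sketch; the table name is a placeholder):
> >
> >     TableMapReduceUtil.initTableReducerJob(
> >         "target_table",   // output table
> >         null,             // reducer class; none needed when the Mapper emits the Puts
> >         job);
> >     job.setNumReduceTasks(0);  // write from the map phase, per the guide's example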
> >
> > -Sean
> >
> > [1]: http://hbase.apache.org/0.94/apidocs/org/apache/hadoop/hbase/client/Put.html
> > [2]: http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/client/Put.html
> > [3]: http://hbase.apache.org/book/mapreduce.example.html
> > [4]: http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
> >
> >
> > On Sun, Nov 2, 2014 at 3:02 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:
> >
> > > Hi, I'm migrating from CDH4 to CDH5 (HBase 0.98.6-cdh5.2.0).
> > > I had a unit test for the mapper used to create HFiles and bulk load
> > > them later.
> > >
> > > I've bumped the Maven deps from CDH4 to CDH5 (0.98.6-cdh5.2.0).
> > > Now I've started to get this exception:
> > >
> > > java.lang.IllegalStateException: No applicable class implementing
> > > Serialization in conf at io.serializations: class
> > > org.apache.hadoop.hbase.client.Put
> > >   at com.google.common.base.Preconditions.checkState(Preconditions.java:149)
> > >   at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:75)
> > >   at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:97)
> > >   at org.apache.hadoop.mrunit.internal.output.MockOutputCollector.collect(MockOutputCollector.java:48)
> > >   at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper$4.answer(AbstractMockContextWrapper.java:90)
> > >   at org.mockito.internal.stubbing.StubbedInvocationMatcher.answer(StubbedInvocationMatcher.java:34)
> > >   at org.mockito.internal.handler.MockHandlerImpl.handle(MockHandlerImpl.java:91)
> > >   at org.mockito.internal.handler.NullResultGuardian.handle(NullResultGuardian.java:29)
> > >   at org.mockito.internal.handler.InvocationNotifierHandler.handle(InvocationNotifierHandler.java:38)
> > >   at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:51)
> > >   at org.apache.hadoop.mapreduce.Mapper$Context$$EnhancerByMockitoWithCGLIB$$ba4633fb.write(<generated>)
> > >
> > >
> > > And here is the mapper code:
> > >
> > > public class ItemRecommendationHBaseMapper
> > >     extends Mapper<LongWritable, BytesWritable, ImmutableBytesWritable, Put> {
> > >
> > >     private final ImmutableBytesWritable hbaseKey = new ImmutableBytesWritable();
> > >     private final DynamicObjectSerDe<ItemRecommendation> serde =
> > >         new DynamicObjectSerDe<ItemRecommendation>(ItemRecommendation.class);
> > >
> > >     @Override
> > >     protected void map(LongWritable key, BytesWritable value, Context context)
> > >             throws IOException, InterruptedException {
> > >         checkPreconditions(key, value);
> > >         hbaseKey.set(Bytes.toBytes(key.get()));
> > >
> > >         ItemRecommendation item = serde.deserialize(value.getBytes());
> > >         checkPreconditions(item);
> > >         Put put = PutFactory.createPut(serde, item, getColumnFamily());
> > >
> > >         context.write(hbaseKey, put); // Exception here
> > >     }
> > >
> > > What can I do to make the unit test pass?
> > >
> >
> >
> >
> > --
> > Sean
> >
>



-- 
Sean
