HBase write problem

Hi all.

I have a problem writing to HBase.

I am using a slightly modified version of this class as a proof of concept:
https://github.com/apache/flink/blob/master/flink-batch-connectors/flink-hbase/src/test/java/org/apache/flink/addons/hbase/example/HBaseWriteExample.java

However, all the HBase-specific code is exactly the same as in the
HBaseWriteExample.

The problem I see is that the job never completes (it has been running for
more than an hour now), even though only 13 key/value pairs are to be written
to HBase :-)
I have verified that the map/reduce part works if I replace the HBase output
with a plain write to a text file - that works OK. I have also verified that
I can insert data into HBase from a similar Hadoop MapReduce job.

Here is the part of the code where I guess the problem is:

      @Override
      public Tuple2<Text, Mutation> map(Tuple2<String, Integer> t) throws Exception {
        LOG.info("Tuple2 map() called");
        reuse.f0 = new Text(t.f0);
        Put put = new Put(t.f0.getBytes());
        put.add(MasterConstants.CF_SOME, MasterConstants.COUNT, Bytes.toBytes(t.f1));
        reuse.f1 = put;
        return reuse;
      }
    }).output(new HadoopOutputFormat<Text, Mutation>(new TableOutputFormat<Text>(), job));

    env.execute("Flink HBase Event Count Hello World Test");
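For completeness, this is roughly how the job passed to output() above is configured in my code, following the HBaseWriteExample (the table name "event_type_count" is from my setup; the output directory value is a placeholder). This is just a configuration fragment - it needs a running HBase cluster, so it is not runnable on its own:

```java
// Job configuration as in HBaseWriteExample: TableOutputFormat reads the
// target table name from the Hadoop configuration.
Job job = Job.getInstance();
job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "event_type_count");
// The Hadoop output-format machinery expects an output dir to be set,
// even though TableOutputFormat does not write files there.
job.getConfiguration().set("mapred.output.dir", "/tmp/flink-hbase-test");
```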
 
This code matches the code in HBaseWriteExample.java, as far as I can tell.
 
The "Tuple2 map() called" log line appears exactly the 13 times I expect, and
the last log line I see is this:
2016-05-10 21:48:42,715 INFO  
org.apache.hadoop.hbase.mapreduce.TableOutputFormat           - Created table 
instance for event_type_count

Any suggestions to what the problem could be?

Thanks,
Palle
