There are several 0.92 releases; can you be more specific?

Thanks

On Tue, Feb 5, 2013 at 10:46 AM, Bing Li <lbl...@gmail.com> wrote:

> Dear Ted,
>
> My HBase is 0.92.
>
> Thanks!
> Bing
>
> On Wed, Feb 6, 2013 at 2:45 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> > To help us more easily correlate line numbers, can you tell us the
> > version of HBase you're using?
> >
> > Thanks
> >
> > On Tue, Feb 5, 2013 at 10:39 AM, Bing Li <lbl...@gmail.com> wrote:
> >
> >> Dear all,
> >>
> >> To improve write performance, I removed "synchronized" from the
> >> method that writes data into HBase.
> >>
> >> But after "synchronized" was removed from the writing method, I get
> >> the following exceptions when reading. Before the removal, there were
> >> no such exceptions.
> >>
> >> Could you help me solve this?
> >>
> >> Thanks so much!
> >>
> >> Best wishes,
> >> Bing
> >>
> >> Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.ipc.HBaseClient$Connection run
> >> WARNING: Unexpected exception receiving call responses
> >> java.lang.NullPointerException
> >>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
> >>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
> >> Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.client.ScannerCallable close
> >> WARNING: Ignore, probably already closed
> >> java.io.IOException: Call to greatfreeweb/127.0.1.1:60020 failed on local exception: java.io.IOException: Unexpected exception receiving call responses
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:934)
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:903)
> >>     at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
> >>     at $Proxy6.close(Unknown Source)
> >>     at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:112)
> >>     at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:74)
> >>     at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:39)
> >>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1325)
> >>     at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
> >>     at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
> >>     at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
> >>     at com.greatfree.hbase.rank.NodeRankRetriever.LoadNodeGroupNodeRankRowKeys(NodeRankRetriever.java:348)
> >>     at com.greatfree.ranking.PersistNodeGroupNodeRanksThread.run(PersistNodeGroupNodeRanksThread.java:29)
> >>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> >>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> >>     at java.lang.Thread.run(Thread.java:662)
> >> Caused by: java.io.IOException: Unexpected exception receiving call responses
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:509)
> >> Caused by: java.lang.NullPointerException
> >>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
> >>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
> >>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
> >>
> >> The writing method is as follows.
> >>
> >>
> >> // The "synchronized" is removed to raise the performance.
> >> // public synchronized void AddNodeViewGroupNodeRanks(String hostNodeKey,
> >> //     String groupKey, int timingScale, Map<String, Double> groupNodeRankMap)
> >>
> >> public void AddNodeViewGroupNodeRanks(String hostNodeKey, String groupKey,
> >>     int timingScale, Map<String, Double> groupNodeRankMap)
> >> {
> >>     List<Put> puts = new ArrayList<Put>();
> >>     Put hostNodeKeyPut;
> >>     Put groupKeyPut;
> >>     Put timingScalePut;
> >>     Put nodeKeyPut;
> >>     Put rankPut;
> >>
> >>     byte[] groupNodeRankRowKey;
> >>
> >>     for (Map.Entry<String, Double> nodeRankEntry : groupNodeRankMap.entrySet())
> >>     {
> >>         groupNodeRankRowKey = Bytes.toBytes(...);
> >>
> >>         hostNodeKeyPut = new Put(groupNodeRankRowKey);
> >>         hostNodeKeyPut.add(...);
> >>         puts.add(hostNodeKeyPut);
> >>
> >>         ......
> >>
> >>         rankPut = new Put(groupNodeRankRowKey);
> >>         rankPut.add(...);
> >>         puts.add(rankPut);
> >>     }
> >>
> >>     try
> >>     {
> >>         this.rankTable.put(puts);
> >>     }
> >>     catch (IOException e)
> >>     {
> >>         e.printStackTrace();
> >>     }
> >> }
> >>
> >>
> >> The reading method that causes the exceptions is as follows.
> >>
> >> public Set<String> LoadNodeGroupNodeRankRowKeys(String hostNodeKey,
> >>     String groupKey, int timingScale)
> >> {
> >>     List<Filter> nodeGroupFilterList = new ArrayList<Filter>();
> >>
> >>     SingleColumnValueFilter hostNodeKeyFilter = new SingleColumnValueFilter(...);
> >>     hostNodeKeyFilter.setFilterIfMissing(true);
> >>     nodeGroupFilterList.add(hostNodeKeyFilter);
> >>
> >>     ......
> >>
> >>     FilterList nodeGroupFilter = new FilterList(nodeGroupFilterList);
> >>     Scan scan = new Scan();
> >>     scan.setFilter(nodeGroupFilter);
> >>     scan.setCaching(Parameters.CACHING_SIZE);
> >>     scan.setBatch(Parameters.BATCHING_SIZE);
> >>
> >>     Set<String> rowKeySet = Sets.newHashSet();
> >>     ResultScanner scanner = null;
> >>     try
> >>     {
> >>         scanner = this.rankTable.getScanner(scan);
> >>
> >>         // EXCEPTIONS are raised at the following line.
> >>         for (Result result : scanner)
> >>         {
> >>             for (KeyValue kv : result.raw())
> >>             {
> >>                 rowKeySet.add(Bytes.toString(kv.getRow()));
> >>                 break;
> >>             }
> >>         }
> >>     }
> >>     catch (IOException e)
> >>     {
> >>         e.printStackTrace();
> >>     }
> >>     finally
> >>     {
> >>         // Close the scanner even when the scan fails.
> >>         if (scanner != null)
> >>         {
> >>             scanner.close();
> >>         }
> >>     }
> >>     return rowKeySet;
> >> }
> >>
>
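A likely cause of the NPE above: in HBase 0.92, an HTable instance is not safe for concurrent use, so once "synchronized" is removed, multiple threads sharing the same `this.rankTable` can interleave on the client's RPC connection and produce exactly this kind of failure in `receiveResponse`. The usual fixes are HTablePool or one HTable per thread (e.g. via ThreadLocal). Below is a minimal sketch of the per-thread pattern; `FakeTable` is a stand-in class, since a live cluster is not available here. In real code, `initialValue()` would return something like `new HTable(conf, tableName)` (both names hypothetical for this snippet).

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class PerThreadTable {
    // Stand-in for org.apache.hadoop.hbase.client.HTable, which is not
    // thread-safe in 0.92. A real initialValue() would construct
    // new HTable(conf, tableName) here (hypothetical names).
    static class FakeTable { }

    // Each thread lazily gets its own instance, so no "synchronized"
    // is needed around the table operations.
    private static final ThreadLocal<FakeTable> TABLE =
        new ThreadLocal<FakeTable>() {
            @Override protected FakeTable initialValue() {
                return new FakeTable();
            }
        };

    // Starts n threads, each asking for "its" table; counts how many
    // distinct instances were handed out. With ThreadLocal, every
    // thread sees its own instance, so the count is n.
    public static int distinctInstances(int n) throws InterruptedException {
        final Set<FakeTable> seen = ConcurrentHashMap.newKeySet();
        Thread[] workers = new Thread[n];
        for (int i = 0; i < n; i++) {
            workers[i] = new Thread(new Runnable() {
                public void run() {
                    seen.add(TABLE.get());
                }
            });
            workers[i].start();
        }
        for (Thread t : workers) {
            t.join();
        }
        return seen.size();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(distinctInstances(4)); // prints 4
    }
}
```

The alternative in 0.92 is `HTablePool` (`getTable(...)` per operation, returning the table when done), which amortizes table construction across threads without sharing a single instance.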
