Very strange.

Are you able to use the shell?  $HBASE_HOME/bin/hbase shell

Type 'help' to see the options.  To scan your table, type:  scan 'tableName'
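
If the shell comes up, a quick sanity check would be something like the
following ('tableName' is whatever your table is called; 'count' also walks
the table with a scanner, so it is a cheap way to see whether a full scan
gets through from the shell at all):

  hbase(main):001:0> list
  hbase(main):002:0> scan 'tableName'
  hbase(main):003:0> count 'tableName'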

Zheng Lv wrote:
Hello J.G,
    Thank you for your reply.
    My HBase version is the newest one: 0.20.0.

    I have two tables, each with 10000 cells: one has 100 rows with 100
columns per row, the other has 10000 rows with 1 column per row. I wanted to
test the speed of scanning all the cells in the two tables. When I scanned
them, the same exceptions as in my last mail were thrown for both, at
"scanner = table.getScanner(s);". The complete code follows:

import java.util.Iterator;
import java.util.NavigableMap;
import java.util.Set;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class HbaseScannerTest {

  // private final String TableNameA = "ttest1";
  // private final String TableNameB = "ttest2";
  private final String ColumnFamily = "cf_test";
  private final String Column = "c_test";

  private void scan(String tableName) {
    HBaseConfiguration config = new HBaseConfiguration();
    HTable table = null;
    ResultScanner scanner = null;
    try {
      table = new HTable(config, tableName);

      // Full-table scan restricted to the one column family.
      Scan s = new Scan();
      s = s.addFamily(Bytes.toBytes(ColumnFamily));
      scanner = table.getScanner(s); // <-- the exceptions are thrown here

      // Walk every row and print each qualifier/value pair in the family.
      for (Result rr = scanner.next(); rr != null; rr = scanner.next()) {
        NavigableMap<byte[], byte[]> map =
            rr.getFamilyMap(Bytes.toBytes(ColumnFamily));
        Set<byte[]> set = map.keySet();
        Iterator<byte[]> iter = set.iterator();
        while (iter.hasNext()) {
          byte[] key = iter.next();
          String strKey = Bytes.toString(key);
          String strValue = Bytes.toString(map.get(key));
          System.out.println("key:" + strKey + ", value:" + strValue);
        }
      }
    } catch (Exception e) {
      e.printStackTrace();
    } finally {
      // Always release the scanner and the table, even on failure.
      try {
        if (scanner != null) {
          scanner.close();
        }
        if (table != null) {
          table.close();
        }
      } catch (Exception e) {
        e.printStackTrace();
      }
    }
  }

  public static void main(String[] args) {
    if (args == null || args.length == 0) {
      System.out.println("param error.");
      return;
    } else {
      HbaseScannerTest scannerTest = new HbaseScannerTest();
      long t1 = System.currentTimeMillis();
      scannerTest.scan(args[0]);
      long t2 = System.currentTimeMillis();
      System.out.println("*********scan " + args[0] + " need : " + (t2 - t1));
    }
  }
}
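
    (A note on the timing: the code above leaves the scanner at the client
default of fetching one row per next() call. If caching matters for the
comparison, the only change would be in the Scan setup, roughly like the
sketch below; 100 is just an example value, assuming Scan.setCaching is
available in the 0.20 client:)

      Scan s = new Scan();
      s = s.addFamily(Bytes.toBytes(ColumnFamily));
      s.setCaching(100); // fetch up to 100 rows per scanner RPC instead of 1
      scanner = table.getScanner(s);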


    When the exceptions were thrown, the master log was normal, and the logs
on the other region servers were normal too, but on /192.168.33.6:60020
(sometimes another server), the one named in the exceptions, the log was
abnormal. I think I already put all the relevant log lines in my last mail,
because the remaining entries look unrelated. Here is a more complete
excerpt of that log:

2009-09-18 11:08:37,050 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_CLOSE:
TestTable,0002215349,1251708925379: Overloaded
2009-09-18 11:08:37,051 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_CLOSE:
TestTable,0001302275,1251708861978: Overloaded
2009-09-18 11:08:37,051 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_CLOSE:
TestTable,0000884288,1251705503037: Overloaded
2009-09-18 11:08:37,051 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_CLOSE:
TestTable,0000065336,1251705454231: Overloaded
2009-09-18 11:08:37,051 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_CLOSE:
TestTable,0001062941,1251708879165: Overloaded
2009-09-18 11:08:37,051 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker:
MSG_REGION_CLOSE: TestTable,0002215349,1251708925379: Overloaded
2009-09-18 11:08:37,052 INFO org.apache.hadoop.hbase.regionserver.HRegion:
Closed TestTable,0002215349,1251708925379
2009-09-18 11:08:37,052 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker:
MSG_REGION_CLOSE: TestTable,0001302275,1251708861978: Overloaded
2009-09-18 11:08:37,052 INFO org.apache.hadoop.hbase.regionserver.HRegion:
Closed TestTable,0001302275,1251708861978
2009-09-18 11:08:37,053 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker:
MSG_REGION_CLOSE: TestTable,0000884288,1251705503037: Overloaded
2009-09-18 11:08:37,053 INFO org.apache.hadoop.hbase.regionserver.HRegion:
Closed TestTable,0000884288,1251705503037
2009-09-18 11:08:37,053 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker:
MSG_REGION_CLOSE: TestTable,0000065336,1251705454231: Overloaded
2009-09-18 11:08:37,053 INFO org.apache.hadoop.hbase.regionserver.HRegion:
Closed TestTable,0000065336,1251705454231
2009-09-18 11:08:37,053 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker:
MSG_REGION_CLOSE: TestTable,0001062941,1251708879165: Overloaded
2009-09-18 11:08:37,054 INFO org.apache.hadoop.hbase.regionserver.HRegion:
Closed TestTable,0001062941,1251708879165
2009-09-18 11:08:49,109 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_OPEN:
TestTable,0000952473,1251709074516
2009-09-18 11:08:49,109 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_OPEN:
TestTable,0000595460,1251708629215
2009-09-18 11:08:49,109 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_OPEN:
webpage,http:\x2F\x2Fnews.163.com
\x2F09\x2F0803\x2F01\x2F5FOO155J0001124J.html1251254047232_16208,1251254433708
2009-09-18 11:08:49,109 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: MSG_REGION_OPEN:
TestTable,0002805208,1251709024537
2009-09-18 11:08:49,109 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker: MSG_REGION_OPEN:
TestTable,0000952473,1251709074516
2009-09-18 11:08:49,148 INFO org.apache.hadoop.hbase.regionserver.HRegion:
region TestTable,0000952473,1251709074516/1184073320 available; sequence id
is 5922826
2009-09-18 11:08:49,148 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker: MSG_REGION_OPEN:
TestTable,0000595460,1251708629215
2009-09-18 11:08:49,191 INFO org.apache.hadoop.hbase.regionserver.HRegion:
region TestTable,0000595460,1251708629215/1982730697 available; sequence id
is 5937085
2009-09-18 11:08:49,191 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker: MSG_REGION_OPEN:
webpage,http:\x2F\x2Fnews.163.com
\x2F09\x2F0803\x2F01\x2F5FOO155J0001124J.html1251254047232_16208,1251254433708
2009-09-18 11:08:49,285 INFO org.apache.hadoop.hbase.regionserver.HRegion:
region 
webpage,http:\x2F\x2Fnews.163.com\x2F09\x2F0803\x2F01\x2F5FOO155J0001124J.html1251254047232_16208,1251254433708/174964099
available; sequence id is 173787
2009-09-18 11:08:49,285 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: Worker: MSG_REGION_OPEN:
TestTable,0002805208,1251709024537
2009-09-18 11:08:49,355 INFO org.apache.hadoop.hbase.regionserver.HRegion:
region TestTable,0002805208,1251709024537/1263036978 available; sequence id
is 5922825
2009-09-18 11:18:04,909 INFO
org.apache.hadoop.hbase.regionserver.HRegionServer: compactions no longer
limited
2009-09-18 11:32:03,523 ERROR
org.apache.hadoop.hbase.regionserver.HRegionServer:

org.apache.hadoop.hbase.UnknownScannerException: Name: -1
        at
org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1905)
        at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:650)
        at
org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:913)

2009-09-18 11:32:03,529 INFO org.apache.hadoop.ipc.HBaseServer: IPC Server
handler 1 on 60020, call next(-1, 1) from 192.168.33.7:43810: error:
org.apache.hadoop.hbase.UnknownScannerException: Name: -1

org.apache.hadoop.hbase.UnknownScannerException: Name: -1
        at
org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1905)
        at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:650)
        at
org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:913)

    Any suggestions?
LvZheng
