[ https://issues.apache.org/jira/browse/PHOENIX-1551?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14272193#comment-14272193 ]

James Taylor commented on PHOENIX-1551:
---------------------------------------

It looks like this is caused by HTablePool wrapping its exception in a 
RuntimeException, since pool.getTable(tableName) isn't declared to throw an 
IOException. Can you unwrap it here in ServerUtil.getTableFromSingletonPool(), 
declaring the method to throw an IOException instead, so that we're consistent 
with the other code path for newer versions of HBase 
(ServerUtil.getHTableForCoprocessorScan()) and don't need to propagate this 
unwrapping elsewhere?
{code}
    private static HTableInterface getTableFromSingletonPool(RegionCoprocessorEnvironment env, byte[] tableName) {
        // It's ok to not ever do a pool.close() as we're storing a single
        // table only. The HTablePool holds no other resources than this table,
        // which will be closed itself when it's no longer needed.
        @SuppressWarnings("resource")
        HTablePool pool = new HTablePool(env.getConfiguration(), 1);
        return pool.getTable(tableName);
    }
{code}
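
A minimal sketch of the unwrapping being asked for (the actual patch may differ; this just declares the method to throw IOException and rethrows the wrapped cause):
{code}
    // Sketch only: unwrap the RuntimeException thrown by HTablePool.getTable()
    // so the underlying IOException (e.g. TableNotFoundException) is surfaced directly.
    private static HTableInterface getTableFromSingletonPool(RegionCoprocessorEnvironment env, byte[] tableName) throws IOException {
        // It's ok to not ever do a pool.close() as we're storing a single
        // table only. The HTablePool holds no other resources than this table,
        // which will be closed itself when it's no longer needed.
        @SuppressWarnings("resource")
        HTablePool pool = new HTablePool(env.getConfiguration(), 1);
        try {
            return pool.getTable(tableName);
        } catch (RuntimeException e) {
            // HTableFactory wraps the IOException from the HTable constructor in a
            // RuntimeException; rethrow the real cause as an IOException.
            Throwable cause = e.getCause();
            if (cause instanceof IOException) {
                throw (IOException) cause;
            }
            throw new IOException(e);
        }
    }
{code}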

> Upgrading from Phoenix 4.0.x to 4.2.2 and throw can't find SYSTEM.STATS
> -----------------------------------------------------------------------
>
>                 Key: PHOENIX-1551
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-1551
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.2.2
>         Environment: hbase 0.96.1.1 cdh5.0.2
>            Reporter: Feng
>            Assignee: Jeffrey Zhong
>            Priority: Blocker
>              Labels: Upgrading
>             Fix For: 4.2.2
>
>         Attachments: PHOENIX-1551.patch
>
>
> I downloaded phoenix-4.2.2-bin.tar.gz, put phoenix-4.2.2-server.jar into the 
> HBase lib directory, removed any phoenix-4.0.jar, and restarted HBase.
> When I used bin/sqlline.py to connect, Phoenix threw:
> Error: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.STATS
>       at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>       at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:794)
>       at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:7763)
>       at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:5296)
>       at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3205)
>       at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:26933)
>       at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2175)
>       at org.apache.hadoop.hbase.ipc.RpcServer$Handler.run(RpcServer.java:1879)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.STATS
>       at org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:41)
>       at org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:271)
>       at org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:201)
>       at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:180)
>       at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:219)
>       at org.apache.phoenix.util.ServerUtil.getTableFromSingletonPool(ServerUtil.java:151)
>       at org.apache.phoenix.util.ServerUtil.getHTableForCoprocessorScan(ServerUtil.java:165)
>       at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:609)
>       at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:359)
>       at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:665)
>       at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:743)
>       ... 6 more
> Caused by: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.STATS
>       at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1181)
>       at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1054)
>       at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1011)
>       at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:326)
>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:192)
>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:165)
>       at org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:39)
>       ... 16 more (state=08000,code=101)



