[ 
https://issues.apache.org/jira/browse/PHOENIX-395?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15290902#comment-15290902
 ] 

Ian Hellstrom commented on PHOENIX-395:
---------------------------------------

It's not a real fix, but if the table keeps causing problems you can always 
disable and drop it from the HBase shell. That worked for me on 4.4 when I hit 
the same issue.
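The workaround above can be sketched as a short HBase shell session. This is an illustration only: it bypasses Phoenix entirely, uses the table name M_INTERFACE_JOB from the report below, and assumes the `hbase` binary is on the PATH of a node that can reach the cluster.

```shell
# Hypothetical session: drop the stuck table directly in HBase,
# bypassing the failing Phoenix DROP TABLE code path.
# An HBase table must be disabled before it can be dropped.
hbase shell <<'EOF'
disable 'M_INTERFACE_JOB'
drop 'M_INTERFACE_JOB'
exit
EOF
```

Note that this removes only the HBase table; stale Phoenix metadata for it may still need to be cleaned up separately.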

> Create and drop table throw exception: ArrayIndexOutOfBoundsException & 
> DoNotRetryIOException
> ---------------------------------------------------------------------------------------------
>
>                 Key: PHOENIX-395
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-395
>             Project: Phoenix
>          Issue Type: Task
>            Reporter: Yanhui Ma
>
> 0: jdbc:phoenix:****> drop table M_INTERFACE_JOB
> . . . . . . . . . . . . . . . . . . . . . . .> ;
> jdbc PhoenixPreparedStatement 728 execute drop table M_INTERFACE_JOB
> Error: org.apache.hadoop.hbase.DoNotRetryIOException: M_INTERFACE_JOB: 34
>       at 
> com.salesforce.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:83)
>       at 
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:429)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:5476)
>       at 
> org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3719)
>       at sun.reflect.GeneratedMethodAccessor41.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at 
> org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
>       at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1426)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 34
>       at com.salesforce.phoenix.schema.PTableImpl.init(PTableImpl.java:126)
>       at com.salesforce.phoenix.schema.PTableImpl.<init>(PTableImpl.java:96)
>       at 
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:261)
>       at 
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:82)
>       at 
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:385)
>       ... 11 more (state=08000,code=101)
> When I create and then drop this table, the exception above is thrown. My 
> CREATE TABLE statement is:
> CREATE TABLE m_interface_job( 
> data.addtime VARCHAR  ,
> data.dir VARCHAR  ,
> data.end_time VARCHAR  ,
> data.file VARCHAR  ,
> data.fk_log VARCHAR  ,
> data.host VARCHAR  ,
> data.msg VARCHAR  ,
> data.row VARCHAR  ,
> data.size VARCHAR  ,
> data.start_time VARCHAR  ,
> data.stat_date DATE  ,
> data.stat_hour VARCHAR  ,
> data.stat_minute VARCHAR  ,
> data.state VARCHAR  ,
> data.title VARCHAR  ,
> data.user VARCHAR  ,
> data.host VARCHAR  ,
> data.inrow VARCHAR  ,
> data.jobid VARCHAR  ,
> data.jobtype VARCHAR  ,
> data.level VARCHAR  ,
> data.msg VARCHAR  ,
> data.outrow VARCHAR  ,
> data.pass_time VARCHAR  ,
> data.type VARCHAR  ,
> id INTEGER not null primary key desc
> ) 
> What is the problem?
> And now I cannot drop this table! Can you tell me how to fix this?
> My env:
> phoenix-1.2.1
> hbase-0.94.6.1
> hadoop-1.0.4
> zookeeper-3.4.3
> Many thanks!



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
