Which version of HBase?

The logs that you have attached refer to a different table, right?
'deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.'
And the one you are trying to drop is 'ivytest_deu'.
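One quick way to cross-check which table names actually exist is from the hbase shell (assuming the standard shell commands for your release):

  hbase> list
  hbase> scan '.META.', {COLUMNS => 'info:regioninfo'}

And depending on your version, running bin/hbase hbck can report inconsistencies between .META. and what is actually on the filesystem.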

Regards
Ram



> -----Original Message-----
> From: 唐 颖 [mailto:ivytang0...@gmail.com]
> Sent: Tuesday, October 16, 2012 1:23 PM
> To: user@hbase.apache.org
> Subject: hbase can't drop a table
> 
> I disabled the table ivytest_deu and then tried to drop it. An error occurs:
> 
> 
> ERROR: java.io.IOException: java.io.IOException: HTableDescriptor
> missing for ivytest_deu
>       at
> org.apache.hadoop.hbase.master.handler.TableEventHandler.getTableDescri
> ptor(TableEventHandler.java:174)
>       at
> org.apache.hadoop.hbase.master.handler.DeleteTableHandler.<init>(Delete
> TableHandler.java:44)
>       at
> org.apache.hadoop.hbase.master.HMaster.deleteTable(HMaster.java:1143)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.ja
> va:39)
>       at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccesso
> rImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at
> org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEn
> gine.java:364)
>       at
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:13
> 76)
> 
> Here is some help for this command:
> Drop the named table. Table must first be disabled. If table has
> more than one region, run a major compaction on .META.:
> 
>   hbase> major_compact ".META."
> 
> The major_compact ".META." doesn't work.
> Then I tried to create the table again, but HBase complains:
> 
> ERROR: Table already exists: ivytest_deu!
> 
> After checking the region server log, I see the region server is
> repeatedly trying to open this region.
> 
> 2012-10-16 00:00:00,308 INFO
> org.apache.hadoop.hbase.regionserver.HRegionServer: Received request to
> open region:
> deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.
> 2012-10-16 00:00:00,313 WARN
> org.apache.hadoop.hbase.util.FSTableDescriptors: The following folder
> is in HBase's root directory and doesn't contain a table descriptor, do
> consider deleting it: deu_ivytest
> 2012-10-16 00:00:00,358 DEBUG
> org.apache.hadoop.hbase.regionserver.HRegion: Opening region: {NAME =>
> 'deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.',
> STARTKEY => '', ENDKEY => '', ENCODED =>
> 985d6ca9986d7d8cfaf82daf523fcd45,}
> 2012-10-16 00:00:00,358 DEBUG
> org.apache.hadoop.hbase.regionserver.HRegion: Registered protocol
> handler:
> region=deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.
> protocol=com.xingcloud.adhocprocessor.hbase.coprocessor.DEUColumnAggreg
> ationProtocol
> 2012-10-16 00:00:00,358 ERROR
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Failed
> open of
> region=deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.,
> starting to roll back the global memstore size.
> 2012-10-16 00:00:00,358 INFO
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Opening
> of region {NAME =>
> 'deu_ivytest,,1348826121781.985d6ca9986d7d8cfaf82daf523fcd45.',
> STARTKEY => '', ENDKEY => '', ENCODED =>
> 985d6ca9986d7d8cfaf82daf523fcd45,} failed, marking as FAILED_OPEN in ZK
> 
> And we have an endpoint (coprocessor) in HBase. After HBase tried to
> open this region about 90,000 times, the endpoint class had also been
> loaded 90,000 times, and the JVM heap filled up.
> The gcutil output shows:
> S0C     S1C     S0U      S1U   EC       EU       OC         OU        PC      PU      YGC    YGCT     FGC   FGCT      GCT
> 34880.0 34880.0 34648.1  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24598 28469.996 31918.187
> 34880.0 34880.0 34648.1  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24598 28469.996 31918.187
> 34880.0 34880.0 34648.1  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24598 28469.996 31918.187
> 34880.0 34880.0 34648.1  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24598 28469.996 31918.187
> 34880.0 34880.0 34880.0  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24600 28481.974 31930.165
> 34880.0 34880.0 34880.0  0.0   209472.0 209472.0 2792768.0  2792768.0 71072.0 41461.5 129770 3448.191 24600 28481.974 31930.165
> 
> The jmap dump file shows:
> 
> 3982039 instances of class org.apache.hadoop.hbase.KeyValue
> 191050 instances of class org.apache.hadoop.fs.Path
> 187364 instances of class
> org.cliffc.high_scale_lib.ConcurrentAutoTable$CAT
> 187301 instances of class org.cliffc.high_scale_lib.Counter
> 102272 instances of class net.sf.ehcache.concurrent.ReadWriteLockSync
> 93652 instances of class org.apache.hadoop.hbase.HRegionInfo
> 93650 instances of class
> com.google.common.collect.MutableClassToInstanceMap
> 93650 instances of class DEUColumnAggregationEndpoint
> 
> DEUColumnAggregationEndpoint is our endpoint class.
> 
> We guess that checking this table and loading the endpoint class
> 90,000 times is what caused this memory leak.
> 
> But how to drop this table?
> 

