[ 
https://issues.apache.org/jira/browse/HBASE-10961?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Enis Soztutar resolved HBASE-10961.
-----------------------------------

    Resolution: Not a Problem

Resolving this since it turned out to be a partially ported patch from 10591.
The tests pass now:
{code}
Running org.apache.hadoop.hbase.client.TestFromClientSide
2014-04-11 12:56:01.389 java[9240:1003] Unable to load realm info from SCDynamicStore
Tests run: 78, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 98.345 sec
Running org.apache.hadoop.hbase.client.TestFromClientSide3
2014-04-11 12:57:39.956 java[9418:1003] Unable to load realm info from SCDynamicStore
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.258 sec
Running org.apache.hadoop.hbase.client.TestFromClientSideNoCodec
2014-04-11 12:58:52.862 java[9458:1003] Unable to load realm info from SCDynamicStore
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.737 sec
Running org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor
2014-04-11 12:59:01.266 java[9477:1003] Unable to load realm info from SCDynamicStore
Tests run: 78, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 95.691 sec
{code}
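For reference, the sanity check in the quoted failure can be relaxed cluster-wide by setting hbase.table.sanity.checks in the configuration, as the error message itself suggests. A minimal hbase-site.xml sketch (test environments only; the property name is taken verbatim from the quoted exception):

```xml
<!-- hbase-site.xml: disable table sanity checks (for tests only).
     Property name comes from the DoNotRetryIOException quoted below. -->
<configuration>
  <property>
    <name>hbase.table.sanity.checks</name>
    <value>false</value>
  </property>
</configuration>
```

The error message also mentions a per-table alternative: setting the same key on the table descriptor instead of the site configuration, which limits the bypass to a single table.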

> TestFromClientSideWithCoprocessor#testGetClosestRowBefore fails due to 
> invalid block size
> -----------------------------------------------------------------------------------------
>
>                 Key: HBASE-10961
>                 URL: https://issues.apache.org/jira/browse/HBASE-10961
>             Project: HBase
>          Issue Type: Sub-task
>            Reporter: Ted Yu
>            Assignee: Ted Yu
>            Priority: Minor
>             Fix For: hbase-10070
>
>         Attachments: 10961-v1.txt
>
>
> I saw the following test failure:
> {code}
> testGetClosestRowBefore(org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor)  Time elapsed: 0.003 sec  <<< ERROR!
> org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Block size for column family info must be between 1K and 16MB Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
>         at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1828)
>         at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1747)
>         at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1873)
> {code}
> This was due to a block size of 64 being passed to the TEST_UTIL.createTable() call.



--
This message was sent by Atlassian JIRA
(v6.2#6252)