Which version of Phoenix are you trying with?
Did you build it from the latest master branch yourself and try it with
hbase-1.0.0?
I think the assignment of the SYSTEM.CATALOG table regions might have failed.
You can check the HBase master UI to see whether any regions are in transition,
and it would also be better to check HBase
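The region-in-transition check mentioned above can also be done from the command line rather than the master UI; a rough sketch for HBase of that era (output format varies by version):

```shell
# List cluster status, including any regions currently in transition:
echo "status 'detailed'" | hbase shell

# hbck reports inconsistencies such as unassigned SYSTEM.CATALOG regions:
hbase hbck -details
```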
Hi Sun, there are region-splitting related issues in HBase 0.98.6-0.98.8,
which can cause inconsistencies in the cluster if any splits fail.
So it is better to upgrade HBase to the latest version.
Thanks,
Rajeshbabu.
On Thu, Feb 26, 2015 at 8:28 AM, su...@certusnet.com.cn
Hi Naga,
I have uploaded a patch at PHOENIX-1248. Please apply the patch and try again if
possible.
Thanks,
Rajeshbabu.
On Thu, Jan 22, 2015 at 8:57 PM, Naga Vijayapuram naga_vijayapu...@gap.com
wrote:
Thanks Rajeshbabu !
On Jan 22, 2015, at 8:50 PM, Rajeshbabu Chintaguntla
chrajeshbab
Hi Siddharth,
They are using both Hindex and Phoenix depending upon customer requirements.
Thanks,
Rajeshbabu.
On Tue, Jan 6, 2015 at 7:30 PM, Siddharth Ubale siddharth.ub...@syncoms.com
wrote:
HI Rajeshbabu,
Is Huawei still using HINDEX, or are you using Phoenix as a solution to HBase
Hi Siddharth,
The index table should also be updated automatically in your case.
However, if any updates to the index table fail, we disable the index
and automatically rebuild it about 5 minutes later.
FYI: PHOENIX-1112 (https://issues.apache.org/jira/browse/PHOENIX-1112)
Can you please
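To see whether an index was disabled after a write failure, its state can be inspected in SYSTEM.CATALOG. A sketch via sqlline (the ZooKeeper host is an assumption; adjust for your cluster):

```shell
# Query index metadata; INDEX_STATE is non-null only for index tables.
sqlline.py zkhost:2181 <<'EOF'
SELECT TABLE_NAME, INDEX_STATE
FROM SYSTEM.CATALOG
WHERE INDEX_STATE IS NOT NULL;
EOF
```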
...@certusnet.com.cn]
Sent: Tuesday, November 25, 2014 7:47 AM
To: user; rajeshbabu chintaguntla
Subject: Re: Re: local index creation catch error with specifying
DEFAULT_COLUMN_FAMILY when creating data table
Hi, Rajeshbabu
Thanks for your reply. My cluster runs hbase 0.98.6-cdh5.2.0
on cols,
5. load another group of data into HBase via Spoon;
6. I can't see the new data from step 5;
can the data be read only if it is inserted through the Phoenix client?
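Writing through Phoenix ensures the row key and column encodings match the table schema, which raw HBase puts (e.g. from Spoon) may not. A minimal sketch of inserting via the Phoenix client, assuming a hypothetical table MY_TABLE and ZooKeeper host:

```shell
# UPSERT through Phoenix so values are encoded per the table's schema:
sqlline.py zkhost:2181 <<'EOF'
UPSERT INTO MY_TABLE (ID, NAME) VALUES (1, 'example');
EOF
```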
-- Original --
From: rajeshbabu chintaguntla;rajeshbabu.chintagun...@huawei.com;
Date: Thu, Oct 30, 2014 05:05 PM
bq. I am trying to load data into the Phoenix table. As Phoenix may not support
index-related data bulkload, I am trying to upsert data into Phoenix through JDBC
statements.
In the 4.1 release, CSVBulkLoadTool can be used to build indexes while loading
data. See [1].
And also some more work is
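A sketch of invoking the CSV bulk load tool mentioned above; the jar path, table name, and input file are assumptions for illustration:

```shell
# Run the MapReduce-based CSV bulk loader, which also populates index tables.
# The HBase classpath must be visible to the job.
HADOOP_CLASSPATH=$(hbase classpath) hadoop jar phoenix-4.1.0-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  --table MY_TABLE \
  --input /data/example.csv
```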