Hi James,
Due to a typo, we forgot to put the CF name as a prefix to the CQ name in 1 of the 1,100
columns of that table. That led to the creation of a CF named "0". After
fixing the typo, we only have 2 CFs.
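For context, a minimal sketch of how an unprefixed column produces the "0" family (table and column names here are made up, not the actual schema): in Phoenix, any column declared without an explicit column-family prefix is placed in the default column family, which is named "0".

```sql
-- Hypothetical table; names are illustrative only.
CREATE TABLE example_table (
    pk VARCHAR PRIMARY KEY,
    "CF1"."col_a" VARCHAR,   -- explicitly placed in column family CF1
    "col_b" VARCHAR          -- no CF prefix: lands in the default family "0"
);
```

Dropping the prefix on even a single column is enough to materialize the extra "0" family in the underlying HBase table.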
Thanks,
Anil Gupta
On Thu, Feb 18, 2016 at 11:20 AM, James Taylor wrote:
> Hi Anil,
When trying to run update stats on an existing table in HBase, I get an error:
Update stats:
UPDATE STATISTICS "ops_csv" ALL
error:
ERROR 504 (42703): Undefined column. columnName=REGION_NAME
Looks like the metadata information is messed up, i.e. there is no column named
REGION_NAME in this
Hi Vamsi,
I can't answer your question about the Phoenix-Spark plugin (although
I'm sure that someone else here can).
However, I can tell you that the CsvBulkLoadTool does not write to the
WAL or to the memstore. It simply writes HFiles and then hands those
HFiles over to HBase, so the memstore and WAL are bypassed entirely.
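For reference, a typical CsvBulkLoadTool invocation looks roughly like this (the client jar name, table name, and input path below are placeholders, not values from this thread):

```shell
# Sketch only: adjust the Phoenix client jar, table name, and HDFS input path
# to match your installation.
hadoop jar phoenix-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table EXAMPLE_TABLE \
    --input /tmp/example.csv
```

Because the tool generates HFiles in a MapReduce job and then bulk-loads them directly, the data never passes through the RegionServer write path (WAL/memstore).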
Team,
Does Phoenix's CsvBulkLoadTool write to the HBase WAL/memstore?
Phoenix-Spark plugin:
Does the saveToPhoenix method on RDD[Tuple] write to the HBase WAL/memstore?
Thanks,
Vamsi Attluri