Yes, in that simple example changing it to VARCHAR or INTEGER works fine, but my
main objective is to use Pig to read a binary Avro file and store it in an
HBase table managed by Phoenix.
Are you a Phoenix developer? Could you create an issue on JIRA for this?
Daniel Rodriguez
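A minimal sketch of the pipeline described above, assuming a hypothetical input
file events.avro, a hypothetical target table named BINARY, and the Phoenix Pig
StoreFunc org.apache.phoenix.pig.PhoenixHBaseStorage from the phoenix-pig module
(jar paths and field names below are placeholders, not taken from the original
messages):

REGISTER /path/to/phoenix-client.jar;    -- assumed location of the Phoenix client jar
REGISTER /path/to/piggybank.jar;         -- needed for AvroStorage in older Pig versions
raw = LOAD 'events.avro'
      USING org.apache.pig.piggybank.storage.avro.AvroStorage();
-- id is a long; payload is the binary Avro field and arrives in Pig as a bytearray
rows = FOREACH raw GENERATE id AS ID, payload AS PAYLOAD;
STORE rows INTO 'hbase://BINARY'
      USING org.apache.phoenix.pig.PhoenixHBaseStorage('zk-host', '-batchSize 1000');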
On
You can try using the HBase shell to manually add the "_0" column family to
your destination HBase table. Phoenix 4.0 from Apache can't work on
HBase 0.96. You can check the discussion in
https://issues.apache.org/jira/browse/PHOENIX-848 to see whether your HBase
version works with Phoenix 4.0.
Thanks,
-Jeffrey
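A minimal sketch of that manual step in the HBase shell, assuming the
destination table is named BINARY (the table name is only a placeholder):

hbase> disable 'BINARY'
hbase> alter 'BINARY', {NAME => '_0'}
hbase> enable 'BINARY'
hbase> describe 'BINARY'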
On
This seems like a bug to me. Could you try changing your binary column
type from VARBINARY to VARCHAR to work around the issue?
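A minimal sketch of that workaround, using placeholder table and column names
rather than the original schema:

CREATE TABLE IF NOT EXISTS binary_test (id BIGINT NOT NULL, data VARCHAR CONSTRAINT my_pk PRIMARY KEY (id))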
From: Daniel Rodriguez
Reply-To:
Date: Tuesday, June 24, 2014 8:49 AM
To:
Subject: Pig StoreFunc using VARBINARY
Hello,
I was able to successfully insert "
Sorry I missed this; thanks for the response. Yeah, I worked around it for
now. Sometimes these SQL access layers set things that make sense but are
out of your control. That seems like a good nuclear option, but maybe if
no one else cares I'll just stick with what I have. BTW, I was using this
li
Hello,
I was able to successfully insert "basic" data types (int and varchar)
using the Pig StoreFunc, but I have not been able to insert a Pig bytearray
into a Phoenix VARBINARY column.
Example:
CREATE TABLE IF NOT EXISTS binary (id BIGINT NOT NULL, binary VARBINARY
CONSTRAINT my_pk PRIMARY KEY (id))
Hi
We're currently running Phoenix 2.2 on HBase 0.94 (CDH 4.4) and are slowly
preparing to move to Phoenix 4 and HBase 0.96 (CDH 5).
For my first tests I wanted to simply copy data from 0.94 to 0.96,
which works fine for regular HBase tables using the following commands:
$ hbase org.apache.hadoop.hbase.m
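The exact command is cut off above; one common way to copy a plain HBase table
between clusters, shown here only as an assumption with placeholder table and
path names, is the Export and Import MapReduce jobs plus distcp:

$ hbase org.apache.hadoop.hbase.mapreduce.Export MY_TABLE /tmp/MY_TABLE_export    # on the source (0.94) cluster
$ hadoop distcp hdfs://old-cluster/tmp/MY_TABLE_export hdfs://new-cluster/tmp/MY_TABLE_export
$ hbase org.apache.hadoop.hbase.mapreduce.Import MY_TABLE /tmp/MY_TABLE_export    # on the target (0.96) cluster; the table must already exist there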