Hi Russell,
Apparently, Phoenix 4.0.0 leverages a few HBase API methods introduced in 0.98.4 which aren't present in the 0.98.1 release that ships with CDH 5.1. That's the primary cause of the build issues.
Regards
Ravi
On Mon, Aug 18, 2014 at 5:56 PM, Russell Jurney wrote:
> Talking to myself, but hopefully creating good docs.
Talking to myself, but hopefully creating good docs. Replacing the previous
hadoop version with one I found here:
https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/hadoop/hadoop-core/,
2.3.0-mr1-cdh5.1.0,
gets things a little further.
I can't get past some build errors, ho
Ok, so it is clear to me what I have to do. I have to edit my pom.xml to
point at CDH 5.1, which translates into:
Add the cloudera repo:
cloudera
https://repository.cloudera.com/artifactory/cloudera-repos/
Then change the hadoop and hbase versions:
0.98.1-cdh5.1.
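The XML tags of the repository snippet were stripped by the archive; the changes described above presumably correspond to a pom.xml fragment along these lines (a sketch: the repository id and URL are from the message, while the property names and the full CDH version strings are assumptions, not confirmed here):

```xml
<!-- Sketch of the pom.xml edits described above. The hadoop.version /
     hbase.version property names and the exact -cdh5.1.0 suffixes are
     assumptions based on the surrounding thread. -->
<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>

<properties>
  <hbase.version>0.98.1-cdh5.1.0</hbase.version>
  <hadoop.version>2.3.0-mr1-cdh5.1.0</hadoop.version>
</properties>
```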
When I try to store data into Phoenix from Pig, I get this error. I am on
CDH 5.1, and Phoenix 4.0.
Anyone know how to resolve this issue?
2014-08-18 17:11:25,165 INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader:
Current split being processed
hdfs://cluster1-srv1.
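For reference, storing into Phoenix from Pig goes through the phoenix-pig module's PhoenixHBaseStorage. A minimal sketch of such a store script (the table name, input path, and ZooKeeper host below are hypothetical placeholders, not taken from the message):

```pig
-- Sketch only: TEST, input.csv, and zk-host are placeholders.
REGISTER phoenix-4.0.0-client.jar;
data = LOAD 'input.csv' USING PigStorage(',') AS (mykey:int, mycolumn:chararray);
STORE data INTO 'hbase://TEST'
    USING org.apache.phoenix.pig.PhoenixHBaseStorage('zk-host', '-batchSize 1000');
```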
Hi all,
I'm having problems creating a join table when one of the fields involved
is a CHAR. I have a reproducible test case below:
-- Create source table
CREATE TABLE IF NOT EXISTS SOURCE_TABLE(
TID CHAR(3) NOT NULL,
A UNSIGNED_INT NOT NULL,
B UNSIGNED_INT NOT NULL
CONSTRAINT pk PRIMARY
Thanks!
On Monday, August 18, 2014, Nick Dimiduk wrote:
> Hi Russell,
>
> CDH 5.1 is HBase 0.98. You'll need Phoenix 4.x -- the 3.x line is for
> HBase 0.94 series.
>
> -n
>
>
> On Mon, Aug 18, 2014 at 11:55 AM, Russell Jurney wrote:
>
>> I added hbase-server.jar and it fixed that error. Now
Hi Russell,
CDH 5.1 is HBase 0.98. You'll need Phoenix 4.x -- the 3.x line is for HBase
0.94 series.
-n
On Mon, Aug 18, 2014 at 11:55 AM, Russell Jurney wrote:
> I added hbase-server.jar and it fixed that error. Now I get:
>
> java.lang.NoSuchMethodError:
> org.apache.hadoop.hbase.client.Dele
I added hbase-server.jar and it fixed that error. Now I get:
java.lang.NoSuchMethodError:
org.apache.hadoop.hbase.client.Delete.addDeleteMarker(Lorg/apache/hadoop/hbase/KeyValue;)Lorg/apache/hadoop/hbase/client/Delete;
at
org.apache.phoenix.hbase.index.util.KeyValueBuilder.deleteQuietly(KeyValueB
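A NoSuchMethodError at runtime means the Delete class on the cluster classpath predates the addDeleteMarker signature Phoenix was compiled against. One quick way to see which API a given classpath actually exposes is a small reflection probe (a generic sketch; MethodCheck is a made-up helper, not part of Phoenix or HBase):

```java
// Sketch: a generic reflection probe to verify which methods a class on the
// classpath actually exposes. MethodCheck is a made-up helper class.
import java.lang.reflect.Method;

public class MethodCheck {
    /** Returns true if className is on the classpath and exposes methodName. */
    public static boolean hasMethod(String className, String methodName) {
        try {
            for (Method m : Class.forName(className).getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
            return false;
        } catch (ClassNotFoundException e) {
            // Class not on the classpath at all.
            return false;
        }
    }

    public static void main(String[] args) {
        // Run with the cluster's HBase jars on the classpath to see whether
        // the method Phoenix was compiled against is actually present there.
        System.out.println(hasMethod(
                "org.apache.hadoop.hbase.client.Delete", "addDeleteMarker"));
    }
}
```

Running this against the 0.98.1 client jars versus the 0.98.4 ones would show the mismatch directly.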
I run SQLLine, configured to work with CDH 5.1, via the following command:
java -cp lib/*:target/uber-phoenix-debug-1.0-SNAPSHOT.jar:target/phoenix-debug-1.0-SNAPSHOT.jar com.hivedata.phoenix.PhoenixSqlline -u jdbc:phoenix:cluster1-srv2 -n none -p none --color=true --fastConnect=true --silen
Hi All,
I created a table with 10 versions of each column. How can I get all versions of the values in the table?
Example:
I created the table: create table test (mykey integer not null primary key, mycolumn varchar);
and inserted values like: upsert into test values (1,'
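For context, Phoenix lets HBase column-family properties such as VERSIONS be passed through the CREATE TABLE statement, so a table keeping 10 versions would be declared roughly as sketched below. Note this is an assumption about the poster's setup: Phoenix 4.0's SQL layer itself returns only the latest cell version, so older versions are typically reached via the raw HBase API or by connecting with the CurrentSCN connection property to query as of an earlier timestamp.

```sql
-- Sketch: VERSIONS is an HBase column-family property that Phoenix
-- passes through to HBase at table-creation time.
CREATE TABLE test (
    mykey INTEGER NOT NULL PRIMARY KEY,
    mycolumn VARCHAR
) VERSIONS=10;

-- Each upsert to the same row key writes a new cell version in HBase:
UPSERT INTO test VALUES (1, 'first');
UPSERT INTO test VALUES (1, 'second');
```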