Re: How to map sparse hbase table with dynamic columns into Phoenix

2016-12-11 Thread Arvind S
Note: since your column names are all lowercase, remember to qualify the table name,
column family, and column names in double quotes ("").
The example below should help with the rest.  I have used capital
letters for table and column names to keep it simple ;)
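
For instance, if the underlying HBase table and columns had been created in lowercase
(hypothetical names "tester", "f1", "c1"), the Phoenix DDL would need quoting roughly
like this (a minimal sketch, separate from the example below):

CREATE TABLE IF NOT EXISTS "tester" (
  "row" varchar PRIMARY KEY,
  "f1"."c1" varchar
);

Unquoted identifiers are upper-cased by Phoenix, so without the quotes it would look
for TESTER/F1/C1 in HBase and not find your data.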
=

*in hbase* ...

create 'TESTER', {NAME=>'F1'}, {NAME=>'F2'}
put 'TESTER', '1','F1:C1', 'duku'
put 'TESTER', '1','F1:C2', 'count duku'
put 'TESTER', '1','F2:COL1', 'yoda'
put 'TESTER', '1','F2:COL2', 'master yoda'

*in Phoenix* ...
CREATE TABLE IF NOT EXISTS TESTER (
  ROW varchar PRIMARY KEY,
  F1.C1 varchar,
  F1.C2 varchar,
  F2.COL1 varchar,
  F2.COL2 varchar
  );
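
With the puts above in place, a quick sanity check from sqlline:

SELECT * FROM TESTER;

should return a single row, roughly: ROW = '1', C1 = 'duku', C2 = 'count duku',
COL1 = 'yoda', COL2 = 'master yoda'.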


Now, to get the dynamic-column view:
*in hbase* ... add new columns

put 'TESTER', '1','F1:C3', 'skywalker'
put 'TESTER', '1','F2:COL3', 'luke skywalker'


*in Phoenix* ... create the dynamic view

CREATE VIEW TESTER_VIEW(
  F1.C3 varchar,
  F2.COL3 varchar) AS
SELECT * FROM TESTER;
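
Querying the view now returns the new columns alongside the old ones:

SELECT * FROM TESTER_VIEW;

If you would rather not maintain a view for every new column, Phoenix also supports
declaring dynamic columns at query time; a sketch (the family-qualified form is worth
verifying against your Phoenix version):

SELECT * FROM TESTER (F1.C3 varchar, F2.COL3 varchar);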




On 2016-12-10 05:28 (+0530), "Sethuramaswamy, Suresh CWR" <s...@credit-suisse.com> wrote:
> All,
>
> We have a sparse HBase table with 3 column families and a variable set of column
> qualifiers in each row. Can someone help me create a Phoenix view to map this
> HBase table into Phoenix?
>
> Sample of the HBase table:
>
> Row1:   Key   cf1.name          cf1.id               cf2.age   cf3.salary
> Row2:   Key   cf1.id            cf1.dept                       cf3.salary
> Row3:   Key   cf1.client_name   cf1.client_address             cf3.Start_date
>
> Suresh Sethuramaswamy
> CREDIT SUISSE
> Information Technology | Client Intelligence, KFLI 5
> One Madison Avenue | 10010 New York | Americas
> Phone 1 212 325 1060
> suresh.sethuramasw...@credit-suisse.com | www.credit-suisse.com


Re: phoenix upsert select query fails with : java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException

2016-12-11 Thread Josh Elser

What's the rest of the stacktrace? You cut off the cause.

venkata subbarayudu wrote:


I faced a strange issue: a Phoenix HBase upsert query fails with an
ArrayIndexOutOfBoundsException.

Query looks like:
   upsert into table (pk, col1, col2, col3) select a.pk, a.col1, a.col2, a.col3
   from table a inner join table b on a.pk = b.pk where b.col2 = 'value'

We use hbase-1.1.2
with phoenix-4.8.1.

Below is the stack trace of the error
---

[main] org.apache.phoenix.iterate.BaseResultIterators: Failed to execute task 
during cancel
java.util.concurrent.ExecutionException: 
java.lang.ArrayIndexOutOfBoundsException
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at 
org.apache.phoenix.iterate.BaseResultIterators.close(BaseResultIterators.java:882)
at 
org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:819)
at 
org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:721)
at 
org.apache.phoenix.iterate.RoundRobinResultIterator.getIterators(RoundRobinResultIterator.java:176)
at 
org.apache.phoenix.iterate.RoundRobinResultIterator.next(RoundRobinResultIterator.java:91)
at 
org.apache.phoenix.iterate.DelegateResultIterator.next(DelegateResultIterator.java:44)
at 
org.apache.phoenix.compile.UpsertCompiler$2.execute(UpsertCompiler.java:815)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:344)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:332)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:331)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1423)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.ArrayIndexOutOfBoundsException

But on retrying, the same query succeeded. I'm trying to understand the
root cause of the problem and how to resolve it.

Any help is much appreciated.

--
*Venkata Subbarayudu Amanchi.*


phoenix upsert select query fails with : java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException

2016-12-11 Thread venkata subbarayudu
I faced a strange issue: a Phoenix HBase upsert query fails with an
ArrayIndexOutOfBoundsException.

Query looks like:
  upsert into table (pk, col1, col2, col3) select a.pk, a.col1, a.col2, a.col3
  from table a inner join table b on a.pk = b.pk where b.col2 = 'value'

We use hbase-1.1.2
with phoenix-4.8.1.

Below is the stack trace of the error
---

[main] org.apache.phoenix.iterate.BaseResultIterators: Failed to
execute task during cancel
java.util.concurrent.ExecutionException:
java.lang.ArrayIndexOutOfBoundsException
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at 
org.apache.phoenix.iterate.BaseResultIterators.close(BaseResultIterators.java:882)
at 
org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:819)
at 
org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:721)
at 
org.apache.phoenix.iterate.RoundRobinResultIterator.getIterators(RoundRobinResultIterator.java:176)
at 
org.apache.phoenix.iterate.RoundRobinResultIterator.next(RoundRobinResultIterator.java:91)
at 
org.apache.phoenix.iterate.DelegateResultIterator.next(DelegateResultIterator.java:44)
at 
org.apache.phoenix.compile.UpsertCompiler$2.execute(UpsertCompiler.java:815)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:344)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:332)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:331)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1423)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.ArrayIndexOutOfBoundsException

But on retrying, the same query succeeded. I'm trying to understand the
root cause of the problem and how to resolve it.

Any help is much appreciated.

-- 
*Venkata Subbarayudu Amanchi.*