Issue while joining data using phoenix

2015-08-11 Thread Nipur Patodi
Hi All, I am trying to join data in HBase Phoenix tables. However, I am getting this exception. *Error: Encountered exception in sub plan [0] execution. (state=,code=0)* *java.sql.SQLException: Encountered exception in sub plan [0] execution.* * at org.apache.phoenix.execute.HashJoinPlan.iterato

passing hbase scan start row in spark_phoenix

2015-08-11 Thread Hafiz Mujadid
Hi all! Can we use spark_phoenix the way we use the normal Java API, where we can pass a start row to filter data as follows? *// creating a scan object with start and stop row keys* *Scan scan = new Scan(Bytes.toBytes("a.b.x|1"), Bytes.toBytes("a.b.x|2"));*
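
[Editorial note: in plain Phoenix SQL, the same start/stop-row range is usually expressed as a predicate on the leading primary key column, which Phoenix compiles into an HBase scan bounded by those keys. A minimal sketch, assuming a hypothetical table whose leading PK column holds the "a.b.x|..." values:

    -- Hypothetical table and column names; Phoenix turns this range predicate
    -- into a scan with start row 'a.b.x|1' and stop row 'a.b.x|2'.
    SELECT *
    FROM my_table
    WHERE row_key >= 'a.b.x|1' AND row_key < 'a.b.x|2';
]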

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Yuhao Bi
Hi, Here is some official documentation which may help. 1. http://phoenix.apache.org/skip_scan.html 2. Want to do a pagination-like scan? Please refer to http://phoenix.apache.org/paged.html Thanks. 2015-08-11 20:47 GMT+08:00 Hafiz Mujadid : > Hi all! > > > Can we use spark_phoenix in
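
[Editorial note: the skip-scan page describes primary-key filters that Phoenix can serve by jumping between key ranges instead of scanning the whole table. A minimal sketch in that style, with illustrative table and column names:

    -- A range on the leading PK column combined with an IN list on the next
    -- PK column lets Phoenix use a skip scan rather than a full table scan.
    SELECT *
    FROM metrics
    WHERE (host >= 'a.b.x|1' AND host < 'a.b.x|2')
      AND metric_id IN (1, 2);
]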

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Hafiz Mujadid
Yes, I want to do pagination, but I am confused about how to achieve it. On Tue, Aug 11, 2015 at 7:08 PM, Yuhao Bi wrote: > Hi, > > Here is some official document which may help. > > 1.http://phoenix.apache.org/skip_scan.html > 2.Wanna do some Pagination-like scan? > Please refer to

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Yuhao Bi
Hi, 1. Imagine we create the test table with the following SQL: CREATE TABLE library ( title varchar not null, author varchar not null, isbn varchar not null, published_date integer, description varchar, CONSTRAINT pk PRIMARY KEY(title, author, isbn) ) 2. After inserting some test data, our tabl
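
[Editorial note: following the pattern described at http://phoenix.apache.org/paged.html, pagination over this table would typically use a row value constructor on the full primary key, where the bind values are the key of the last row from the previous page. A sketch of fetching one page:

    -- Row value constructor paging: the bind values (?) are the
    -- (title, author, isbn) of the last row returned on the previous page.
    SELECT title, author, isbn
    FROM library
    WHERE (title, author, isbn) > (?, ?, ?)
    ORDER BY title, author, isbn
    LIMIT 20;
]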

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Hafiz Mujadid
Hi Yuhao! Why should record number 6 be omitted? On Tue, Aug 11, 2015 at 9:08 PM, Yuhao Bi wrote: > Hi, > > 1.Image we create the test table with following sql: > CREATE TABLE library ( > title varchar not null, > author varchar not null, > isbn varchar not null, > published_date intege

Re: Issue while joining data using phoenix

2015-08-11 Thread Thomas D'Silva
Nipur, Are you sure the config change is getting picked up? The exception says the maximum allowed size is 104857664 bytes (~0.1 GB), not 1 GB. Thanks, Thomas On Tue, Aug 11, 2015 at 12:43 AM, Nipur Patodi wrote: > Hi All, > > I am trying to join data in hbase phoenix tables. How ever I am gettin

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Yuhao Bi
Hi Hafiz, Sorry about the mistake. In 2), the returned records should be: 6 ccc2 bbb ccc 2012 NULL 8 ccc4 bbb ccc 2014 NULL I hope I did not confuse you. Thanks. 2015-08-12 3:16 GMT+08:00 Hafiz Mujadid : > Hi Yuhao! > > Why reco

Re: Mismatched output with 2 UDFs in a query

2015-08-11 Thread Anchal Agrawal
Hi all, To reiterate, I've been getting mismatched output when I use two different UDFs on the pk column in the same query statement. If two UDFs are in the SELECT clause, only the first one is picked up. If there's a UDF in the WHERE clause, it's picked up instead of the UDF in the SELECT claus
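
[Editorial note: for illustration only, the reported pattern would look something like the following, where UDF_A and UDF_B are hypothetical user-defined functions on the primary key column (names are not from the original report):

    -- Reported symptom: only the first UDF's result is picked up for both columns.
    SELECT UDF_A(pk), UDF_B(pk) FROM some_table;

    -- Reported symptom: the WHERE-clause UDF is picked up in place of the SELECT-clause UDF.
    SELECT UDF_A(pk) FROM some_table WHERE UDF_B(pk) = 1;
]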

Re: passing hbase scan start row in spark_phoenix

2015-08-11 Thread Hafiz Mujadid
Thanks :) On Wed, Aug 12, 2015 at 5:58 AM, Yuhao Bi wrote: > Hi Hafiz, > > Sorry about the mistake, In 2), the returned records should be > 6ccc2bbb ccc 2012NULL > 8ccc4bbb ccc 2014NULL > > I hope I did not confuse you. > >

RE: Issue while joining data using phoenix

2015-08-11 Thread Nipur Patodi
Hey Thomas, I made those changes in hbase-site.xml on each region server. I have cross-checked, and it looks like this file is on the classpath of sqlline.py. But it still seems the updated config is not being picked up. Is there any way to apply these configs (via the command line, if possible) in Phoenix sqlline? Thank