Phoenix and Tableau

2016-01-28 Thread Riesland, Zack
Hey folks,

Everything I've read online about connecting Phoenix and Tableau is at least a 
year old.

Has there been any progress on an ODBC driver?

Any simple hacks to accomplish this?

Thanks!



Re: Announcing phoenix-for-cloudera 4.6.0

2016-01-28 Thread Andrew Purtell
Looking today


On Tue, Jan 26, 2016 at 11:00 PM, Kumar Palaniappan <
kpalaniap...@marinsoftware.com> wrote:

> Andrew, any updates? It seems HBASE-11544 impacted Phoenix, and CDH 5.5.1
> isn't working.
>
> On Sun, Jan 17, 2016 at 11:25 AM, Andrew Purtell  > wrote:
>
>> This looks like something easy to fix up. Maybe I can get to it next week.
>>
>> > On Jan 15, 2016, at 9:07 PM, Krishna  wrote:
>> >
>> > On the branch:  4.5-HBase-1.0-cdh5, I set cdh version to 5.5.1 in pom
>> and
>> > building the package produces following errors.
>> > Repo: https://github.com/chiastic-security/phoenix-for-cloudera
>> >
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java:[176,82]
>> > cannot find symbol
>> > [ERROR] symbol:   method getParentId()
>> > [ERROR] location: variable span of type org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[129,31]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[159,38]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[162,31]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[337,38]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[339,42]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[359,58]
>> > cannot find symbol
>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>> > [ERROR] location: interface org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[99,74]
>> > cannot find symbol
>> > [ERROR] symbol:   method getParentId()
>> > [ERROR] location: variable span of type org.apache.htrace.Span
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[110,60]
>> > incompatible types
>> > [ERROR] required: java.util.Map
>> > [ERROR] found:java.util.Map
>> > [ERROR]
>> >
>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java:[550,57]
>> > > > org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver$1> is
>> not
>> > abstract and does not override abstract method
>> >
>> nextRaw(java.util.List,org.apache.hadoop.hbase.regionserver.ScannerContext)
>> > in org.apache.hadoop.hbase.regionserver.RegionScanner
>> >
>> >
>> >> On Fri, Jan 15, 2016 at 6:20 PM, Krishna 
>> wrote:
>> >>
>> >> Thanks Andrew. Are binaries available for CDH5.5.x?
>> >>
>> >> On Tue, Nov 3, 2015 at 9:10 AM, Andrew Purtell 
>> >> wrote:
>> >>
>> >>> Today I pushed a new branch '4.6-HBase-1.0-cdh5' and the tag
>> >>> 'v4.6.0-cdh5.4.5' (58fcfa6) to
>> >>> https://github.com/chiastic-security/phoenix-for-cloudera. This is
>> the
>> >>> Phoenix 4.6.0 release, modified to build against CDH 5.4.5 and
>> possibly
>> >>> (but not tested) subsequent CDH releases.
>> >>>
>> >>> If you want release tarballs I built from this, get them here:
>> >>>
>> >>> Binaries
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz.asc
>> >>> (signature)
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz.md5
>> >>> (MD5 sum)
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz.sha
>> >>> (SHA-1 sum)
>> >>>
>> >>>
>> >>> Source
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-src.tar.gz
>> >>>
>> >>>
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-src.tar.gz.asc
>> >>> (signature)
>> >>>
>> >>>
>> >>>
>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-src.tar.gz.md5
>> >>> 

Phoenix Query exception on few tables

2016-01-28 Thread kannan.ramanathan
We have started seeing a PhoenixIOException from a select query on a few of
our tables. The same query worked before.

The query is a simple select, and the exception is below:

org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: The system cannot find the 
path specified

If I run the scan command from the HBase shell on the same table, it works fine.

It appears that Phoenix metadata for these tables might have been corrupted. 
What could be the problem and how do we fix it?

Thanks for your help.

Regards
Kannan.



Re: Phoenix and Tableau

2016-01-28 Thread Thomas Decaux
Yeah, me too :/ I tried Spark; it works fine with Tableau on Mac.

You should give it a try!
On 28 Jan 2016 8:30 PM, "Aaron Bossert"  wrote:

> Nice!  It's a start...unfortunately, I use the OS X version.
>
> --
> Aaron
>
> On Jan 28, 2016, at 2:26 PM, Thomas Decaux  wrote:
>
> They said only for Windows OS.
> On 28 Jan 2016 6:36 PM, "Aaron Bossert"  wrote:
>
>> Sorry for butting in, but do you mean that tableau supports JDBC
>> drivers?  I have wanted to connect Phoenix to tableau for some time now as
>> well, but have not seen any documentation from tableau to suggest that they
>> now support JDBC drivers.  Just references to using a JDBC-ODBC bridge
>> driver, which all discussions I have seen related to that have had very
>> negative outcomes.
>>
>> --
>> Aaron
>>
>> On Jan 28, 2016, at 12:14 PM, Thomas Decaux  wrote:
>>
>> You can use the JDBC driver already; alternatively, you could use Spark as
>> a proxy in between.
>> On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:
>>
>>> Hey folks,
>>>
>>>
>>>
>>> Everything I’ve read online about connecting Phoenix and Tableau is at
>>> least a year old.
>>>
>>>
>>>
>>> Has there been any progress on an ODBC driver?
>>>
>>>
>>>
>>> Any simple hacks to accomplish this?
>>>
>>>
>>>
>>> Thanks!
>>>
>>>
>>>
>>


Re: Phoenix and Tableau

2016-01-28 Thread Thomas Decaux
They said only for Windows OS.
On 28 Jan 2016 6:36 PM, "Aaron Bossert"  wrote:

> Sorry for butting in, but do you mean that tableau supports JDBC drivers?
> I have wanted to connect Phoenix to tableau for some time now as well, but
> have not seen any documentation from tableau to suggest that they now
> support JDBC drivers.  Just references to using a JDBC-ODBC bridge driver,
> which all discussions I have seen related to that have had very negative
> outcomes.
>
> --
> Aaron
>
> On Jan 28, 2016, at 12:14 PM, Thomas Decaux  wrote:
>
> You can use the JDBC driver already; alternatively, you could use Spark as a
> proxy in between.
> On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:
>
>> Hey folks,
>>
>>
>>
>> Everything I’ve read online about connecting Phoenix and Tableau is at
>> least a year old.
>>
>>
>>
>> Has there been any progress on an ODBC driver?
>>
>>
>>
>> Any simple hacks to accomplish this?
>>
>>
>>
>> Thanks!
>>
>>
>>
>


Re: Phoenix and Tableau

2016-01-28 Thread Josh Mahonin
Hey Thomas,

That's pretty neat if I read that right. You're able to use Tableau with
Phoenix using the Phoenix-Spark integration?

Thanks!

Josh

On Thu, Jan 28, 2016 at 2:31 PM, Thomas Decaux  wrote:

> Yeah, me too :/ I tried Spark; it works fine with Tableau on Mac.
>
> You should give it a try!
> On 28 Jan 2016 8:30 PM, "Aaron Bossert"  wrote:
>
>> Nice!  It's a start...unfortunately, I use the OS X version.
>>
>> --
>> Aaron
>>
>> On Jan 28, 2016, at 2:26 PM, Thomas Decaux  wrote:
>>
>> They said only for Windows OS.
>> On 28 Jan 2016 6:36 PM, "Aaron Bossert"  wrote:
>>
>>> Sorry for butting in, but do you mean that tableau supports JDBC
>>> drivers?  I have wanted to connect Phoenix to tableau for some time now as
>>> well, but have not seen any documentation from tableau to suggest that they
>>> now support JDBC drivers.  Just references to using a JDBC-ODBC bridge
>>> driver, which all discussions I have seen related to that have had very
>>> negative outcomes.
>>>
>>> --
>>> Aaron
>>>
>>> On Jan 28, 2016, at 12:14 PM, Thomas Decaux  wrote:
>>>
>>> You can use the JDBC driver already; alternatively, you could use Spark as
>>> a proxy in between.
>>> On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:
>>>
 Hey folks,



 Everything I’ve read online about connecting Phoenix and Tableau is at
 least a year old.



 Has there been any progress on an ODBC driver?



 Any simple hacks to accomplish this?



 Thanks!



>>>


Re: Telco HBase POC

2016-01-28 Thread Vijay Vangapandu
Hi Guys,

There is some confusion in the email below.
There is no deserialization issue in the Apache Phoenix layer. The response-time
breakdown in the email below was measured using Phoenix, and it's pretty good.

What I am talking about is a Java client library we created to deserialize the
data from the Phoenix result set into model objects.
We implemented a generic ORM-style library on top of Phoenix for Java object
mapping and built support for DSL-style queries, and this is where the extra
overhead is.

As I said, I don't see any issue with Phoenix.
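The reflection-based row-to-object mapping described above is not shown in the thread; a minimal Python sketch of the general pattern (the real library is Java, and the model and field names here are invented for illustration):

```python
from dataclasses import dataclass, fields

@dataclass
class UserRecord:
    # hypothetical model class -- the thread does not show the real schema
    user_id: str
    device_id: str
    usage_bytes: int

def map_row(cls, row: dict):
    # Reflect over the model's fields and pull the matching columns from the
    # row; this per-row, per-field reflection is the kind of work a generic
    # ORM layer adds on top of the raw result set.
    return cls(**{f.name: row[f.name] for f in fields(cls)})

rec = map_row(UserRecord, {"user_id": "u1", "device_id": "d1", "usage_bytes": 1024})
```

Caching the field list per model class is the usual way to cut this overhead down.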

On Jan 20, 2016, at 9:00 AM, Vijay Vangapandu 
> wrote:

Hi guys,
We recently migrated one of our user-facing use cases to HBase, and we are
using Phoenix as the query layer.
We managed to get a single record in 30ms.

Here is the response-time breakdown:
75th - 29ms
95th - 43ms
99th - 76ms

We have about 6 billion records in the store, and each row contains around 30
columns.

We are using the Hortonworks configuration with a few config tweaks.

We enabled the block cache.

Our use case is to get all records associated with a user to render in a
list/card view. Each user has on average 5K records.

Our biggest bottleneck is serialization.
The response times above are for single-record reads from HBase, but with
serialization and processing cost it averages 80ms.


Sent from my iPhone

On Jan 20, 2016, at 6:27 AM, Riesland, Zack 
> wrote:

I have a similar data pattern and 100ms response time is fairly consistent.

I’ve been trying hard to find the right set of configs to get closer to 10-20ms 
with no luck, but I’m finding that 100ms average is pretty reasonable.

From: Willem Conradie [mailto:willem.conra...@pbtgroup.co.za]
Sent: Wednesday, January 20, 2016 8:31 AM
To: jamestay...@apache.org
Cc: user@phoenix.apache.org
Subject: RE: Telco HBase POC

Hi James,

Thanks for being willing to assist.

This is what the input data record will look like (test data):

UserID:     12345678901
DateTime:   20151006124945
TXNID:      992194978
DeviceID:   123456789012345
IPAddress:  111.111.111.111
UsageArray: 10-4000:26272:1019324|0-4000:0:0|10-4000:25780:498309|420-4000:152:152|500-500:1258:2098|9001-9001:120:0|0-4000:0:0|502-4000:154:0|10-4000:73750:448374|420-4000:608:608|1-4000:364:550|358-4000:40:52
URIArray:   www.facebook.com|www.whatsapp.com|www.google.co.nz|ssl.gstatic.com|www.google.com


Unique key on record is “UserID,DateTime,TXNID”.

The read access pattern is as follows:
The user queries by UserID and a DateTime range to get usage stats (derived
from ‘UsageArray’) for websites (derived from ‘URIArray’) visited over their
selected time range.
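Given that unique key and access pattern, a composite key with UserID leading and DateTime second turns each per-user, time-range query into one contiguous scan. A sketch of the idea (the fixed-width encoding below is only illustrative, not Phoenix's actual byte format; in Phoenix this would be a composite PRIMARY KEY):

```python
def row_key(user_id: str, dt: str, txn_id: str) -> bytes:
    # Fixed-width, zero-padded segments keep byte order aligned with logical
    # (UserID, DateTime, TXNID) order; widths are taken from the sample row.
    return f"{user_id:0>11}{dt:0>14}{txn_id:0>10}".encode("ascii")

k_early = row_key("12345678901", "20151006124945", "992194978")
k_late  = row_key("12345678901", "20151007000000", "000000001")
k_other = row_key("12345678902", "20140101000000", "000000001")

# one user's rows sort by time and stay contiguous before the next user,
# so a (UserID, DateTime-range) query is a single narrow scan
assert k_early < k_late < k_other
```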

Just to recap the data volumes:
Expected data volume: 60,000 files per day
                      1 to 10 MB per file
                      500 million records per day
                      500 GB total volume per day

I need to be flexible in the amount of data stored. Initially it will be 5
days, but it can increase to 30 days and possibly 90 days.
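A quick sanity check on those volumes (decimal units assumed; "raw" means before HDFS replication and compression):

```python
files_per_day   = 60_000
records_per_day = 500_000_000
bytes_per_day   = 500 * 10**9        # 500 GB/day, decimal units assumed

avg_record_bytes = bytes_per_day / records_per_day  # average record size
avg_file_bytes   = bytes_per_day / files_per_day    # average file size
raw_90d_tb       = bytes_per_day * 90 / 10**12      # 90-day raw retention

assert avg_record_bytes == 1000.0      # ~1 KB per record
assert 1e6 <= avg_file_bytes <= 10e6   # ~8.3 MB, consistent with 1-10 MB/file
assert raw_90d_tb == 45.0              # 45 TB raw at 90-day retention
```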

One concern I have (not founded on anything concrete) about the Phoenix client
is whether it will be able to serve the above queries within the 100ms range.

Regards,
Willem

From: James Taylor [mailto:jamestay...@apache.org]
Sent: 19 January 2016 10:07 PM
To: user >
Subject: Re: Telco HBase POC

Hi Willem,
Let us know how we can help as you start getting into this, in particular with 
your schema design based on your query requirements.
Thanks,
James

On Mon, Jan 18, 2016 at 8:50 AM, Pariksheet Barapatre 
> wrote:

Hi Willem,

Use Phoenix bulk load. I guess your source is CSV, so the Phoenix CSV bulk
loader can be used.

How frequently do you want to load these files? If you can wait for a certain
interval to merge the files, a MapReduce job can bulk load them into the
Phoenix table.

Cheers
Pari

On 18-Jan-2016 4:17 pm, "Willem Conradie" 
> wrote:
Hi Pari,

My comments in blue.

Few notes from my experience :
1. Use bulk load rather than psql.py. Load larger (merged) files instead of
small files.
Are you referring to the native HBase bulk load or the Phoenix MapReduce bulk
load? Unfortunately we can’t change how the files are received from the source.
Must we pre-process to merge the files before running the bulk load utility?

2. Increase HBase block cache
3. Turn off HBase auto compaction
4. Select primary key correctly
5. Don't use salting. As the table will be huge, your Phoenix query will fork
many scanners. Try something like a hash on the userid.
6. Define TTL to purge data periodically
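Item 5, a hash prefix on the userid in place of Phoenix salting, can be sketched as follows (the bucket count and hash choice here are illustrative, not from the thread):

```python
import hashlib

BUCKETS = 16  # arbitrary choice for illustration; tune to the cluster

def bucket(user_id: str) -> int:
    # Stable hash bucket derived from the user id. Unlike Phoenix salting,
    # where one query fans out across every salt bucket, all rows for a
    # given user share a single bucket, so a per-user scan stays narrow.
    return hashlib.md5(user_id.encode("ascii")).digest()[0] % BUCKETS

def row_key(user_id: str, dt: str) -> bytes:
    # The bucket prefix spreads users across regions while keeping each
    # user's rows together.
    return f"{bucket(user_id):02d}{user_id}{dt}".encode("ascii")

b = bucket("12345678901")
assert b == bucket("12345678901")  # deterministic: same user, same bucket
assert 0 <= b < BUCKETS
```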


Regards,

Re: Phoenix and Tableau

2016-01-28 Thread Aaron Bossert
Sorry for butting in, but do you mean that Tableau supports JDBC drivers?  I
have wanted to connect Phoenix to Tableau for some time now as well, but have
not seen any documentation from Tableau to suggest that they now support JDBC
drivers, just references to using a JDBC-ODBC bridge driver, and all the
discussions I have seen about that have had very negative outcomes.

--
Aaron

> On Jan 28, 2016, at 12:14 PM, Thomas Decaux  wrote:
> 
> You can use the JDBC driver already; alternatively, you could use Spark as a
> proxy in between.
> 
> On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:
>> Hey folks,
>> 
>>  
>> 
>> Everything I’ve read online about connecting Phoenix and Tableau is at least 
>> a year old.
>> 
>>  
>> 
>> Has there been any progress on an ODBC driver?
>> 
>>  
>> 
>> Any simple hacks to accomplish this?
>> 
>>  
>> 
>> Thanks!


Re: Find and kill long-running queries

2016-01-28 Thread James Taylor
This sounds like a good idea. Please file a JIRA and we'll get this on the
roadmap. What tooling are you using, and would support for
Statement.cancel() do the trick?
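The client-side pattern behind Statement.cancel() is a watchdog that cancels the statement from another thread. A sketch of that pattern in Python, using sqlite3's interrupt() as a stand-in for a Phoenix JDBC statement, since it needs no running cluster:

```python
import sqlite3
import threading

conn = sqlite3.connect(":memory:")
# Watchdog thread: after 200 ms, interrupt whatever is running on `conn`
# (the analogue of calling Statement.cancel() from a monitoring thread).
watchdog = threading.Timer(0.2, conn.interrupt)
watchdog.start()

cancelled = False
try:
    # Unbounded recursive CTE: stands in for a runaway "select * from bigtable".
    conn.execute(
        "WITH RECURSIVE c(x) AS (SELECT 1 UNION ALL SELECT x + 1 FROM c) "
        "SELECT count(*) FROM c"
    ).fetchone()
except sqlite3.OperationalError:
    cancelled = True  # the query was interrupted mid-flight
finally:
    watchdog.cancel()

assert cancelled
```

Finding queries to kill still needs server-side tracking, which is what the proposed JIRA would cover.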

On Wed, Jan 27, 2016 at 7:27 PM, Ken Hampson  wrote:

> I would be interested in this as well, knowing how often we have had the
> need in the past with similar queries on Postgres, and suspecting a similar
> need as we ramp up Phoenix usage.
>
> - Ken
>
>
> On Wed, Jan 27, 2016, 20:38 hongbin ma  wrote:
>
>> i'm also interested in this, allowing query killing would be nice
>>
>> On Wed, Jan 27, 2016 at 10:40 PM, jwilkinson 
>> wrote:
>>
>>> We are interested in being able to find and kill queries that have gone
>>> on
>>> for a long time.
>>>
>>> So far, it looks like there is no way other than killing the client, and
>>> no
>>> way to find running queries other than to have our client track them. Is
>>> that right, or have I missed something?
>>>
>>> In particular, I've looked at hbase tasks and found that phoenix queries
>>> like "select * from bigtable" appear as many independent rpcs.
>>>
>>> Any insights appreciated!
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-phoenix-user-list.1124778.n5.nabble.com/Find-and-kill-long-running-queries-tp966.html
>>> Sent from the Apache Phoenix User List mailing list archive at
>>> Nabble.com.
>>>
>>
>>
>>
>> --
>> Regards,
>>
>> *Bin Mahone | 马洪宾*
>> Apache Kylin: http://kylin.io
>> Github: https://github.com/binmahone
>>
>


Re: Phoenix and Tableau

2016-01-28 Thread Thomas Decaux
They (Tableau) said yes; I didn't try it yet.
On 28 Jan 2016 9:08 PM, "James Taylor"  wrote:

> @Thomas - so Tableau has support for JDBC on Windows OS?
>
> On Thu, Jan 28, 2016 at 11:48 AM, Alex Kamil  wrote:
>
>> Zack, you can also use the Simba ODBC SDK. I did a proof-of-concept a while
>> back connecting Phoenix to Tableau and PowerBI, and it worked like a charm:
>> http://www.simba.com/drivers/simba-engine-sdk/
>> The license costs $$ though.
>>
>> Cheers
>> Alex
>>
>> On Thu, Jan 28, 2016 at 2:37 PM, Josh Mahonin  wrote:
>>
>>> Hey Thomas,
>>>
>>> That's pretty neat if I read that right. You're able to use Tableau with
>>> Phoenix using the Phoenix-Spark integration?
>>>
>>> Thanks!
>>>
>>> Josh
>>>
>>> On Thu, Jan 28, 2016 at 2:31 PM, Thomas Decaux 
>>> wrote:
>>>
 Yeah, me too :/ I tried Spark; it works fine with Tableau on Mac.

 You should give it a try!
 On 28 Jan 2016 8:30 PM, "Aaron Bossert"  wrote:

> Nice!  It's a start...unfortunately, I use the OS X version.
>
> --
> Aaron
>
> On Jan 28, 2016, at 2:26 PM, Thomas Decaux  wrote:
>
> They said only for Windows OS.
> On 28 Jan 2016 6:36 PM, "Aaron Bossert"  wrote:
>
>> Sorry for butting in, but do you mean that tableau supports JDBC
>> drivers?  I have wanted to connect Phoenix to tableau for some time now 
>> as
>> well, but have not seen any documentation from tableau to suggest that 
>> they
>> now support JDBC drivers.  Just references to using a JDBC-ODBC bridge
>> driver, which all discussions I have seen related to that have had very
>> negative outcomes.
>>
>> --
>> Aaron
>>
>> On Jan 28, 2016, at 12:14 PM, Thomas Decaux 
>> wrote:
>>
>> You can use the JDBC driver already; alternatively, you could use Spark as
>> a proxy in between.
>> On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:
>>
>>> Hey folks,
>>>
>>>
>>>
>>> Everything I’ve read online about connecting Phoenix and Tableau is
>>> at least a year old.
>>>
>>>
>>>
>>> Has there been any progress on an ODBC driver?
>>>
>>>
>>>
>>> Any simple hacks to accomplish this?
>>>
>>>
>>>
>>> Thanks!
>>>
>>>
>>>
>>
>>>
>>
>


Re: Using Sqoop to load HBase tables , Data not visible via Phoenix

2016-01-28 Thread rafa
Hi Manya,

see this thread:

http://mail-archives.apache.org/mod_mbox/incubator-phoenix-user/201512.mbox/%3CCAOnY4Jd6u9T8-Ce2Lp54CbH_a8zj41FVc=iXT=z8hp8-mxv...@mail.gmail.com%3E

http://phoenix.apache.org/faq.html#How_I_map_Phoenix_table_to_an_existing_HBase_table

regards,
rafa


On Thu, Jan 28, 2016 at 1:26 PM, manya cancerian 
wrote:

>
>
>
> Looking for some help in the following scenario -
>
> - I have created a Phoenix table, which created the underlying HBase table.
>
> - Then I used a Sqoop command to move data from a relational database table
> (Teradata) into the underlying HBase table successfully.
>
> - I can view the data through HBase, but it is not visible in the Phoenix
> table.
>
> What am I missing here?
>
> Regards
> Manya
>
>
>
>
>


Re: Using Sqoop to load HBase tables , Data not visible via Phoenix

2016-01-28 Thread Ravi Kiran
Hi Manya,
  We are working with the Sqoop team on our patch [1], which enables data
imports directly into Phoenix tables. In the meantime, you can apply the
patch to the Sqoop 1.4.6 source and give it a try.

Please do let us know how it goes.

[1] https://issues.apache.org/jira/browse/SQOOP-2649

Regards
Ravi

On Thu, Jan 28, 2016 at 4:42 AM, rafa  wrote:

> Hi Manya,
>
> see this thread:
>
>
> http://mail-archives.apache.org/mod_mbox/incubator-phoenix-user/201512.mbox/%3CCAOnY4Jd6u9T8-Ce2Lp54CbH_a8zj41FVc=iXT=z8hp8-mxv...@mail.gmail.com%3E
>
>
> http://phoenix.apache.org/faq.html#How_I_map_Phoenix_table_to_an_existing_HBase_table
>
> regards,
> rafa
>
>
> On Thu, Jan 28, 2016 at 1:26 PM, manya cancerian  > wrote:
>
>>
>>
>>
>> Looking for some help in the following scenario -
>>
>> - I have created a Phoenix table, which created the underlying HBase table.
>>
>> - Then I used a Sqoop command to move data from a relational database table
>> (Teradata) into the underlying HBase table successfully.
>>
>> - I can view the data through HBase, but it is not visible in the Phoenix
>> table.
>>
>> What am I missing here?
>>
>> Regards
>> Manya
>>
>>
>>
>>
>>
>


Re: to_date not working as expected

2016-01-28 Thread James Taylor
Hi Binu,
Phoenix has never supported HBase 0.96, so I'm not sure where you got the
release from.

I recommend upgrading to a later, supported version of HBase and a later
version of Phoenix. Give the 4.7.0 RC a try.

One other tip, in particular for views you create over existing HBase
tables: use the UNSIGNED types documented here [1], as these use the same
serialization as the Bytes methods provided by HBase. If you tell Phoenix
the wrong type, it won't know, and it will produce erroneous data and query
results.

Thanks,
James

[1] https://phoenix.apache.org/language/datatypes.html
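For reference, HBase's Bytes.toBytes(int) writes a big-endian 4-byte value, and Phoenix's UNSIGNED_INT uses that same layout, which is why it is the right choice for views over pre-existing HBase data. A quick illustration with Python's struct (not Phoenix code):

```python
import struct

def hbase_bytes_int(v: int) -> bytes:
    # Big-endian 4-byte encoding, as produced by HBase's Bytes.toBytes(int);
    # Phoenix's UNSIGNED_INT reads/writes exactly this layout.
    return struct.pack(">i", v)

assert hbase_bytes_int(1) == b"\x00\x00\x00\x01"
# Byte order matches numeric order only for non-negative values -- one
# reason the matching Phoenix types are the *UNSIGNED* variants.
assert hbase_bytes_int(1) < hbase_bytes_int(2) < hbase_bytes_int(300)
```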

On Thu, Jan 28, 2016 at 5:57 PM, Binu Mathew  wrote:

> Phoenix version 4.4.0
>
> Issues with Phoenix when used with HBase 0.96.0.2.0
>
> 2 Issues:
>
> *ISSUE:* to_date Function is not converting string data types in valid
> date formats to a DATE data type when used in the WHERE clause for date
> comparison.
>
> Below is a query I ran against a Phoenix view in which I use the ‘to_date’
> function to convert 2 VARCHAR columns to date.
> 1. column ‘created_at_ts’ stored as VARCHAR in format such as 2009-05-05
> 15:40:10.000
> 2. column ‘created_at_date’ stored as VARCHAR in format such as 2009-05-05
>
> Observe that the ‘to_date’ function converts the 2 VARCHAR columns to dates:
>
>
> select to_date("created_at_ts"), to_date("created_at_date") from
> "gp_subscriptions" limit 5;
>
>
> +-------------------------------------------------+---------------------------------------------------+
> | TO_DATE(subscriber."created_at_ts", null, null) | TO_DATE(subscriber."created_at_date", null, null) |
> +-------------------------------------------------+---------------------------------------------------+
> | 2009-05-05 15:40:10.000                         | 2009-05-05 00:00:00.000                           |
> | 2012-11-22 07:37:34.000                         | 2012-11-22 00:00:00.000                           |
> | 2010-07-24 14:12:33.000                         | 2010-07-24 00:00:00.000                           |
> | 2012-11-22 07:38:04.000                         | 2012-11-22 00:00:00.000                           |
> | 2012-11-22 07:38:10.000                         | 2012-11-22 00:00:00.000                           |
> +-------------------------------------------------+---------------------------------------------------+
>
>
> Here is another query in which I’m using the ‘to_date’ function on string
> literals in the WHERE clause for date comparison.
>
> Observe that the ‘to_date’ function converts the string literals to dates
> and the date comparison correctly evaluates:
>
> select '1' from "gp_subscriptions" where to_date('2009-05-05
> 15:40:10.000') = to_date('2009-05-05 15:40:10.000') limit 2;
> 2 rows selected (0.035 seconds)
>
>
> Now when I try the date comparison using the columns from my view, it
> fails:
>
> select '1' from "gp_subscriptions" where to_date("created_at_ts") =
> to_date('2009-05-05 15:40:10.000') limit 2;
>
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException:
> BooleanExpressionFilter failed during reading: Could not initialize class
> org.apache.phoenix.util.DateUtil$ISODateFormatParser
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.phoenix.util.DateUtil$ISODateFormatParser
>
> Also fails with same error when I try: select '1' from "gp_subscriptions"
> where to_date("created_at_ts") = to_date('2009-05-05') limit 2;
>
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException:
> BooleanExpressionFilter failed during reading: Could not initialize class
> org.apache.phoenix.util.DateUtil$ISODateFormatParser
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.phoenix.util.DateUtil$ISODateFormatParser
>
>
> *ISSUE:* Date comparisons on string literals are not evaluating correctly,
> such that dates in the future get interpreted as being less than dates in
> the past.
>
> Test case 1:
> 2009-05-05 15:40:10.000 is greater than (in the future) 2005-05-05
> 15:40:10.000
>
> The following query should return 2 rows, however, it does not return any
> rows:
>
> select '1' from "gp_subscriptions" where to_date('2009-05-05
> 15:40:10.000') > to_date('2005-05-05 15:40:10.000') limit 2;
> No rows selected (0.024 seconds)
>
>
> The following query should return no rows, however, it returns 2 rows:
>
> select '1' from "gp_subscriptions" where to_date('2009-05-05
> 15:40:10.000') < to_date('2005-05-05 15:40:10.000') limit 2;
> 2 rows selected (0.033 seconds)
>
> Test case 2:
> 2009-05-05 is greater than (in the future) 1970-05-05
>
> The following query should return 2 rows, however, it does not return any
> rows:
>
> select '1' from "gp_subscriptions" where to_date('2009-05-05') >
> to_date('1970-05-05') limit 2;
> No rows selected (0.024 seconds)
>
>
> The following query should return no rows, however, it returns 2 rows:
>
> select '1' from "gp_subscriptions" where to_date('2009-05-05') <
> 
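The expected comparison semantics in these test cases are easy to confirm outside Phoenix; a quick check of test case 1's timestamps with standard Python's datetime (not Phoenix code):

```python
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
later   = datetime.strptime("2009-05-05 15:40:10.000", fmt)
earlier = datetime.strptime("2005-05-05 15:40:10.000", fmt)

# The 2009 timestamp is in the future relative to 2005, so '>' should hold
# and '<' should not -- the opposite of what the Phoenix 4.4 / HBase 0.96
# queries in this report returned.
assert later > earlier
assert not (later < earlier)
```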

Re: Announcing phoenix-for-cloudera 4.6.0

2016-01-28 Thread Andrew Purtell
I pushed a new branch for CDH 5.5 (5.5.1) as
https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
 and renamed the branch for CDH 5.4 to
https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.4

The changes in 4.6-HBase-1.0-cdh5.5 pass unit and integration tests for me
(except a silly date test that hardcodes the expected year to 2015).


On Thu, Jan 28, 2016 at 11:23 AM, Andrew Purtell 
wrote:

> Looking today
>
>
> On Tue, Jan 26, 2016 at 11:00 PM, Kumar Palaniappan <
> kpalaniap...@marinsoftware.com> wrote:
>
>> Andrew, any updates? It seems HBASE-11544 impacted Phoenix, and CDH 5.5.1
>> isn't working.
>>
>> On Sun, Jan 17, 2016 at 11:25 AM, Andrew Purtell <
>> andrew.purt...@gmail.com> wrote:
>>
>>> This looks like something easy to fix up. Maybe I can get to it next
>>> week.
>>>
>>> > On Jan 15, 2016, at 9:07 PM, Krishna  wrote:
>>> >
>>> > On the branch:  4.5-HBase-1.0-cdh5, I set cdh version to 5.5.1 in pom
>>> and
>>> > building the package produces following errors.
>>> > Repo: https://github.com/chiastic-security/phoenix-for-cloudera
>>> >
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java:[176,82]
>>> > cannot find symbol
>>> > [ERROR] symbol:   method getParentId()
>>> > [ERROR] location: variable span of type org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[129,31]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[159,38]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[162,31]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[337,38]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[339,42]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[359,58]
>>> > cannot find symbol
>>> > [ERROR] symbol:   variable ROOT_SPAN_ID
>>> > [ERROR] location: interface org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[99,74]
>>> > cannot find symbol
>>> > [ERROR] symbol:   method getParentId()
>>> > [ERROR] location: variable span of type org.apache.htrace.Span
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[110,60]
>>> > incompatible types
>>> > [ERROR] required: java.util.Map
>>> > [ERROR] found:java.util.Map
>>> > [ERROR]
>>> >
>>> ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java:[550,57]
>>> > >> > org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver$1> is
>>> not
>>> > abstract and does not override abstract method
>>> >
>>> nextRaw(java.util.List,org.apache.hadoop.hbase.regionserver.ScannerContext)
>>> > in org.apache.hadoop.hbase.regionserver.RegionScanner
>>> >
>>> >
>>> >> On Fri, Jan 15, 2016 at 6:20 PM, Krishna 
>>> wrote:
>>> >>
>>> >> Thanks Andrew. Are binaries available for CDH5.5.x?
>>> >>
>>> >> On Tue, Nov 3, 2015 at 9:10 AM, Andrew Purtell 
>>> >> wrote:
>>> >>
>>> >>> Today I pushed a new branch '4.6-HBase-1.0-cdh5' and the tag
>>> >>> 'v4.6.0-cdh5.4.5' (58fcfa6) to
>>> >>> https://github.com/chiastic-security/phoenix-for-cloudera. This is
>>> the
>>> >>> Phoenix 4.6.0 release, modified to build against CDH 5.4.5 and
>>> possibly
>>> >>> (but not tested) subsequent CDH releases.
>>> >>>
>>> >>> If you want release tarballs I built from this, get them here:
>>> >>>
>>> >>> Binaries
>>> >>>
>>> >>>
>>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz
>>> >>>
>>> >>>
>>> http://apurtell.s3.amazonaws.com/phoenix/phoenix-4.6.0-cdh5.4.5-bin.tar.gz.asc
>>> >>> 

Re: Phoenix and Tableau

2016-01-28 Thread Thomas Decaux
You can use the JDBC driver already; alternatively, you could use Spark as a
proxy in between.
On 28 Jan 2016 5:47 PM, "Riesland, Zack"  wrote:

> Hey folks,
>
>
>
> Everything I’ve read online about connecting Phoenix and Tableau is at
> least a year old.
>
>
>
> Has there been any progress on an ODBC driver?
>
>
>
> Any simple hacks to accomplish this?
>
>
>
> Thanks!
>
>
>


Re: Announcing phoenix-for-cloudera 4.6.0

2016-01-28 Thread Kumar Palaniappan
Andrew, is it HBase 1.1?


https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5

On Thu, Jan 28, 2016 at 6:51 PM, Andrew Purtell  wrote:

> I pushed a new branch for CDH 5.5 (5.5.1) as
> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
>  and renamed the branch for CDH 5.4 to
> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.4
>
> The changes in 4.6-HBase-1.0-cdh5.5 pass unit and integration tests for me
> (except a silly date test that hardcodes the expected year to 2015).
>
>
> On Thu, Jan 28, 2016 at 11:23 AM, Andrew Purtell 
> wrote:
>
>> Looking today
>>
>>
>> On Tue, Jan 26, 2016 at 11:00 PM, Kumar Palaniappan <
>> kpalaniap...@marinsoftware.com> wrote:
>>
>>> Andrew, any updates? It seems HBASE-11544 impacted Phoenix, and CDH 5.5.1
>>> isn't working.
>>>
>>> On Sun, Jan 17, 2016 at 11:25 AM, Andrew Purtell <
>>> andrew.purt...@gmail.com> wrote:
>>>
 This looks like something easy to fix up. Maybe I can get to it next
 week.

 > On Jan 15, 2016, at 9:07 PM, Krishna  wrote:
 >
 > On the branch:  4.5-HBase-1.0-cdh5, I set cdh version to 5.5.1 in pom
 and
 > building the package produces following errors.
 > Repo: https://github.com/chiastic-security/phoenix-for-cloudera
 >
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java:[176,82]
 > cannot find symbol
 > [ERROR] symbol:   method getParentId()
 > [ERROR] location: variable span of type org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[129,31]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[159,38]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[162,31]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[337,38]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[339,42]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[359,58]
 > cannot find symbol
 > [ERROR] symbol:   variable ROOT_SPAN_ID
 > [ERROR] location: interface org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[99,74]
 > cannot find symbol
 > [ERROR] symbol:   method getParentId()
 > [ERROR] location: variable span of type org.apache.htrace.Span
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[110,60]
 > incompatible types
 > [ERROR] required: java.util.Map
 > [ERROR] found:java.util.Map
 > [ERROR]
 >
 ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java:[550,57]
 > <anonymous org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver$1> is
 not
 > abstract and does not override abstract method
 >
 nextRaw(java.util.List,org.apache.hadoop.hbase.regionserver.ScannerContext)
 > in org.apache.hadoop.hbase.regionserver.RegionScanner
 >
 >
 >> On Fri, Jan 15, 2016 at 6:20 PM, Krishna 
 wrote:
 >>
 >> Thanks Andrew. Are binaries available for CDH5.5.x?
 >>
 >> On Tue, Nov 3, 2015 at 9:10 AM, Andrew Purtell 
 >> wrote:
 >>
 >>> Today I pushed a new branch '4.6-HBase-1.0-cdh5' and the tag
 >>> 'v4.6.0-cdh5.4.5' (58fcfa6) to
 >>> https://github.com/chiastic-security/phoenix-for-cloudera. This is
 the
 >>> Phoenix 4.6.0 release, modified to build against CDH 5.4.5 and
 possibly
 >>> (but not tested) subsequent CDH releases.
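The build procedure discussed in this thread can be sketched as follows. The
repository and branch names come from Andrew's messages above; the `mvn`
invocation is an assumption (a standard Maven package build, skipping tests),
not quoted from the thread. The commands are echoed rather than executed here,
since cloning and building require network access and a Java toolchain:

```shell
# Repo and branch names from the thread; verify against GitHub before use.
REPO="https://github.com/chiastic-security/phoenix-for-cloudera"
BRANCH="4.6-HBase-1.0-cdh5.5"         # use 4.6-HBase-1.0-cdh5.4 for CDH 5.4
echo "git clone $REPO"
echo "cd phoenix-for-cloudera"
echo "git checkout $BRANCH"
echo "mvn clean package -DskipTests"  # assumed build command; skips tests
```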

Re: Announcing phoenix-for-cloudera 4.6.0

2016-01-28 Thread Andrew Purtell
The HBase version in the 5.5 POM is 1.0.0-cdh-5.5.1. It appears to be HBase 
1.0.x plus some patches backported from trunk. 


> On Jan 28, 2016, at 9:07 PM, Kumar Palaniappan 
>  wrote:
> 
> Andrew, is it HBase 1.1?
> 
>  
> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
> 
>> On Thu, Jan 28, 2016 at 6:51 PM, Andrew Purtell  wrote:
>> I pushed a new branch for CDH 5.5 (5.5.1) as 
>> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
>>   and renamed the branch for CDH 5.4 to 
>> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.4
>> 
>> The changes in 4.6-HBase-1.0-cdh5.5 pass unit and integration tests for me 
>> (except a silly date test that hardcodes the expected year to 2015). 
>> 
>> 
>>> On Thu, Jan 28, 2016 at 11:23 AM, Andrew Purtell  
>>> wrote:
>>> Looking today
>>> 
>>> 
 On Tue, Jan 26, 2016 at 11:00 PM, Kumar Palaniappan 
  wrote:
 Andrew, any updates? It seems HBASE-11544 impacted Phoenix, and CDH 5.5.1 
 isn't working. 
 
> On Sun, Jan 17, 2016 at 11:25 AM, Andrew Purtell 
>  wrote:
> This looks like something easy to fix up. Maybe I can get to it next week.
> 
> > On Jan 15, 2016, at 9:07 PM, Krishna  wrote:
> >
> > On the branch:  4.5-HBase-1.0-cdh5, I set cdh version to 5.5.1 in pom 
> > and
> > building the package produces following errors.
> > Repo: https://github.com/chiastic-security/phoenix-for-cloudera
> >
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java:[176,82]
> > cannot find symbol
> > [ERROR] symbol:   method getParentId()
> > [ERROR] location: variable span of type org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[129,31]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[159,38]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[162,31]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[337,38]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[339,42]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceReader.java:[359,58]
> > cannot find symbol
> > [ERROR] symbol:   variable ROOT_SPAN_ID
> > [ERROR] location: interface org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[99,74]
> > cannot find symbol
> > [ERROR] symbol:   method getParentId()
> > [ERROR] location: variable span of type org.apache.htrace.Span
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/trace/TraceMetricSource.java:[110,60]
> > incompatible types
> > [ERROR] required: java.util.Map
> > [ERROR] found:java.util.Map
> > [ERROR]
> > ~/phoenix_related/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java:[550,57]
> > <anonymous org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver$1> is 
> > not
> > abstract and does not override abstract method
> > nextRaw(java.util.List,org.apache.hadoop.hbase.regionserver.ScannerContext)
> > in org.apache.hadoop.hbase.regionserver.RegionScanner
> >
> >
> >> On Fri, Jan 15, 2016 at 6:20 PM, Krishna  wrote:
> >>
> >> Thanks Andrew. Are binaries available for CDH5.5.x?
> >>
> >> On Tue, Nov 3, 2015 at 9:10 AM, Andrew Purtell 
> >> wrote:
> >>
> >>> Today I pushed a new branch