Re: How to config zookeeper quorum in sqlline command?

2017-02-15 Thread Juvenn Woo
Hi Chanh, I think you need only specify one node: ./sqlline.py zoo1:2182 or ./sqlline.py zoo1 Best, -- Juvenn Woo On Thursday, 16 February 2017 at 12:41 PM, Chanh Le wrote: > Hi everybody, > I am a newbie start using phoenix for a
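For context, sqlline.py hands the quorum argument to the thick JDBC driver, whose URLs take the shape jdbc:phoenix:&lt;zk hosts&gt;[:&lt;port&gt;], so any one live ZooKeeper node is enough for discovery. A minimal JDK-only sketch of that URL shape, reusing the host and port from this thread (the helper method is hypothetical, not Phoenix API):

```java
// Hypothetical helper sketching the thick-driver URL a quorum string maps to.
// Host names and port are the ones mentioned in this thread, not a recommendation.
public class PhoenixUrlSketch {
    static String phoenixUrl(String zkQuorum) {
        return "jdbc:phoenix:" + zkQuorum;
    }

    public static void main(String[] args) {
        // ./sqlline.py zoo1:2182  ->  jdbc:phoenix:zoo1:2182
        System.out.println(phoenixUrl("zoo1:2182"));
    }
}
```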

How to config zookeeper quorum in sqlline command?

2017-02-15 Thread Chanh Le
Hi everybody, I am a newbie; I have been using Phoenix for a few days. After doing some research about configuring the zookeeper quorum, I am still stuck, so I finally want to ask the community directly. My current zk quorum is a little odd: "hbase.zookeeper.quorum", "zoo1:2182,zoo1:2183,zoo2:2182" I edited the

Phoenix Query Server tenant_id

2017-02-15 Thread Michael Young
Is it possible to pass the TenantID attribute on the URL when using the Phoenix Query Server? For example: /usr/hdp/2.5.0.0-1245/phoenix/bin/sqlline-thin.py http://pqshost.myhost.com:8765;TenantId=tenant1 This works fine for me when connecting via JDBC; it just didn't seem to work with the query
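For comparison, the route the poster says does work is passing TenantId as a plain JDBC connection property. A minimal sketch, assuming the standard thick- and thin-driver URL shapes and reusing the host and tenant names from the question (the actual connection call is commented out since it needs a live cluster):

```java
import java.util.Properties;

// Sketch: TenantId as a JDBC connection property, versus embedding it in the
// sqlline-thin URL. Host and tenant values are copied from the question above
// and are not real endpoints.
public class TenantIdSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("TenantId", "tenant1");

        String thickUrl = "jdbc:phoenix:zoo1:2181";
        String thinUrl =
            "jdbc:phoenix:thin:url=http://pqshost.myhost.com:8765;serialization=PROTOBUF";

        // Requires a running cluster / query server:
        // Connection conn = DriverManager.getConnection(thickUrl, props);
        System.out.println(props.getProperty("TenantId")); // tenant1
    }
}
```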

Differences between the date/time types

2017-02-15 Thread Cheyenne Forbes
I can't find the difference between the date/time types; aren't all of them the same? Also, should I parse them as int or string? TIME Type TIME The time data type. The format is yyyy-MM-dd hh:mm:ss, with both the date and time parts maintained. Mapped to java.sql.Time. The binary representation is
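The distinction is easier to see from the java.sql classes the Phoenix types map to: all three wrap a full date-plus-time epoch value internally (TIMESTAMP additionally keeps nanoseconds), and they are meant to be read through the typed JDBC getters rather than parsed as int or String. A small JDK-only sketch; the literal values are arbitrary examples:

```java
import java.sql.Date;
import java.sql.Time;
import java.sql.Timestamp;

// The java.sql classes that Phoenix DATE, TIME, and TIMESTAMP map to.
// Each toString() shows which part of the value the class emphasizes,
// even though all of them store an underlying epoch instant.
public class DateTimeSketch {
    public static void main(String[] args) {
        Date d = Date.valueOf("2017-02-15");             // date part in toString()
        Time t = Time.valueOf("12:41:00");               // time part in toString()
        Timestamp ts =
            Timestamp.valueOf("2017-02-15 12:41:00.123456789"); // keeps nanoseconds

        System.out.println(d);   // 2017-02-15
        System.out.println(t);   // 12:41:00
        System.out.println(ts);  // 2017-02-15 12:41:00.123456789
    }
}
```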

Re: Protobuf serialized column

2017-02-15 Thread Josh Elser
No, PQS is just a proxy to the Phoenix (thick) JDBC driver. You are still limited to the capabilities of the Phoenix JDBC driver. You might be able to do something with a custom UDF, but I'm not sure. Sudhir Babu Pothineni wrote: Sorry for not asking the question properly, my understanding

Re: Can I use protobuf2 with Phoenix instead of protobuf3?

2017-02-15 Thread Josh Elser
This is a non-issue... Avatica's use of protobuf is completely shaded (relocated classes). You can use whatever version of protobuf in your client application you'd like. Mark Heppner wrote: If Cheyenne is talking about the query server, I'm not sure where you're getting that from, Ted. It

Re: FW: Failing on writing Dataframe to Phoenix

2017-02-15 Thread Josh Mahonin
Hi, Spark is unable to load the Phoenix classes it needs. If you're using a recent version of Phoenix, please ensure the "fat" *client* JAR (or for older versions of Phoenix, the Phoenix *client*-spark JAR) is on your Spark driver and executor classpath [1]. The 'phoenix-spark' JAR is
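A sketch of what that classpath fix can look like on a spark-submit invocation. The JAR path and application class below are placeholders, not taken from this thread; substitute the fat phoenix-client JAR (or the phoenix-client-spark JAR on older releases) from your installation:

```shell
# Put the Phoenix fat client JAR on both the driver and executor classpath.
# Paths and class name are assumed examples only.
spark-submit \
  --conf spark.driver.extraClassPath=/opt/phoenix/phoenix-client.jar \
  --conf spark.executor.extraClassPath=/opt/phoenix/phoenix-client.jar \
  --class com.example.WriteToPhoenix \
  my-app.jar
```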

FW: Failing on writing Dataframe to Phoenix

2017-02-15 Thread Nimrod Oren
Hi, I'm trying to write a simple dataframe to Phoenix: df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "TEST_SAVE", "zkUrl" -> "zk.internal:2181")) I have the following in my pom.xml: org.apache.phoenix phoenix-spark