Hi Chanh,
I think you only need to specify one node:
./sqlline.py zoo1:2182
./sqlline.py zoo1
Best,
--
Juvenn Woo
On Thursday, 16 February 2017 at 12:41 PM, Chanh Le wrote:
Hi everybody,
I'm a newbie who started using Phoenix a few days ago. After doing some
research about configuring the ZooKeeper quorum, I'm still stuck, so I
finally want to ask the community directly.
My current ZK quorum is a little odd: "hbase.zookeeper.quorum" =
"zoo1:2182,zoo1:2183,zoo2:2182"
I edited the
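For reference, the usual convention is to list only hostnames in hbase.zookeeper.quorum and put a single shared client port in hbase.zookeeper.property.clientPort. A sketch of hbase-site.xml, reusing the hostnames and one of the ports from this thread (values are placeholders, adjust to your cluster):

```xml
<!-- hbase-site.xml (sketch): hostnames only in the quorum; -->
<!-- one client port shared by all ZooKeeper servers -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zoo1,zoo2</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2182</value>
</property>
```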
Is it possible to pass the TenantID attribute on the URL when using the
phoenix query server? For example,
/usr/hdp/2.5.0.0-1245/phoenix/bin/sqlline-thin.py
http://pqshost.myhost.com:8765;TenantId=tenant1
This works fine for me when connecting via JDBC. It just didn't seem to work
with the query
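With the thick JDBC driver, TenantId is passed as a semicolon-separated property on the connection URL (or via a Properties object handed to DriverManager.getConnection). A minimal sketch of building such a URL; the class and helper names here are my own, and the host is the one from the question:

```java
public class TenantUrl {
    // Append a TenantId property to a Phoenix JDBC URL.
    // Phoenix JDBC URLs take extra properties as ";key=value" suffixes.
    static String withTenant(String baseUrl, String tenantId) {
        return baseUrl + ";TenantId=" + tenantId;
    }

    public static void main(String[] args) {
        // e.g. for the thick driver; with a Properties object you could
        // instead call props.setProperty("TenantId", "tenant1").
        System.out.println(withTenant("jdbc:phoenix:zoo1:2181", "tenant1"));
        // prints: jdbc:phoenix:zoo1:2181;TenantId=tenant1
    }
}
```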
I can't find the difference between the date/time types; aren't they all
the same? Also, should I parse them as int or string?
TIME Type
TIME
The time data type. The format is yyyy-MM-dd hh:mm:ss, with both the date
and time parts maintained. Mapped to java.sql.Time. The binary
representation is
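Since the value is mapped to java.sql.Time, there is no need to treat it as a raw int or string on the client: the standard java.sql classes parse the "yyyy-MM-dd hh:mm:ss" form directly. A small sketch (class name is my own, the timestamp is arbitrary):

```java
import java.sql.Timestamp;

public class TimeParsing {
    public static void main(String[] args) {
        // Timestamp.valueOf accepts the "yyyy-MM-dd hh:mm:ss" form that
        // Phoenix date/time types render to, keeping date and time parts.
        Timestamp ts = Timestamp.valueOf("2017-02-16 12:41:00");
        System.out.println(ts);  // 2017-02-16 12:41:00.0
    }
}
```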
No, PQS is just a proxy to the Phoenix (thick) JDBC driver.
You are still limited to the capabilities of the Phoenix JDBC driver.
You might be able to do something with a custom UDF, but I'm not sure.
Sudhir Babu Pothineni wrote:
Sorry for not asking the question properly; my understanding
This is a non-issue...
Avatica's use of protobuf is completely shaded (relocated classes). You
can use whatever version of protobuf in your client application you'd like.
Mark Heppner wrote:
If Cheyenne is talking about the query server, I'm not sure where you're
getting that from, Ted. It
Hi,
Spark is unable to load the Phoenix classes it needs. If you're using a
recent version of Phoenix, please ensure the "fat" *client* JAR (or for
older versions of Phoenix, the Phoenix *client*-spark JAR) is on your Spark
driver and executor classpath [1]. The 'phoenix-spark' JAR is
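One common way to put the client JAR on both classpaths is via spark-defaults.conf; the JAR path below is a placeholder, not the actual location in this poster's install:

```
# spark-defaults.conf (sketch; adjust the path to your Phoenix install)
spark.driver.extraClassPath    /opt/phoenix/phoenix-client.jar
spark.executor.extraClassPath  /opt/phoenix/phoenix-client.jar
```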
Hi,
I'm trying to write a simple dataframe to Phoenix:
df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
Map("table" -> "TEST_SAVE", "zkUrl" -> "zk.internal:2181"))
I have the following in my pom.xml:
  groupId: org.apache.phoenix
  artifactId: phoenix-spark