Hi Rafa,
The DFS name service issue for Phoenix got resolved after adding the Hadoop
and HBase configuration directories to the classpath. This can be done by
setting the environment variables HADOOP_HOME and HBASE_HOME on the
respective machines.
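For reference, setting the variables looks roughly like this (a minimal sketch; the install paths below are placeholders, not the actual paths from our setup):

```shell
# Placeholder install locations -- adjust to your own machines.
export HADOOP_HOME=/opt/hadoop
export HBASE_HOME=/opt/hbase
# With these set, the query server can pick up core-site.xml,
# hdfs-site.xml and hbase-site.xml from the conf directories
# under these locations.
echo "HADOOP_HOME=$HADOOP_HOME"
echo "HBASE_HOME=$HBASE_HOME"
```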
Thanks for your support.
Regards,
Mallieswari
On Thu,
You cannot use "hacluster" if that hostname does not resolve to an IP; that
is what I tried to explain in my last mail.
Use the IP of the machine that is running the query server, or its hostname.
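A quick way to check whether a name will resolve before putting it in the URL (a sketch; "hacluster" here stands for whichever host you intend to use):

```shell
# If the name does not resolve, the thin client fails with an
# UnknownHostException -- fall back to the machine's IP in that case.
if getent hosts hacluster > /dev/null; then
  echo "hacluster resolves"
else
  echo "hacluster does not resolve; use the query server machine's IP"
fi
```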
Regards
Rafa
On 12 Oct 2017 at 6:19, "Mallieswari Dineshbabu"
wrote:
Hi Rafa,
I still faced "UnknownHostException: hacluster" when I started the query server
with the cluster name 'hacluster' and tried to connect the Phoenix client as below.
bin>python sqlline-thin.py http://hacluster:8765
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_
Hi Mallieswari,
The hbase.rootdir is a filesystem resource. If you have an HA NameNode and a
configured nameservice, it can point to the active NameNode automatically. As
far as I know, it is not related to HBase Master HA.
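As an illustration (a sketch, not taken from your setup): if hdfs-site.xml defines a nameservice with the id "hacluster", hbase-site.xml can reference that nameservice in hbase.rootdir, and HDFS client failover then locates the active NameNode:

```xml
<!-- hbase-site.xml (sketch; "hacluster" is the assumed nameservice id) -->
<property>
  <name>hbase.rootdir</name>
  <value>hdfs://hacluster/hbase</value>
</property>
```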
The "hacluster" used in this command: python sqlline-thin.py
http://haclu