Hi Murugesan,

What preconditions do I need on the server to execute the Python script?
I have Python 2.7.5 installed on the ZooKeeper server. If I just copy the
sqlline script to the /etc/hbase/conf directory and execute it, I get the
import errors below. Note that this time I have the 4.5.2-HBase-1.0 versions
of the Phoenix server and core jars in the HBase/lib directory on the master
and region servers.

Traceback (most recent call last):
  File "./sqlline.py", line 25, in <module>
    import phoenix_utils
ImportError: No module named phoenix_utils
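
My guess is that the problem is the import itself: the traceback shows sqlline.py
doing "import phoenix_utils" at line 25, and phoenix_utils.py (which, as far as I
can tell, sits next to sqlline.py in the bin directory of the Phoenix binary
distribution) wasn't copied along with it, so Python can't find the module. A
minimal check from the Python prompt would be something like the following (the
install path is just an assumption on my part):

    import sys
    # Assumption: phoenix_utils.py ships next to sqlline.py in the Phoenix
    # binary distribution's bin/ directory; adjust to the actual install path.
    sys.path.append('/opt/phoenix-4.5.2-HBase-1.0-bin/bin')
    import phoenix_utils           # should resolve once that directory is on sys.path
    print(phoenix_utils.__file__)  # confirms which copy was picked up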

Pardon my limited knowledge of Python.

Thanks,
Amit

On Fri, Feb 26, 2016 at 11:26 PM, Murugesan, Rani <ranmu...@visa.com> wrote:

> Did you test and confirm your phoenix shell from the zookeeper server?
>
> cd /etc/hbase/conf
>
> > phoenix-sqlline.py <zookeeperserver>:2181
>
>
>
>
>
> *From:* Amit Shah [mailto:amits...@gmail.com]
> *Sent:* Friday, February 26, 2016 4:45 AM
> *To:* user@phoenix.apache.org
> *Subject:* HBase Phoenix Integration
>
>
>
> Hello,
>
>
>
> I have been trying to install Phoenix on my Cloudera HBase cluster. The
> Cloudera version is CDH 5.5.2 and the HBase version is 1.0.
>
>
>
> I copied the server and core jars (version 4.6-HBase-1.0) to the master and
> region servers and restarted the HBase cluster. I copied the corresponding
> client jar to my SQuirreL client, but I get an exception on connect (pasted
> below). The connection URL is "jdbc:phoenix:<zookeeper-server-name>:2181".
>
> I even tried compiling the source with the Cloudera dependencies, as
> suggested in this post
> <http://stackoverflow.com/questions/31849454/using-phoenix-with-cloudera-hbase-installed-from-repo>,
> but didn't succeed.
>
>
>
> Any suggestions to make this work?
>
>
>
> Thanks,
>
> Amit.
>
>
>
> ________________________________________________________________
>
>
>
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:87)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1319)
>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11715)
>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7388)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1776)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1758)
>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:1016)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1092)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1266)
>     ... 10 more
>
>
>
> P.S. The full stack trace is attached to this mail.
>
