You might consider providing more information instead of sending new messages without any additional context. What you have provided is not sufficient for anyone but a wizard to debug :)

Increase the log4j level. Post your HBase logs somewhere. Post the Phoenix client logs somewhere. Please make a larger effort instead of just spamming the list.
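For HBase, raising the log level means editing conf/log4j.properties on the HBase side (and the client's log4j.properties for sqlline) and restarting. A minimal sketch of the lines to add; the exact logger granularity is a judgment call, but these package-level loggers are standard log4j 1.x configuration:

```properties
# In HBase's conf/log4j.properties: DEBUG output from HBase itself
log4j.logger.org.apache.hadoop.hbase=DEBUG

# In the Phoenix client's log4j.properties: DEBUG from the Phoenix code paths
log4j.logger.org.apache.phoenix=DEBUG
```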

Chetan Khatri wrote:
Hello Guys,

I am still stuck starting Phoenix on a pseudo-distributed Hadoop + Hive + Spark
cluster; can anybody help me, please?

Thanks.

On Tue, Jan 17, 2017 at 2:03 AM, Chetan Khatri <[email protected]> wrote:

    Hello Josh,

    Thank you for the reply. As you suggested:

    1) phoenix-4.8.2-HBase-1.2-server.jar is in HBase/lib.
    *Checked by:* bin/hbase classpath | grep 'phoenix'


    *2) Errors*
    [inline screenshot of the errors not reproduced in the plain-text archive]

    *3) Nothing can be found in the HBase RegionServer logs*



    How can the above problem be resolved?

    Thanks.


    On Mon, Jan 16, 2017 at 10:22 PM, Josh Elser <[email protected]> wrote:

        Did you check the RegionServer logs, as I asked in my last message?

        Chetan Khatri wrote:

            Any updates on the above error, guys?


            On Fri, Jan 13, 2017 at 9:35 PM, Josh Elser <[email protected]> wrote:

                 (-cc dev@phoenix)

                 phoenix-4.8.2-HBase-1.2-server.jar, in the top-level binary
                 tarball of Apache Phoenix 4.8.2, is the jar which is meant to
                 be deployed to all HBase nodes' classpath.

                 I would check the RegionServer logs -- I'm guessing that it
                 never started correctly or failed. The error message is saying
                 that certain regions in the system were never assigned to a
                 RegionServer, which only happens in exceptional cases.

                 Chetan Khatri wrote:

                     Hello Community,

                     I have installed and configured Apache Phoenix on a
                     single-node Ubuntu 16.04 machine:
                     - Hadoop 2.7
                     - HBase 1.2.4
                     - Phoenix 4.8.2-HBase-1.2

                     Copied phoenix-core-4.8.2-HBase-1.2.jar to hbase/lib and
                     confirmed with bin/hbase classpath | grep 'phoenix'. I am
                     using the embedded ZooKeeper, so my hbase-site.xml looks
                     like below:

            <configuration>
            <property>
            <name>hbase.rootdir</name>
            <value>file:///home/hduser/hbase</value>
            </property>
            </configuration>
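                     For comparison, a pseudo-distributed setup would normally
                     point hbase.rootdir at HDFS rather than the local
                     filesystem, and name the ZooKeeper quorum explicitly. A
                     minimal sketch, assuming a NameNode at
                     hdfs://localhost:9000 (that host:port is an assumption;
                     adjust to the local install):

```xml
<configuration>
  <!-- Store HBase data in HDFS instead of the local filesystem.
       hdfs://localhost:9000 is an assumed NameNode address. -->
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <!-- Embedded ZooKeeper running on the same host -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
</configuration>
```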

                     I am able to read / write to HBase from the shell and
                     from Apache Spark.

                     *Errors while accessing with sqlline:*

                     1) bin/sqlline.py localhost:2181

                     Error:

                     1. The command made the process hang.
                     2. Error: ERROR 1102 (XCL02): Cannot get all table
                        regions. (state=XCL02,code=1102)

                     java.sql.SQLException: ERROR 1102 (XCL02): Cannot get all table regions.
                     at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
                     at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:546)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1162)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1068)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1388)
                     at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2298)
                     at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:940)
                     at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:193)
                     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:344)
                     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:332)
                     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
                     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:331)
                     at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1423)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2352)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2291)
                     at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2291)
                     at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
                     at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
                     at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
                     at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)
                     at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)
                     at sqlline.Commands.connect(Commands.java:1064)
                     at sqlline.Commands.connect(Commands.java:996)
                     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                     at java.lang.reflect.Method.invoke(Method.java:498)
                     at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
                     at sqlline.SqlLine.dispatch(SqlLine.java:803)
                     at sqlline.SqlLine.initArgs(SqlLine.java:588)
                     at sqlline.SqlLine.begin(SqlLine.java:656)
                     at sqlline.SqlLine.start(SqlLine.java:398)
                     at sqlline.SqlLine.main(SqlLine.java:292)
                     Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: No server address listed in hbase:meta for region SYSTEM.CATALOG,,1484293041241.0b74311f417f83abe84ae1be4e823de8. containing row
                     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1318)
                     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1181)
                     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1152)
                     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1136)
                     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:957)
                     at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:531)
                     ... 32 more
                     sqlline version 1.1.9

                     Kindly let me know how to fix this error.

                     Thanks,



