As a general statement, the protobuf serialization is much better maintained and comes with a degree of backwards compatibility (whereas the JSON serialization guarantees none).
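
For anyone who wants to switch, a rough sketch of the moving parts (the property name and URL option below are to the best of my recollection, so treat them as assumptions and double-check the docs for your version): the server side is governed by phoenix.queryserver.serialization in hbase-site.xml, and the thin client has to request the same format in its JDBC URL, e.g.:

import java.sql.Connection;
import java.sql.DriverManager;

public class ThinClientProtobufCheck {
    public static void main(String[] args) throws Exception {
        // The client declares its wire format in the thin JDBC URL; it must match
        // what PQS is configured to serve (phoenix.queryserver.serialization).
        // Host and port below are placeholders for your PQS instance.
        String url = "jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected with PROTOBUF serialization: " + !conn.isClosed());
        }
    }
}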

Thanks for sharing the solution.

On 4/20/18 9:53 AM, Lu Wei wrote:
I did some digging, and the reason is that I started PQS using JSON serialization rather than PROTOBUF.

When I switched to PROTOBUF serialization, the 'select * from testarray' query works fine.


JSON has no distinct numeric types, so a JSON array like [100] is parsed into an array containing an Integer value. When items are fetched from the SQL result set, 100 (an Integer) has to be converted to Long (the type defined in the table), and that conversion throws the exception.
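
Just to illustrate that boxing detail in plain Java (a minimal sketch of the failure mode, not the actual PQS/Avatica code):

public class IntegerToLongCastDemo {
    public static void main(String[] args) {
        // A small JSON number like 100 typically comes back boxed as an Integer.
        Object fromJson = Integer.valueOf(100);

        try {
            // A direct cast to Long fails, which is essentially the
            // ClassCastException reported below in the thread.
            Long bad = (Long) fromJson;
            System.out.println(bad);
        } catch (ClassCastException e) {
            System.out.println("direct cast fails: " + e.getMessage());
        }

        // A widening conversion through Number works fine.
        long ok = ((Number) fromJson).longValue();
        System.out.println("via Number.longValue(): " + ok);
    }
}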


I guess we had better use PROTOBUF rather than JSON as the serialization for PQS.

------------------------------------------------------------------------
*From:* sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey Soldatov <sergeysolda...@gmail.com>
*Sent:* Friday, April 20, 2018 5:22:47 AM
*To:* user@phoenix.apache.org
*Subject:* Re: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

Definitely, someone who is maintaining the CDH branch should take a look. I don't observe that behavior on the master branch:

0: jdbc:phoenix:thin:url=http://localhost:876> create table if not exists testarray(id bigint not null, events bigint array constraint pk primary key (id));
No rows affected (2.4 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876> upsert into testarray values (1, array[1,2]);
1 row affected (0.056 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876> select * from testarray;
+-----+---------+
| ID  | EVENTS  |
+-----+---------+
| 1   | [1, 2]  |
+-----+---------+
1 row selected (0.068 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876>


Thanks,
Sergey

On Thu, Apr 19, 2018 at 12:57 PM, Lu Wei <wey...@outlook.com> wrote:

    By the way, all the queries are run in sqlline-thin.py.



    ------------------------------------------------------------------------
    *From:* Lu Wei
    *Sent:* April 19, 2018 6:51:15
    *To:* user@phoenix.apache.org
    *Subject:* Re: phoenix query server java.lang.ClassCastException for
    BIGINT ARRAY column

    ## Version:
    phoenix: 4.13.2-cdh5.11.2
    hive: 1.1.0-cdh5.11.2

    To reproduce:

    -- create table

    create table if not exists testarray(id bigint not null, events
    bigint array constraint pk primary key (id))


    -- upsert data:

    upsert into testarray values (1, array[1,2]);


    -- query:

    select id from testarray;   -- fine

    select * from testarray;    -- error
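
    The same repro can also be driven programmatically through the thin driver; a rough sketch (URL, port and the serialization switch are assumptions, adjust for your setup):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class TestArrayRepro {
        public static void main(String[] args) throws Exception {
            // With serialization=JSON on the client (and a JSON-configured PQS) the
            // final select reproduced the ClassCastException; with PROTOBUF it worked.
            String url = "jdbc:phoenix:thin:url=http://localhost:8765;serialization=JSON";
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement()) {
                conn.setAutoCommit(true);
                stmt.execute("create table if not exists testarray(id bigint not null, "
                        + "events bigint array constraint pk primary key (id))");
                stmt.executeUpdate("upsert into testarray values (1, array[1,2])");
                try (ResultSet rs = stmt.executeQuery("select * from testarray")) {
                    while (rs.next()) {
                        // Reading the array column is where the error surfaced.
                        System.out.println(rs.getLong("ID") + " -> " + rs.getArray("EVENTS"));
                    }
                }
            }
        }
    }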

    ------------------------------------------------------------------------
    *From:* sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey Soldatov <sergeysolda...@gmail.com>
    *Sent:* April 19, 2018 6:37:06
    *To:* user@phoenix.apache.org
    *Subject:* Re: phoenix query server java.lang.ClassCastException for
    BIGINT ARRAY column
    Could you please be more specific? Which version of phoenix are you
    using? Do you have a small script to reproduce? At first glance it
    looks like a PQS bug.

    Thanks,
    Sergey

    On Thu, Apr 19, 2018 at 8:17 AM, Lu Wei <wey...@outlook.com> wrote:

        Hi there,

        I have a Phoenix table containing a BIGINT ARRAY column. But
        when querying through the query server (via sqlline-thin.py),
        there is an exception:

        java.lang.ClassCastException: java.lang.Integer cannot be cast
        to java.lang.Long

        BTW, when querying through sqlline.py, everything works fine. And
        the data in the HBase table is of Long type, so why does the
        Integer to Long cast happen?
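
        For context, here is a minimal sketch of the client-side read path
        through plain JDBC (the thin-client URL is an assumption; the table is
        the one defined below). The driver converts each array element to the
        declared SQL type, BIGINT -> java.lang.Long, and that is the conversion
        blowing up in the stack trace further down:

        import java.sql.Array;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class ReadEventsArray {
            public static void main(String[] args) throws Exception {
                String url = "jdbc:phoenix:thin:url=http://localhost:8765";  // placeholder PQS address
                try (Connection conn = DriverManager.getConnection(url);
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery("select events from gis_tracking3")) {
                    while (rs.next()) {
                        Array events = rs.getArray("EVENTS");
                        if (events == null) {
                            System.out.println("null");
                            continue;
                        }
                        // Elements should come back as java.lang.Long for a BIGINT ARRAY.
                        for (Object v : (Object[]) events.getArray()) {
                            System.out.println(v);
                        }
                    }
                }
            }
        }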


        ## Table schema:

        create table if not exists gis_tracking3(tracking_object_id
        bigint not null, lat double, lon double, speed double, bearing
        double, time timestamp not null, events bigint array constraint
        pk primary key (tracking_object_id, time))


        ## when querying events[1], it works fine:

        0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select  events[1]+1 from gis_tracking3;
        +------------------------------+
        | (ARRAY_ELEM(EVENTS, 1) + 1)  |
        +------------------------------+
        | 11                           |
        | 2223                         |
        | null                         |
        | null                         |
        | 10001                        |
        +------------------------------+


        ## when querying events, it throws an exception:

        0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select  events from gis_tracking3;
        java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$LongAccessor.getLong(AbstractCursor.java:550)
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.convertValue(AbstractCursor.java:1310)
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getObject(AbstractCursor.java:1289)
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getArray(AbstractCursor.java:1342)
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getString(AbstractCursor.java:1354)
           at org.apache.phoenix.shaded.org.apache.calcite.avatica.AvaticaResultSet.getString(AvaticaResultSet.java:257)
           at sqlline.Rows$Row.<init>(Rows.java:183)
           at sqlline.BufferedRows.<init>(BufferedRows.java:38)
           at sqlline.SqlLine.print(SqlLine.java:1660)
           at sqlline.Commands.execute(Commands.java:833)
           at sqlline.Commands.sql(Commands.java:732)
           at sqlline.SqlLine.dispatch(SqlLine.java:813)
           at sqlline.SqlLine.begin(SqlLine.java:686)
           at sqlline.SqlLine.start(SqlLine.java:398)
           at sqlline.SqlLine.main(SqlLine.java:291)
           at org.apache.phoenix.queryserver.client.SqllineWrapper.main(SqllineWrapper.java:93)


        I guess there is some issue in the query server, but I can't
        figure out why.

        Any suggestions?



        Thanks,

        Wei


