Hi all,
    I have a problem. I created a table named "tblA" in C* and a
materialized view named "viewA" on tblA. I run a Spark job to process data
from 'viewA'.
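    For reference, the setup looks roughly like the sketch below. The real
schema is more complex; the keyspace "ks", the columns "id"/"val", and the
connection host are placeholders, and the names are lowercased here.

    import com.datastax.spark.connector.cql.CassandraConnector
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.cassandra.connection.host", "127.0.0.1")  // placeholder host

    // Create the base table and the materialized view (simplified schema).
    CassandraConnector(conf).withSessionDo { session =>
      session.execute(
        """CREATE TABLE IF NOT EXISTS ks.tbla (
          |  id int PRIMARY KEY,
          |  val text)""".stripMargin)
      session.execute(
        """CREATE MATERIALIZED VIEW IF NOT EXISTS ks.viewa AS
          |  SELECT id, val FROM ks.tbla
          |  WHERE val IS NOT NULL AND id IS NOT NULL
          |  PRIMARY KEY (val, id)""".stripMargin)
    }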
    In the beginning it worked well, but the next day the Spark job failed,
and when I select data from 'viewA' and 'tblA' using CQL, the following
exceptions are thrown.
    Querying viewA:
         "ServerError: <ErrorMessage code=0000 [Server error] 
message="java.lang.ArrayIndexOutOfBoundsException">"
    and querying tblA:
         "ServerError: <ErrorMessage code=0000 [Server error] 
message="java.io.IOError: java.io.EOFException: EOF after 13889 bytes out of 
460861">"


    My versions are:
        Cassandra 3.7  +  Spark 1.6.2  +  Spark Cassandra Connector 1.6
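    The read in the Spark job is essentially just a scan of the view through
the connector, roughly like this (same placeholder names as in the sketch
above):

    // Submitted with e.g. --conf spark.cassandra.connection.host=<node address>
    import com.datastax.spark.connector._
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("viewa-reader")   // placeholder app name
    val sc = new SparkContext(conf)

    // Scan the materialized view and count the rows; the real job does more
    // processing, but even a plain read like this fails on the second day.
    val rows = sc.cassandraTable("ks", "viewa").select("id", "val")
    println(rows.count())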


Does anyone know about this problem? Looking forward to your reply.


Thanks
