Not sure there is enough information here to help. One thing of note: post 
Phoenix 4.14, it is now required to use the phoenix-connectors project, as 
many connectors, including the Spark one, were moved out of core: 
https://github.com/apache/phoenix-connectors. Also consider using a more 
recently released Phoenix version from the 5.x release line. 
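For reference, reading a Phoenix table into a Dataset<Row> with the connector 
from phoenix-connectors looks roughly like the sketch below. This is not a 
tested example: the table name and ZooKeeper quorum are placeholders, and the 
option keys ("table", "zkUrl") may vary between connector versions, so check 
the phoenix5-spark docs for your exact release.

```java
// Sketch of reading a Phoenix 5.x table via the phoenix5-spark connector
// (from the phoenix-connectors project). "MY_TABLE" and the ZooKeeper
// quorum below are placeholders; adjust them for your cluster.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class PhoenixReadExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("phoenix-read")
                .getOrCreate();

        // The connector registers the short data source name "phoenix".
        Dataset<Row> df = spark.read()
                .format("phoenix")
                .option("table", "MY_TABLE")              // placeholder table
                .option("zkUrl", "zkhost1,zkhost2:2181")  // placeholder quorum
                .load();

        df.show();
        spark.stop();
    }
}
```

The key point is to put the phoenix5-spark connector jar from 
phoenix-connectors on the Spark classpath instead of the old phoenix-spark 
module that shipped inside Phoenix 4.x. 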

On 2021/08/23 14:27:56, Ankit Joshi <ankit.joshi00...@gmail.com> wrote: 
> Hello,
> 
> I have upgraded Phoenix 4.14.0 to 5.0.0 with HBase 2.0 and also upgraded
> Spark 1.6 to 2.4.
> Now I am trying to load data from Phoenix into Hadoop using Dataset<Row> and
> am getting the below exception in the cluster.
> 
> at org.apache.phoenix.util.ServerUtil.parseServerException
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable
> at org.apache.phoenix.jdbc.PhoenixStatement (PhoenixStatement.java)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation
> at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate
> at org.apache.phoenix.jdbc.PhoenixDriver.connect
> at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect
> at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint
> at org.apache.spark.rdd.RDD.iterator
> at org.apache.spark.rdd.MapPartitionsRDD.compute
> 
> P.S. - I am able to open the Phoenix shell and upsert records.
> 
> 
> Thanks & Regards,
> Ankit Joshi
> 
