Hello, since Spark 2.x cannot load data through the Phoenix Spark Interpreter, I want
to use JDBC instead. However, when I try to open a *thin connection* I get the
error below, while a *direct connection works fine*. I ran this in spark-shell
with Scala 2.11.8. Can anyone suggest a solution?

   Phoenix : 4.8.1-HBase-1.2

scala> val jdbcDf = spark.read
         .format("jdbc")
         .option("driver", "org.apache.phoenix.queryserver.client.Driver")
         .option("url", "jdbc:phoenix:thin:url=http://192.168.6.131:8765;serialization=PROTOBUF")
         .option("dbtable", "imos")
         .load()

java.sql.SQLException: While closing connection
  at org.apache.calcite.avatica.Helper.createException(Helper.java:39)
  at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:156)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:167)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:117)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:53)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:345)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
  ... 53 elided
Caused by: java.lang.RuntimeException: response code 500
  at org.apache.calcite.avatica.remote.RemoteService.apply(RemoteService.java:45)
  at org.apache.calcite.avatica.remote.JsonService.apply(JsonService.java:227)
  at org.apache.calcite.avatica.remote.RemoteMeta.closeConnection(RemoteMeta.java:78)
  at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:153)
  ... 59 more
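
For reference, here is a rough sketch of the kind of direct (thick-driver) connection
that works; it assumes the standard org.apache.phoenix.jdbc.PhoenixDriver class and a
ZooKeeper quorum at 192.168.6.131 on the default port 2181, so the URL is only an
example and may need adjusting to your cluster:

scala> // Direct connection sketch (assumed quorum address/port, adjust as needed)
       val directDf = spark.read
         .format("jdbc")
         .option("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
         .option("url", "jdbc:phoenix:192.168.6.131:2181")
         .option("dbtable", "imos")
         .load()

       // Loads fine and prints the schema of table "imos"
       directDf.printSchema()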
