Thank you, Josh.

2017-01-12 

lk_phoenix 



From: Josh Mahonin <jmaho...@gmail.com>
Sent: 2017-01-11 22:56
Subject: Re: how to write spark2 dataframe to phoenix?
To: "user" <user@phoenix.apache.org>
Cc:

Hi,


Spark 2.x isn't currently supported in a released Phoenix version, but is 
slated for the upcoming 4.10.0 release.


If you'd like to compile your own version in the meantime, you can find the 
ticket/patch here:
https://issues.apache.org/jira/browse/PHOENIX-3333

or:
https://github.com/apache/phoenix/commit/a0e5efcec5a1a732b2dce9794251242c3d66eea6
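For reference, once a Spark 2.x-compatible build (i.e. one containing the
PHOENIX-3333 patch) is on the classpath, the write from the question below
should go through unchanged. A minimal, self-contained sketch in Scala; the
table name "biz" and the zkUrl are copied from the question, while the
DataFrame columns (ID, NAME) are hypothetical and would need to match an
existing Phoenix table:

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object PhoenixWriteSketch {
      def main(args: Array[String]): Unit = {
        // Assumes the phoenix-client jar from a self-compiled build that
        // includes PHOENIX-3333 is on the Spark 2.x classpath.
        val spark = SparkSession.builder()
          .appName("phoenix-write-sketch")
          .getOrCreate()

        import spark.implicits._

        // Hypothetical data; the target table "biz" must already exist in
        // Phoenix with matching column names.
        val df = Seq((1L, "a"), (2L, "b")).toDF("ID", "NAME")

        // Same call as in the question; the phoenix-spark connector expects
        // SaveMode.Overwrite, which upserts rows into the existing table.
        df.write
          .format("org.apache.phoenix.spark")
          .mode(SaveMode.Overwrite)
          .options(Map("table" -> "biz", "zkUrl" -> "slave1:2181,slave2:2181"))
          .save()

        spark.stop()
      }
    }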

Josh


On Tue, Jan 10, 2017 at 10:27 PM, lk_phoenix <lk_phoe...@163.com> wrote:

Hi all,
        I tried to write a DataFrame to Phoenix with Spark 2.1:

        df.write.format("org.apache.phoenix.spark").mode(SaveMode.Overwrite)
          .options(Map("table" -> "biz", "zkUrl" -> "slave1:2181,slave2:2181")).save()

        but it doesn't work; I got this error:
        java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
          at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
          at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
          ... 60 elided

2017-01-11


lk_phoenix 
