Hello friends,

I am very new to Apache Phoenix, and I just ran the sample phoenix-spark
example on Spark 1.6. It was successful, and now I want to run the same
example on Spark 2.0.0. Does Phoenix provide support for Spark 2.0.0?

Previously I used this code:

DataFrame fromPhx = context.read()
        .format("org.apache.phoenix.spark")
        .options(ImmutableMap.of(
                "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                "zkUrl", "jdbc:phoenix:localhost:2181",
                "table", "SAMPLE"))
        .load();


In Spark 2.0.0:


org.apache.spark.sql.Dataset<Row> df = spark.read()
        .format("org.apache.phoenix.spark")
        .options(ImmutableMap.of(
                "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                "zkUrl", "jdbc:phoenix:localhost:2181",
                "table", "SAMPLE"))
        .load();


Is this correct, or do I need to change any code?
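
For completeness, here is a fuller, self-contained sketch of what I am trying
on Spark 2.0.0. My understanding is that SparkSession replaces SQLContext as
the entry point in Spark 2.x. The class name PhoenixSparkRead, the appName,
the local[*] master, and the localhost zkUrl are just placeholders from my
test setup:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import com.google.common.collect.ImmutableMap;

public class PhoenixSparkRead {
    public static void main(String[] args) {
        // SparkSession is the Spark 2.x entry point (replaces SQLContext)
        SparkSession spark = SparkSession.builder()
                .appName("phoenix-spark-read")   // placeholder app name
                .master("local[*]")              // placeholder; adjust for the cluster
                .getOrCreate();

        // Read the Phoenix table "SAMPLE" as a Dataset<Row>;
        // the zkUrl below is a placeholder for the real ZooKeeper quorum
        Dataset<Row> df = spark.read()
                .format("org.apache.phoenix.spark")
                .options(ImmutableMap.of(
                        "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                        "zkUrl", "jdbc:phoenix:localhost:2181",
                        "table", "SAMPLE"))
                .load();

        df.show();

        spark.stop();
    }
}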


Please help me out.
