Hi,
Over on HBase we are working on a direct integration with Spark. That may
(or may not) be a better option when you already have code using the HBase
API directly. Please see https://issues.apache.org/jira/browse/HBASE-13992
On Saturday, July 18, 2015, Josh Mahonin wrote:
Hi,
The phoenix-spark integration is a thin wrapper around the
phoenix-mapreduce integration, which under the hood just uses Phoenix's
'UPSERT' functionality for saving. As far as I know, there are no
provisions for checkAndPut functionality there, so if you require it, I
suggest sticking to the HBase API.
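For readers unfamiliar with checkAndPut: it is an atomic compare-and-set on a single cell, where the write is applied only if the cell's current value matches an expected value. In HBase 1.1 the real call is Table#checkAndPut(row, family, qualifier, expectedValue, put). Its semantics can be sketched in plain Scala, with no cluster required; the CellStore class and its method names below are purely illustrative, not part of any HBase API:

```scala
import java.util.concurrent.ConcurrentHashMap

// Illustrative stand-in for a column of cells, keyed by row. The real
// HBase call operates on (row, family, qualifier); a single key suffices
// to show the compare-and-set behaviour.
class CellStore {
  private val cells = new ConcurrentHashMap[String, String]()

  def put(key: String, value: String): Unit = cells.put(key, value)

  def get(key: String): Option[String] = Option(cells.get(key))

  // Atomic compare-and-set: write newValue only when the stored value
  // equals expected (None means "cell must be absent"). Returns whether
  // the write happened, mirroring checkAndPut's boolean result.
  def checkAndPut(key: String, expected: Option[String], newValue: String): Boolean =
    expected match {
      case None    => cells.putIfAbsent(key, newValue) == null
      case Some(e) => cells.replace(key, e, newValue)
    }
}
```

A plain UPSERT, by contrast, would overwrite the cell unconditionally, which is why the phoenix-spark save path cannot express this check.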
I am using Spark 1.3, HBase 1.1 and Phoenix 4.4. I have this in my code:

    val rdd = processedRdd.map(r => Row.fromSeq(r))
    val dataframe = sqlContext.createDataFrame(rdd, schema)
    dataframe.save("org.apache.phoenix.spark", SaveMode.Overwrite,
      Map("table" -> HTABLE, "zkUrl" -> zkQuorum))

This code w
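For context on what that save call does: per the earlier reply, phoenix-spark writes each DataFrame row through Phoenix's UPSERT statement, roughly equivalent to the following sketch (the table and column names here are hypothetical):

```sql
-- Hypothetical illustration: one UPSERT per DataFrame row, batched by the
-- phoenix-mapreduce output path. UPSERT inserts the row, or overwrites it
-- unconditionally if the key already exists; there is no compare-and-set
-- step, which is why checkAndPut has no equivalent on this path.
UPSERT INTO MY_TABLE (ID, COL1, COL2) VALUES (1, 'foo', 'bar');
```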