Have you looked at spark-packages.org? There are several different HBase
connectors there; not sure if any meet your need or not.
https://spark-packages.org/?q=hbase
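For instance, the shc (Spark-HBase Connector) package listed there exposes a DataFrame-level API driven by a JSON catalog. A minimal read sketch, assuming the shc-core jar is on the classpath and using an illustrative table ("sales") and column layout:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

val spark = SparkSession.builder().appName("shc-example").getOrCreate()

// Catalog mapping a DataFrame schema onto an HBase table.
// Table name, column family "cf1" and column names are illustrative.
val catalog =
  s"""{
     |  "table": {"namespace": "default", "name": "sales"},
     |  "rowkey": "key",
     |  "columns": {
     |    "key":    {"cf": "rowkey", "col": "key",    "type": "string"},
     |    "amount": {"cf": "cf1",    "col": "amount", "type": "double"}
     |  }
     |}""".stripMargin

// Read the HBase table as a DataFrame through the connector
val df = spark.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

df.show(5)
```

Writes go the same way with `df.write.options(...).format(...).save()` against the same catalog.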
HTH,
-Todd
On Tue, Aug 30, 2016 at 5:23 AM, ayan guha wrote:
You can use the RDD-level "new Hadoop format" API (newAPIHadoopRDD) and pass
in the appropriate HBase input/output format classes.
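A sketch of that approach, assuming the HBase client jars are on the Spark classpath and an HBase table named "sales" exists (the table name is illustrative):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("hbase-rdd-read").getOrCreate()

// HBase configuration: point the input format at the table to scan
val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "sales") // illustrative table name

// newAPIHadoopRDD yields (row key, Result) pairs scanned from the table
val hbaseRDD = spark.sparkContext.newAPIHadoopRDD(
  conf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

// Pull the row keys out as strings
val rowKeys = hbaseRDD.map { case (key, _) => Bytes.toString(key.get()) }
rowKeys.take(10).foreach(println)
```

For writes, the mirror image is `saveAsNewAPIHadoopDataset` with `TableOutputFormat` configured on a Hadoop `Job`.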
On 30 Aug 2016 19:13, "Mich Talebzadeh" wrote:
Hi,
Is there an existing interface to read from and write to an HBase table in
Spark?
Similar to below for Parquet
val s = spark.read.parquet("oraclehadoop.sales2")
s.write.mode("overwrite").parquet("oraclehadoop.sales4")
Or do I need to write to a Hive table which is already defined over HBase?
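If a Hive table has already been defined over HBase with the HBase storage handler, one option is to go through the Hive metastore. A sketch, assuming Hive support is enabled in the Spark session and using an illustrative table name ("sales_hbase"); whether writes work this way depends on the storage-handler support in your Spark and Hive versions:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-over-hbase")
  .enableHiveSupport() // required to resolve tables via the Hive metastore
  .getOrCreate()

// Read: Spark resolves the table through Hive, which delegates to the
// HBase storage handler behind it
val s = spark.sql("SELECT * FROM sales_hbase") // illustrative table name
s.show(5)

// Write: insert into the Hive-over-HBase table
s.write.mode("append").insertInto("sales_hbase")
```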
Thanks