Hi there
AFAIK, Spark can smoothly read Hive data through HiveContext, and the DataFrame / 
Data Source API can load any external data source into a DataFrame as well. So I 
would recommend using the phoenix-spark module to achieve this goal: load both 
the Hive tables and the Phoenix tables as DataFrames, join them in Spark, and 
then simply choose what kind of storage to write the resulting DataFrame to. 
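A minimal sketch of the approach above, assuming Spark 1.x with the phoenix-spark module on the classpath and a running HBase/Phoenix cluster; the table names, join columns, ZooKeeper URL, and output path are hypothetical placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HivePhoenixJoin {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HivePhoenixJoin"))
    val sqlContext = new HiveContext(sc)

    // Hive managed table, read through HiveContext
    // ("hive_db.events" is a placeholder table name)
    val hiveDf = sqlContext.table("hive_db.events")

    // Phoenix table, read through the phoenix-spark Data Source
    // ("WEB_STAT" and the zkUrl are placeholders)
    val phoenixDf = sqlContext.read
      .format("org.apache.phoenix.spark")
      .option("table", "WEB_STAT")
      .option("zkUrl", "zk-host:2181")
      .load()

    // Join the two DataFrames on a common key, then write the result
    // to whichever storage you prefer (Parquet here, as an example)
    val joined = hiveDf.join(phoenixDf, hiveDf("host") === phoenixDf("HOST"))
    joined.write.parquet("/tmp/joined_output")

    sc.stop()
  }
}
```

The same pattern works from PySpark, since phoenix-spark is exposed through the generic Data Source API.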

Best,
Sun.

CertusNet 

From: Buntu Dev
Date: 2015-06-17 14:56
To: user
Subject: Phoenix and Hive
I've got quite a bit of data in Hive managed tables and I'm looking for ways to 
join those tables with the ones I create in Phoenix. I'm aware of the HBase and 
Hive integration, but not sure if there is any current support for Phoenix and 
Hive integration. Please let me know.

Thanks!
