Hi,

You can create a Hive external table on top of an existing HBase table using
the property

STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

Example

hive> show create table hbase_table;
OK
CREATE TABLE `hbase_table`(
  `key` int COMMENT '',
  `value1` string COMMENT '',
  `value2` int COMMENT '',
  `value3` int COMMENT '')
ROW FORMAT SERDE
  'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY
  'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  'hbase.columns.mapping'=':key,a:b,a:c,d:e',
  'serialization.format'='1')
TBLPROPERTIES (
  'transient_lastDdlTime'='1472370939')
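
For reference, the DDL to create such a table directly would look roughly like the sketch below (column and table names taken from the example above; the 'hbase.table.name' property is an assumption about what the underlying HBase table is called):

```sql
-- Minimal sketch: Hive external table backed by an existing HBase table.
-- ':key' maps the HBase row key; 'a:b', 'a:c', 'd:e' are
-- column-family:qualifier pairs in the HBase table.
CREATE EXTERNAL TABLE hbase_table (
  key int,
  value1 string,
  value2 int,
  value3 int)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,a:b,a:c,d:e')
TBLPROPERTIES ('hbase.table.name' = 'hbase_table');
```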

Then try to access this Hive table from Spark, which is giving me grief at
the moment :(

scala> HiveContext.sql("use test")
res9: org.apache.spark.sql.DataFrame = []
scala> val hbase_table= spark.table("hbase_table")
16/09/02 23:31:07 ERROR log: error in initSerDe:
java.lang.ClassNotFoundException Class
org.apache.hadoop.hive.hbase.HBaseSerDe not found
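
That ClassNotFoundException usually means the Hive-HBase handler jars are not on Spark's classpath. One way around it is to pass them to the shell with --jars; a minimal sketch (the paths and version numbers below are hypothetical, substitute the jars from your own Hive and HBase installations):

```shell
# Hypothetical paths/versions: point these at your actual
# Hive lib/ and HBase lib/ directories.
spark-shell \
  --jars /usr/lib/hive/lib/hive-hbase-handler-1.2.1.jar,\
/usr/lib/hbase/lib/hbase-client-1.2.3.jar,\
/usr/lib/hbase/lib/hbase-common-1.2.3.jar,\
/usr/lib/hbase/lib/hbase-protocol-1.2.3.jar
```

The same jars can instead go on spark.driver.extraClassPath and spark.executor.extraClassPath in spark-defaults.conf if you want them available without repeating the flag.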

HTH


Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 2 September 2016 at 23:08, KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
wrote:

> Hi Kim,
>
> I am also looking for same information. Just got the same requirement
> today.
>
> Thanks,
> Asmath
>
> On Fri, Sep 2, 2016 at 4:46 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>
>> I was wondering if anyone has tried to create Spark SQL tables on top of
>> HBase tables so that data in HBase can be accessed using Spark Thriftserver
>> with SQL statements? This is similar what can be done using Hive.
>>
>> Thanks,
>> Ben
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
>>
>
