Drop an HBase backed table
Hello,

How can I drop a Hive table which was created using "CREATE EXTERNAL TABLE..."? I tried "DROP TABLE ;" but the shell hangs. The underlying HBase table should not be deleted. I am using Hive 0.9.

Thank you,
/David
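For context, this is roughly the setup being described. A minimal sketch, with invented table and column names, of an HBase-backed external table in Hive 0.9; since the table is declared EXTERNAL, DROP TABLE should remove only Hive's metadata and leave the underlying HBase table in place:

```sql
-- Hypothetical example: an HBase-backed external table (names invented).
CREATE EXTERNAL TABLE hits_hive (key binary, val string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,t:val")
TBLPROPERTIES ("hbase.table.name" = "hits");

-- Drops only the Hive metadata; the HBase table "hits" is not deleted.
DROP TABLE hits_hive;
```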
Re: Mapping existing HBase table with many columns to Hive.
Hello,

I tried the shell command which Swarnim kindly provided and it allows me to map an existing HBase table into Hive. However, since my qualifiers are longs but a map only accepts string as a key, the result is garbled. Even with the suggested patch which allows binary keys, the resulting datatype in Hive would not be long but binary, making it hard to query from the shell. It seems there is no API for this at the moment, right?

Currently, is there any way to map HBase byte[] to Hive datatypes? The assumption is that all byte[] were generated using Hadoop's Bytes.toBytes() method and that all row keys, qualifiers and values each share a single data type (for example: row keys are ints, qualifiers are longs and values are strings).

Thank you,
/David
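Outside of Hive, such byte[] values can at least be decoded client-side. A minimal sketch in Python, assuming (as stated above) that everything was written with Hadoop's Bytes.toBytes(), which uses big-endian encoding for primitives and UTF-8 for strings; the helper names are made up:

```python
import struct

def decode_long(b):
    # 8 big-endian bytes, as written by Bytes.toBytes(long)
    return struct.unpack(">q", b)[0]

def decode_int(b):
    # 4 big-endian bytes, as written by Bytes.toBytes(int)
    return struct.unpack(">i", b)[0]

def decode_string(b):
    # Bytes.toBytes(String) writes UTF-8
    return b.decode("utf-8")

# One of the qualifiers from the scan below: \x00\x00\x01;2\xE6Q\x06
qualifier = b"\x00\x00\x01;2\xe6Q\x06"
print(decode_long(qualifier))  # prints 1353768653062, a millisecond epoch timestamp
```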
Re: Mapping existing HBase table with many columns to Hive.
Hello Swarnim,

Thank you for your answer. I will try the options you pointed out.

/David
Mapping existing HBase table with many columns to Hive.
Hello,

How can I map an HBase table with the following layout to Hive using the "CREATE EXTERNAL TABLE" command from the shell (or another programmatic way)?

The HBase table's layout is as follows:

Row key: 16 bytes, a UUID that had the "-" removed and the 32 hex chars converted into two 8-byte longs.
Columns (qualifiers): timestamps, i.e. the bytes of a long converted using Hadoop's Bytes.toBytes(long). There can be many of those in a single row.
Values: the bytes of a Java string.

I am unsure of which datatypes to use. I am pretty sure there is no way I can sensibly map the row key to anything other than "binary", but maybe the columns, which are longs, and the values, which are strings, can be mapped to their corresponding Hive datatypes.

I include an extract of what a row looks like in HBase shell below.

Thank you,
/David

hbase(main):009:0> scan "hits"
ROW                                                   COLUMN+CELL
 \x00\x00\x06\xB1H\x89N\xC3\xA5\x83\x0F\xDD\x1E\xAE&\xDC  column=t:\x00\x00\x01;2\xE6Q\x06, timestamp=1267737987733, value=blahaha
 \x00\x00\x06\xB1H\x89N\xC3\xA5\x83\x0F\xDD\x1E\xAE&\xDC  column=t:\x00\x00\x01;2\xE6\xFB@, timestamp=1354012104967, value=testtest
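A hedged sketch of what such a mapping could look like, with invented column names. On stock Hive 0.9 the map key must be declared string, which is exactly the limitation discussed in the replies; the "#b" (binary) suffix on the value side of the column family is the HBase handler's binary-storage mapping, which may behave differently depending on version and applied patches:

```sql
-- Hypothetical sketch (column names invented). ":key" maps the HBase
-- row key as raw bytes; "t:" maps the whole t: family into a Hive map,
-- with "#b" requesting binary decoding of the cell values.
CREATE EXTERNAL TABLE hits (key binary, t map<string,string>)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,t:#b")
TBLPROPERTIES ("hbase.table.name" = "hits");
```

With the binary-key patch mentioned above, the declaration one would actually want is presumably map<bigint,string>, so that the long qualifiers become queryable keys; without it, the qualifier bytes arrive as (garbled) strings.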