Hi,
I want to use a custom Writable data type written in Hadoop as the column
data type for a table in Hive. Any idea how to register a custom Hadoop data
type in Hive, so that it can be used as a column data type?
Thanks,
Reena Upadhyay
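For context, a Hadoop custom data type is a class that serializes itself through the Writable contract (a `write(DataOutput)` / `readFields(DataInput)` pair); Hive cannot use such a class as a column type directly, and typically needs a custom SerDe or ObjectInspector on top of it. A minimal sketch of the Writable side, using a hypothetical `PointWritable` type (the interface is reproduced inline so the sketch is self-contained; a real project would implement `org.apache.hadoop.io.Writable` instead):

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

// Stand-in for org.apache.hadoop.io.Writable, so this sketch compiles alone.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// Hypothetical custom data type: a 2D point.
class PointWritable implements Writable {
    private int x;
    private int y;

    public PointWritable() {}  // Writables need a no-arg constructor for deserialization
    public PointWritable(int x, int y) { this.x = x; this.y = y; }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(x);
        out.writeInt(y);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        x = in.readInt();
        y = in.readInt();
    }

    public int getX() { return x; }
    public int getY() { return y; }
}
```

To expose such a type to Hive as a column, the usual route is a SerDe whose ObjectInspector maps the Writable's fields onto Hive types (e.g. a struct of two ints), rather than registering the Writable itself.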
Hello
My simple MapReduce program takes text as input and writes the same text
to different directories depending on the values in each record, using the
MultipleOutputs API, where I specify the baseOutputPath.
It works fine for small data sets, but when the input reaches 2 GB it gets
stuck either at
MAP 10
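The routing described above can be sketched without Hadoop on the classpath: the mapper derives a base output path from each record and passes it as the `baseOutputPath` argument of `MultipleOutputs.write(key, value, basePath)`. The field layout below (tab-separated records whose first field picks the directory) is a hypothetical stand-in for whatever the real records contain:

```java
// Sketch of per-record routing for MultipleOutputs. In the real mapper the
// returned string would be passed as the baseOutputPath argument of
// MultipleOutputs.write(key, value, basePath).
class OutputRouter {
    // Derive a base output path like "typeA/part" from the record's first field.
    static String basePathFor(String record) {
        String[] fields = record.split("\t");
        String category = fields.length > 0 ? fields[0] : "unknown";
        return category + "/part";  // Hadoop appends the task suffix (e.g. -m-00000)
    }
}
```

One thing worth checking with larger inputs: each distinct base path opens its own record writer, so high-cardinality routing fields can create very many open files and small outputs, which tends to hurt exactly when the data grows.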
Hi,
We just upgraded from Hive 0.11.0 to 0.13.0 (finally!!).
I noticed that Guava is now packaged inside the hive-exec jar, whereas
previously it wasn't.
Any reason it is packaged now?
Secondly, is there anything stopping us from bumping the Guava version from
11.0 to the latest one?
Given that
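When a copy of Guava bundled inside hive-exec coexists with a standalone guava jar on the classpath, it helps to check which jar a given class was actually loaded from. The standard JDK way is `getProtectionDomain().getCodeSource()`; a small sketch:

```java
import java.security.CodeSource;

// Report which jar (or directory) a class was loaded from -- useful when a
// copy of Guava inside hive-exec may be shadowing a standalone guava jar.
class ClassOrigin {
    static String originOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Bootstrap/platform classes (e.g. java.lang.String) have no CodeSource.
        return src == null ? "bootstrap/platform" : src.getLocation().toString();
    }
}
```

Calling, say, `ClassOrigin.originOf(com.google.common.base.Preconditions.class)` from code running inside Hive would show whether the Guava classes come from hive-exec or from a separate guava jar.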
This is a bug. Would you mind opening a Hive JIRA for this? If you
can provide a patch as well, I will be happy to review it.
On Fri, Sep 26, 2014 at 3:17 AM, Charles Bonneau
wrote:
>
> Hello,
>
> We are using Hive 0.12.
> While using WebHCat, I stumbled upon a problem regarding the name of
> tables.
Hello,
We are using Hive 0.12.
While using WebHCat, I stumbled upon a problem regarding the name of
tables.
We generate table names according to Hive’s documentation:
“ In Hive 0.12 and earlier, only alphanumeric and underscore
characters are allowed in table and column names.”
However, when t
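The documented rule quoted above (only alphanumeric and underscore characters in Hive 0.12 and earlier) can be enforced before the generated names ever reach WebHCat. A minimal sketch with a hypothetical validator:

```java
import java.util.regex.Pattern;

// Validate generated table names against the Hive 0.12 rule:
// only alphanumeric and underscore characters are allowed.
class TableNameCheck {
    private static final Pattern VALID = Pattern.compile("[A-Za-z0-9_]+");

    static boolean isValid(String name) {
        return name != null && VALID.matcher(name).matches();
    }
}
```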
Hi all:
HDFS HA supports multiple NameNodes, but in the Hive metastore the location
is a table property, with a URL like hdfs://nn1:9900/wh/table1.
If nn1 shuts down, Hive cannot connect to the other NameNode. What can I
do?
Thanks, I need your help.
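With HDFS HA, locations should use the logical nameservice (e.g. hdfs://mycluster/wh/table1) rather than a physical NameNode address; the nameservice is resolved client-side from hdfs-site.xml. A sketch of the relevant client settings, with hypothetical names (mycluster, nn1-host, nn2-host):

```xml
<!-- Client-side HA settings (hypothetical names). With these in hdfs-site.xml,
     a location such as hdfs://mycluster/wh/table1 fails over between the two
     NameNodes automatically. -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>nn1-host:9900</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>nn2-host:9900</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

Existing metastore entries that hardcode hdfs://nn1:9900 still need rewriting; Hive ships a `metatool` with an `-updateLocation` option intended for exactly that kind of bulk location change.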