Hi,
I have the following requirement from a Hive table, shown below.

CustNum | ActivityDates                 | Rates
100     | 10-Aug-13,12-Aug-13,20-Aug-13 | 10,15,20
The data above says that:
From 10-Aug to 11-Aug the rate is 10.
From 12-Aug to 19-Aug the rate is 15.
From 20-Aug until the present the rate is 20.
Note : The order is m
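The intended expansion of those date/rate pairs can be sketched in plain Python (a minimal illustration of the requirement, not Hive code; the date format and sample values are taken from the row above):

```python
from datetime import datetime, timedelta

def expand_rates(activity_dates, rates, fmt="%d-%b-%y"):
    """Pair each rate with the interval from its start date up to the
    day before the next start date (the last interval is open-ended)."""
    starts = [datetime.strptime(d, fmt).date() for d in activity_dates.split(",")]
    vals = [int(r) for r in rates.split(",")]
    out = []
    for i, (start, rate) in enumerate(zip(starts, vals)):
        end = starts[i + 1] - timedelta(days=1) if i + 1 < len(starts) else None
        out.append((start, end, rate))  # end=None means "till date"
    return out

# The sample row for CustNum 100:
intervals = expand_rates("10-Aug-13,12-Aug-13,20-Aug-13", "10,15,20")
```

In Hive itself this would typically be done by splitting the comma-separated columns and joining each start date to the next one, but the interval logic is the same.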
Thanks, Ed. And on a separate tack, let's look at HiveServer2.
@OP>
*I've tried to look around on how i can change the thrift heap size but
haven't found anything.*
Looking at my HiveServer2, I find this:
$ ps -ef | grep -i hiveserver2
dwr 9824 20479 0 12:11 pts/1 00:00:00 grep -i
Final table compression should not affect the deserialized size of the
data over the wire.
On Fri, Jan 31, 2014 at 2:49 PM, Stephen Sprague wrote:
Excellent progress David. So. What the most important thing here we
learned was that it works (!) by running hive in local mode and that this
error is a limitation in the HiveServer2. That's important.
So: a textfile storage handler, and having issues converting it to ORC. Hmmm.
Follow-ups:
1. w
OK, so here is some news:
I tried boosting HADOOP_HEAPSIZE to 8192,
and I also set mapred.child.java.opts to 512M,
and it doesn't seem to have any effect.
--
I tried it using an ODBC driver => it fails after a few minutes.
Using a local JDBC client (beeline) => it runs forever without any error
Hello,
I'm trying to get some statistics using Hive and insert them into a new table.
Those statistics are the counts of each field in another table, grouped by the
column "usrlvl".
I need to do all of that in a single request, so I googled how to do that
and found the following link:
https://cwiki.apac
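What the poster describes (a per-field count for each value of "usrlvl") can be sketched in plain Python; the field names and sample rows below are illustrative, not from the thread. This mirrors what COUNT(col) ... GROUP BY usrlvl computes in HiveQL, where COUNT over a column skips NULLs:

```python
from collections import defaultdict

def counts_by_usrlvl(rows, fields):
    """For each usrlvl value, count the non-NULL occurrences of each field."""
    stats = defaultdict(lambda: {f: 0 for f in fields})
    for row in rows:
        lvl = row["usrlvl"]
        for f in fields:
            if row.get(f) is not None:
                stats[lvl][f] += 1
    return dict(stats)

# Illustrative data (hypothetical columns):
rows = [
    {"usrlvl": "gold", "email": "a@x", "phone": None},
    {"usrlvl": "gold", "email": "b@x", "phone": "123"},
    {"usrlvl": "base", "email": None,  "phone": "456"},
]
stats = counts_by_usrlvl(rows, ["email", "phone"])
```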
I figured out a workaround. Instead of making /apps/hive/warehouse
permissions 000, make it 222 (only "write" permissions). This satisfies
Hive's requirement to be able to write something to that directory when
it's creating external tables. (It actually never writes anything there
during this proc
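As a quick illustration of what mode 222 means (each octal digit is an rwx bit triple, so 2 = write-only; HDFS permissions follow the same POSIX-style convention), here is a small bit-arithmetic sketch:

```python
def rwx(mode):
    """Render a 9-bit permission mode as the familiar rwxrwxrwx string."""
    bits = "rwxrwxrwx"
    return "".join(ch if mode & (1 << (8 - i)) else "-" for i, ch in enumerate(bits))

# 0o222 grants write but neither read nor execute to user, group, and other.
workaround_mode = rwx(0o222)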
Hi,
I'm running Hive 0.12 on YARN and I'm trying to convert a common join into
a map join. My map join fails,
and from the logs I can see that the memory limit is very low:
Starting to launch local task to process map join; maximum memory =
514523136
How can I increase the maximum memory?
I'
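For context, the logged byte count converts to just under 512 MiB, which suggests the local task is running with a roughly 512 MB JVM heap (an interpretation, not confirmed in the thread); a one-line check:

```python
# The logged limit, in bytes, converted to mebibytes.
max_mem_bytes = 514_523_136
max_mem_mib = max_mem_bytes / 2**20  # just under 512 MiB
```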