I've filed this as https://issues.apache.org/jira/browse/HIVE-5025.
2013/7/22 Felix.徐 ygnhz...@gmail.com:
Hi all,
Is there any API to retrieve a parameter's column name in a GenericUDF?
For example:
SELECT UDFTEST(columnA, columnB) FROM test;
I want to get the column names (columnA and columnB) inside the UDF.
Hello,
I have started to run Hive with LZO compression on Hortonworks 1.2.
I have managed to install/configure LZO, and running
hive -e "set io.compression.codecs"
shows me the codecs:
io.compression.codecs=
org.apache.hadoop.io.compress.GzipCodec,
org.apache.hadoop.io.compress.DefaultCodec,
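For the LZO codecs to show up in that list, the Hadoop-LZO classes usually have to be registered in core-site.xml. A minimal sketch, assuming the hadoop-lzo jar and native libraries are installed on the cluster (the com.hadoop.compression.lzo class names come from that library, not from the output above):

```xml
<!-- Sketch: register the LZO codecs alongside the default codecs.
     Assumes the hadoop-lzo jar and its native libraries are installed. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```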
Hi,
I encountered a problem with a Right Outer Join in Hive.
Here is where the problem is:
FROM default.ca ca
JOIN default.kpi_magasin mtransf
  ON mtransf.co_societe = (CASE WHEN ca.co_societe = 1 THEN 1 ELSE 2 END)
 AND mtransf.id_magasin = ca.id_magasin
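One workaround that often helps with conditional join keys like this is to precompute the CASE expression in a derived table and join on the resulting column. A sketch (the alias co_societe_cible and the SELECT list are made up for illustration):

```sql
-- Sketch of a possible rewrite: compute the CASE once in a subquery,
-- then join on the derived column. co_societe_cible is an illustrative alias.
SELECT mtransf.*
FROM (
  SELECT ca.*,
         CASE WHEN ca.co_societe = 1 THEN 1 ELSE 2 END AS co_societe_cible
  FROM default.ca ca
) ca2
JOIN default.kpi_magasin mtransf
  ON  mtransf.co_societe = ca2.co_societe_cible
 AND mtransf.id_magasin = ca2.id_magasin;
```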
Hi,
I am using Hive 0.11.0 over Hadoop 1.0.4.
Recently I have started investigating the use of Templeton, and I have managed
to get most of the services working. Specifically, I can access resources like these:
http://hpcluster1:50111/templeton/v1/version
You should use CREATE TABLE ... STORED AS SEQUENCEFILE. You probably do not
want LZO files; you probably want sequence files with LZO block compression.
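A sketch of what that could look like, assuming the hadoop-lzo codec is available on the cluster; the table name, columns, and source table are illustrative (on Hive 0.x the mapred.output.compression.* property names apply):

```sql
-- Sketch, assuming the hadoop-lzo codec is installed on the cluster;
-- table and column names are illustrative.
SET hive.exec.compress.output=true;
SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzoCodec;
SET mapred.output.compression.type=BLOCK;

CREATE TABLE logs_seq (id INT, msg STRING)
STORED AS SEQUENCEFILE;

-- Rows written to this table are stored as block-compressed sequence files:
-- INSERT OVERWRITE TABLE logs_seq SELECT id, msg FROM logs_text;
```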
On Thu, Aug 8, 2013 at 5:02 AM, w00t w00t w00...@yahoo.de wrote:
Hello,
I have started to run Hive with LZO compression on Hortonworks 1.2
Please refer to the documentation here.
Let me know if you need more clarification so that we can make this document
better and more complete.
Thanks
Sanjay
From: w00t w00t w00...@yahoo.de
Reply-To: user@hive.apache.org
Hi all:
I'm currently testing Hive 0.11 and encountered a bug with hive.auto.convert.join.
I constructed a test case so everyone can reproduce it (you can also find the
test case here: https://gist.github.com/code6/6187569#file-hive11_auto_convert_join_bug):
use test;
create table src ( `key` int,`val`
Hi guys
Perhaps you know this already, but it is very useful: this directly creates a file
on the name_node_host_2 HDFS cluster from the output of a query.
Regards
Sanjay
hive -h hive_server_host1 -e "hive_query_string" | hdfs dfs -put - hdfs://name_node_host_2:port/path/to/ur/dir/your_file_name