SocketTimeoutException when inserting into HBase via Hive

2012-07-23 Thread Cdy Chen
Hi all, when I use 447 files of 64 MB each as input for an insert into HBase, the job throws a SocketTimeoutException. With a smaller input it works fine. I guess this is related to the Hadoop configuration, but which settings should I change? Thank you! Best Regards, Chen
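
In case the large write is simply hitting client and socket timeouts, here is a minimal sketch of the timeout-related properties that are commonly raised for big HBase loads. The property names are standard Hadoop/HBase keys, but the values below are only assumptions to tune for your cluster; they can be set in the Hive session (SET key=value) so they land in the job configuration, or placed in hbase-site.xml / hdfs-site.xml for cluster-wide use:

  SET hbase.rpc.timeout=120000;                  -- HBase client RPC timeout in ms (default 60000)
  SET dfs.socket.timeout=120000;                 -- HDFS read socket timeout in ms
  SET dfs.datanode.socket.write.timeout=120000;  -- HDFS write socket timeout in ms
  SET mapred.task.timeout=1200000;               -- keep long-running tasks from being killed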

Re: How would you write a query for a use-case I have

2012-07-17 Thread Cdy Chen
Use the string functions: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-StringFunctions

2012/7/16 Guy Doulberg guy.doulb...@conduit.com: Hi guys, I have a table with three fields (uid, event_guid, timestamp). I need to find out for each event_guid the index
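
The original question is cut off above, so purely as an illustration of the linked page, here is a small sketch of the string functions that return a substring's position; the table name events is hypothetical:

  -- instr(str, substr) and locate(substr, str[, pos]) return a 1-based position, or 0 if not found
  SELECT event_guid,
         instr(event_guid, '-')      AS first_dash_pos,
         locate('-', event_guid, 5)  AS dash_pos_from_offset_5
  FROM events;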

Re: How to CONCAT 18 columns

2012-07-17 Thread Cdy Chen
the numeric fields to STRING to be able to use COALESCE. You can use CONCAT_WS to put delimiters in between each column. -Nicole
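
A minimal sketch of that approach (table and column names are hypothetical): cast the numeric columns to STRING, default NULLs to empty strings with COALESCE so no position is lost, and let CONCAT_WS insert the delimiter between them:

  SELECT CONCAT_WS('|',
           COALESCE(CAST(num_col1 AS STRING), ''),
           COALESCE(CAST(num_col2 AS STRING), ''),
           COALESCE(str_col3, ''))   -- already a STRING, no cast needed
  FROM my_table;                     -- extend the argument list to cover all 18 columns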