Hi all,
When I use 447 files of 64 MB each as input for an insert into HBase,
it throws a SocketTimeoutException.
But if I use a smaller input, it works fine.
I guess this is related to the Hadoop configuration, but what should I configure?
Thank you!
Best Regards,
Chen
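[Editor's note: the thread does not include a resolution. Timeouts on large bulk inserts are often addressed by raising client/RPC timeouts; the property names below are real HBase/HDFS settings of that era, but the values are purely illustrative and the right fix depends on the cluster, so treat this hbase-site.xml fragment as a hedged starting point, not an answer.]

```xml
<!-- hbase-site.xml: raise timeouts for long-running bulk writes.
     Values below are illustrative placeholders, not recommendations. -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>120000</value> <!-- client RPC timeout in ms (default was 60000) -->
</property>
<property>
  <name>dfs.socket.timeout</name>
  <value>120000</value> <!-- HDFS datanode socket read timeout in ms -->
</property>
```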
Use String functions
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-StringFunctions
2012/7/16 Guy Doulberg guy.doulb...@conduit.com
Hi guys,
I have a table with three fields (uid, event_guid, timestamp). I need to
find out, for each event_guid, the index
Cast the numeric fields to STRING to be able to use COALESCE. You can use CONCAT_WS to put
delimiters in between each column.
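[Editor's note: a minimal HiveQL sketch of the advice above. The table and column names (events, uid, event_guid, ts) are hypothetical; the functions CAST, COALESCE, and CONCAT_WS are documented Hive built-ins.]

```sql
-- Cast numeric columns to STRING so COALESCE can supply a default,
-- then join everything with a '|' delimiter via CONCAT_WS.
SELECT CONCAT_WS('|',
         COALESCE(CAST(uid AS STRING), ''),
         COALESCE(event_guid, ''),
         COALESCE(CAST(ts AS STRING), ''))
FROM events;
```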
-Nicole
From: Cdy Chen <dongyong.c...@gmail.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Tuesday, July 17, 2012 6:26 AM
To: user@hive.apache.org user