Try looking here:
http://stackoverflow.com/questions/14032924/how-to-add-serde-jar

Another tip: put your ADD JAR commands in your $HOME/.hiverc file and then 
start hive. 
(http://mail-archives.apache.org/mod_mbox/hive-user/201303.mbox/%3ccamgr+0h3smdw4zhtpyo5b1b4iob05bpw8ls+daeh595qzid...@mail.gmail.com%3E)
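For example, a minimal $HOME/.hiverc could look like the following (the jar path and filename are placeholders, not taken from this thread):

```sql
-- $HOME/.hiverc: the hive CLI runs these statements at startup, so the
-- jar is registered in every session without typing ADD JAR by hand.
-- Hypothetical path; point it at your own jar.
ADD JAR /usr/local/lib/hive-extras/my-serde.jar;
```

The same ADD JAR mechanism also makes a custom SerDe class visible to CREATE EXTERNAL TABLE statements run in that session.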



From: Ted Yu <yuzhih...@gmail.com>
Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Date: Wednesday, December 31, 2014 at 8:25 AM
To: "d...@hive.apache.org" <d...@hive.apache.org>
Subject: Fwd: way to add custom udf jar in hadoop 2.x version

Forwarding Niels' question to hive mailing list.

On Wed, Dec 31, 2014 at 1:24 AM, Niels Basjes <ni...@basjes.nl> wrote:

Thanks for the pointer.
This seems to work for functions. Is there something similar for CREATE 
EXTERNAL TABLE?

Niels

On Dec 31, 2014 8:13 AM, "Ted Yu" <yuzhih...@gmail.com> wrote:
Have you seen this thread ?
http://search-hadoop.com/m/8er9TcALc/Hive+udf+custom+jar&subj=Best+way+to+add+custom+UDF+jar+in+HiveServer2

On Dec 30, 2014, at 10:56 PM, reena upadhyay <reena2...@gmail.com> wrote:

Hi,

I am using hadoop 2.4.0. I have created a custom UDF jar, and I am trying to 
execute a simple SELECT query that uses the UDF from a Java Hive JDBC client 
program. When Hive runs the query as a map reduce job, the query fails 
because the mapper cannot locate the UDF class.
So I want to add the UDF jar to the hadoop environment permanently. Please 
suggest a way to add this external jar for both single node and multi node 
hadoop clusters.

PS: I am using Hive 0.13.1, and I have already added this custom UDF jar to 
the HIVE_HOME/lib directory.
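For context, HIVE_HOME/lib on the client is not shipped to the map tasks; the usual session-level workflow looks roughly like this (the jar path, function name, and class name below are hypothetical examples, not from this thread):

```sql
-- Hypothetical names: substitute your own jar path, function name,
-- and fully qualified UDF class.
-- ADD JAR places the jar in the distributed cache so the mappers can
-- load the class at query time.
ADD JAR /path/to/my-udf.jar;
CREATE TEMPORARY FUNCTION my_udf AS 'com.example.hive.MyUdf';
SELECT my_udf(col1) FROM my_table LIMIT 10;
```

For a permanent, cluster-wide setup, a common approach is to point hive.aux.jars.path (in hive-site.xml) or the HIVE_AUX_JARS_PATH environment variable at a directory containing the jar on every node.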


Thanks

