As I mentioned, you can copy the jar to /usr/lib/hive/lib on your Hadoop
cluster and then use it directly in HiveQL.
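For example, since TRANSFORM runs an external command, a class packaged in a jar can be invoked with `java -cp`. A rough sketch (the jar path, table, and column names are hypothetical; only com.hive.test.TestMapper comes from this thread):

```sql
-- Ship the jar to the task nodes for this session (path is hypothetical):
ADD FILE /tmp/test-mapper.jar;

-- TRANSFORM streams each row to the command's stdin (tab-separated)
-- and reads the transformed rows back from its stdout:
SELECT TRANSFORM (key, value)
USING 'java -cp test-mapper.jar com.hive.test.TestMapper'
AS (new_key, new_value)
FROM src;
```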

Thank You,
Manish.
Sent from my BlackBerry, pls excuse typo

-----Original Message-----
From: Manu A <hadoophi...@gmail.com>
Date: Wed, 26 Sep 2012 15:01:14 
To: <user@hive.apache.org>
Reply-To: user@hive.apache.org
Subject: Re: Custom MR scripts using java in Hive

Hi Manish,
Thanks, I did the same. But how do I invoke the custom Java map/reduce
functions (com.hive.test.TestMapper), since there is no script, as it is a
jar file? The process looks a bit different from a UDF (which used CREATE
TEMPORARY FUNCTION).
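One pattern that can make this work (a sketch under assumptions, not taken from this thread): when a class is used from TRANSFORM rather than registered as a UDF, Hive runs it as an ordinary external process, so the class needs a main() that reads tab-separated rows from stdin and writes rows to stdout. The class below is a hypothetical illustration (package declaration omitted):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// A minimal "mapper" usable from Hive's TRANSFORM clause. Hive streams
// rows to this process on stdin, one row per line with tab-separated
// columns, and reads transformed rows back from stdout.
public class TestMapper {

    // Transform one input row; as an example, emit the first column
    // and the upper-cased second column.
    static String transformRow(String line) {
        String[] cols = line.split("\t", -1);
        if (cols.length < 2) {
            return line; // pass malformed rows through unchanged
        }
        return cols[0] + "\t" + cols[1].toUpperCase();
    }

    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(transformRow(line));
        }
    }
}
```

It would then be invoked from HiveQL roughly as `USING 'java -cp test-mapper.jar com.hive.test.TestMapper'` inside a TRANSFORM clause, after the jar has been shipped with ADD FILE.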


On Wed, Sep 26, 2012 at 12:25 PM, Manish.Bhoge <manish.bh...@target.com> wrote:

> Sorry for the late reply.
>
> For anything you want to run as MAP and REDUCE, you have to extend the
> MapReduce classes for your functionality, irrespective of language (Java,
> Python, or any other). Once you have extended the class, move the jar to
> the Hadoop cluster.
>
> Bertrand has also mentioned reflection. That is something new for me;
> you can give reflection a try.
>
> Thank You,
>
> Manish
>
> *From:* Tamil A [mailto:4tamil...@gmail.com]
> *Sent:* Tuesday, September 25, 2012 6:48 PM
> *To:* user@hive.apache.org
> *Subject:* Re: Custom MR scripts using java in Hive
>
> Hi Manish,
>
> Thanks for your help. I did the same using a UDF, and am now trying the
> TRANSFORM, MAP, and REDUCE clauses. So does it mean that with Java we
> have to go through a UDF, and for other languages we use MapReduce
> scripts, i.e., the TRANSFORM, MAP, and REDUCE clauses?
>
> Please correct me if I am wrong.
>
> Thanks & Regards,
>
> Manu
>
> On Tue, Sep 25, 2012 at 5:19 PM, Manish.Bhoge <manish.bh...@target.com>
> wrote:
>
> Manu,
>
> If you have written a UDF in Java for Hive, then you need to copy your JAR
> to the /usr/lib/hive/lib/ folder on your Hadoop cluster for Hive to use it.
>
> Thank You,
>
> Manish
>
> *From:* Manu A [mailto:hadoophi...@gmail.com]
> *Sent:* Tuesday, September 25, 2012 3:44 PM
> *To:* user@hive.apache.org
> *Subject:* Custom MR scripts using java in Hive
>
> Hi All,
>
> I am learning Hive. Please let me know if anyone has tried custom
> MapReduce scripts using Java in Hive, or refer me to some links and
> blogs with an example.
>
> When I tried, I got the below error:
>
> Hadoop job information for Stage-1: number of mappers: 1; number of
> reducers: 0
> 2012-09-25 02:47:23,720 Stage-1 map = 0%,  reduce = 0%
> 2012-09-25 02:47:56,943 Stage-1 map = 100%,  reduce = 100%
> Ended Job = job_201209222231_0001 with errors
> Error during job, obtaining debugging information...
> Examining task ID: task_201209222231_0001_m_000002 (and more) from job
> job_201209222231_0001
> Exception in thread "Thread-51" java.lang.RuntimeException: Error while
> reading from task log url
>         at
> org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getErrors(TaskLogProcessor.java:130)
>         at
> org.apache.hadoop.hive.ql.exec.JobDebugger.showJobFailDebugInfo(JobDebugger.java:211)
>         at
> org.apache.hadoop.hive.ql.exec.JobDebugger.run(JobDebugger.java:81)
>         at java.lang.Thread.run(Thread.java:619)
> Caused by: java.io.IOException: Server returned HTTP response code: 400
> for URL: // removed as confidential
>
>         at
> sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1313)
>         at java.net.URL.openStream(URL.java:1010)
>         at
> org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getErrors(TaskLogProcessor.java:120)
>         ... 3 more
> FAILED: Execution Error, return code 2 from
> org.apache.hadoop.hive.ql.exec.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 0 msec
>
> Thanks for your help in advance :)
>
> Thanks & Regards,
>
> Manu
>
> --
> *Thanks & Regards,*
>
> *Tamil*
