Re: UDAF terminatePartial structure

2013-07-30 Thread Robin Morris
Hi Robin, Igor, thanks for the suggestions and links. Based on the examples I found, below is my UDAF. However, I am getting the following error when trying to run it. Not sure what the error means = ERROR

UDAF terminatePartial structure

2013-07-29 Thread Ritesh Agrawal
Hi all, I am writing my first UDAF. In my terminatePartial() function, I need to store several pieces of data with different data types. Below is a list of the items that I need to store: 1. C1: list of doubles, 2. C2: list of doubles, 3. C3: double, 4. Show: list of strings. I am wondering can I use
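One way to model a mixed-type partial result like this, sketched below with hypothetical class and field names taken from the list above, is a plain Java class whose public fields mirror the four items; Hive's reflection-based object inspectors can describe such a class when it is returned from terminatePartial().

import java.util.ArrayList;
import java.util.List;

// Hypothetical holder for the partial aggregation state described above.
public class PartialState {
    public List<Double> c1 = new ArrayList<Double>();    // C1: list of doubles
    public List<Double> c2 = new ArrayList<Double>();    // C2: list of doubles
    public double c3;                                     // C3: double
    public List<String> show = new ArrayList<String>();  // Show: list of strings
}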

Re: UDAF terminatePartial structure

2013-07-29 Thread Robin Morris
I believe a map will be passed correctly from terminatePartial() to the merge() function, but it seems a bit of overkill. Why not define a class within your UDAF which has 4 public data members, and return instances of that class from terminatePartial()? Robin On 7/29/13 3:19 PM, Ritesh
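A minimal sketch of that suggestion using the old-style UDAF/UDAFEvaluator API (class, field, and column names here are hypothetical): a State class defined inside the UDAF carries the four members, terminatePartial() returns it, and merge() folds one in.

package example.hive;  // hypothetical package

import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hive.ql.exec.UDAF;
import org.apache.hadoop.hive.ql.exec.UDAFEvaluator;

// Hypothetical UDAF illustrating the suggestion: a class defined inside the
// UDAF with four public members is used as the partial-aggregation result.
public final class ExampleUDAF extends UDAF {

  public static class Evaluator implements UDAFEvaluator {

    // The partial result shipped from terminatePartial() to merge().
    // Hive serializes the public fields via its reflection object inspectors.
    public static class State {
      public List<Double> c1 = new ArrayList<Double>();
      public List<Double> c2 = new ArrayList<Double>();
      public double c3;
      public List<String> show = new ArrayList<String>();
    }

    private State state;

    public Evaluator() {
      init();
    }

    // Reset the aggregation state.
    public void init() {
      state = new State();
    }

    // Called once per input row; the argument list matches the SQL call
    // (hypothetical columns here: a double and a string).
    public boolean iterate(Double value, String label) {
      if (value != null) {
        state.c1.add(value);
        state.c3 += value;
      }
      if (label != null) {
        state.show.add(label);
      }
      return true;
    }

    // Hand the whole state object to Hive as the partial aggregation.
    public State terminatePartial() {
      return state;
    }

    // Fold in a partial state produced by another task.
    public boolean merge(State other) {
      if (other != null) {
        state.c1.addAll(other.c1);
        state.c2.addAll(other.c2);
        state.c3 += other.c3;
        state.show.addAll(other.show);
      }
      return true;
    }

    // Final result (here, just the running double, for illustration).
    public double terminate() {
      return state.c3;
    }
  }
}

With hypothetical names like these, the function would be registered with something like CREATE TEMPORARY FUNCTION example_udaf AS 'example.hive.ExampleUDAF' after adding the jar to the session.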

Re: UDAF terminatePartial structure

2013-07-29 Thread Ritesh Agrawal
Hi Robin, thanks for the suggestion. I did find such an example in the Hadoop: The Definitive Guide book. However, I am now totally confused. The book extends UDAF instead of AbstractGenericUDAFResolver. Which one is recommended? Also, the example in the book uses DoubleWritable as a return type
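For comparison, below is a minimal sketch of the newer style (a trivial count, with hypothetical names): extend AbstractGenericUDAFResolver and return a GenericUDAFEvaluator, which manages ObjectInspectors and an AggregationBuffer explicitly. The old UDAF interface still works, but Hive's own built-in aggregates use the generic API, which is more verbose and is generally the route suggested for new code.

package example.hive;  // hypothetical package

import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.parse.SemanticException;
import org.apache.hadoop.hive.ql.udf.generic.AbstractGenericUDAFResolver;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.LongObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.io.LongWritable;

// Hypothetical generic-style UDAF (a trivial count) showing where
// terminatePartial()/merge() sit in the AbstractGenericUDAFResolver API.
public class ExampleGenericCount extends AbstractGenericUDAFResolver {

  @Override
  public GenericUDAFEvaluator getEvaluator(TypeInfo[] parameters)
      throws SemanticException {
    if (parameters.length != 1) {
      throw new UDFArgumentTypeException(parameters.length - 1,
          "Exactly one argument is expected.");
    }
    return new CountEvaluator();
  }

  public static class CountEvaluator extends GenericUDAFEvaluator {

    // Object inspector for the partial counts received in merge().
    private LongObjectInspector partialOI;

    static class CountBuffer implements AggregationBuffer {
      long count;
    }

    @Override
    public ObjectInspector init(Mode m, ObjectInspector[] parameters)
        throws HiveException {
      super.init(m, parameters);
      if (m == Mode.PARTIAL2 || m == Mode.FINAL) {
        partialOI = (LongObjectInspector) parameters[0];
      }
      // Both the partial and the final result are a single bigint here.
      return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
    }

    @Override
    public AggregationBuffer getNewAggregationBuffer() {
      return new CountBuffer();
    }

    @Override
    public void reset(AggregationBuffer agg) {
      ((CountBuffer) agg).count = 0;
    }

    @Override
    public void iterate(AggregationBuffer agg, Object[] parameters) {
      if (parameters[0] != null) {
        ((CountBuffer) agg).count++;
      }
    }

    @Override
    public Object terminatePartial(AggregationBuffer agg) {
      return new LongWritable(((CountBuffer) agg).count);
    }

    @Override
    public void merge(AggregationBuffer agg, Object partial) {
      if (partial != null) {
        ((CountBuffer) agg).count += partialOI.get(partial);
      }
    }

    @Override
    public Object terminate(AggregationBuffer agg) {
      return new LongWritable(((CountBuffer) agg).count);
    }
  }
}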

Re: UDAF terminatePartial structure

2013-07-29 Thread Igor Tatarinov
I found this Cloudera example helpful: http://grepcode.com/file/repository.cloudera.com/content/repositories/releases/org.apache.hadoop.hive/hive-contrib/0.7.0-cdh3u0/org/apache/hadoop/hive/contrib/udaf/example/UDAFExampleMaxMinNUtil.java#UDAFExampleMaxMinNUtil.Evaluator igor decide.com On

Re: UDAF terminatePartial structure

2013-07-29 Thread Ritesh Agrawal
Hi Robin, Igor, thanks for the suggestions and links. Based on the examples I found, below is my UDAF. However, I am getting the following error when trying to run it. Not sure what the error means = ERROR FAILED: Hive Internal Error: