Thanks. Did you mean I should handle null in my udf or my serde?
I did try to check for null inside my UDF, but it fails even before
the UDF gets called.
This is from when the udf fails:
....
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to
execute method public org.apache.hadoop.io.Text
com.company.hive.myfun.evaluate(java.lang.Object,java.lang.Object)
on object com.company.hive.myfun@1412332 of class
com.company.hive.myfun with arguments {0:java.lang.Object, null} of size 2
It looks like there is a null argument after all, or is the error message misleading?
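For what it's worth, the `{0:java.lang.Object, null}` part of the trace suggests the first argument's runtime type, rather than the null itself, may be what trips the call: Hive invokes a plain UDF's evaluate reflectively, and a type mismatch throws before the method body runs. A minimal stand-alone illustration of that mechanism (plain Java, no Hive classes; `ReflectDemo` is hypothetical):

```java
import java.lang.reflect.Method;

public class ReflectDemo {
    public static String evaluate(String argA, String argB) {
        // No null guard needed to receive the call; a null argument
        // simply arrives as null ("null" in the concatenation below).
        return argA + "/" + argB;
    }

    public static void main(String[] args) throws Exception {
        Method m = ReflectDemo.class.getMethod("evaluate", String.class, String.class);

        // A null argument alone passes straight through to evaluate:
        System.out.println(m.invoke(null, "val", null));   // prints "val/null"

        // But a runtime type mismatch (a bare Object where a String is
        // declared) throws before evaluate is entered, resembling the
        // "Unable to execute method ... {0:java.lang.Object, null}" trace:
        try {
            m.invoke(null, new Object(), null);
        } catch (IllegalArgumentException e) {
            System.out.println("failed before evaluate: " + e.getMessage());
        }
    }
}
```

If that is the situation here, the serde's returned object types would be the place to look, rather than the UDF's null handling.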
On 04/12/2012 15:43, Edward Capriolo wrote:
There is no null argument. You should handle the null case in your code.
if (argA == null)
Or optionally you could use a GenericUDF, but a regular UDF should
handle what you are doing.
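Spelled out, the check Edward describes might look like this (a sketch using String in place of org.apache.hadoop.io.Text so it compiles without Hive on the classpath; the class name and the fallback values are illustrative, not the real myfun logic):

```java
// Null-tolerant evaluate sketch. In the real UDF the parameters would
// be org.apache.hadoop.io.Text; String is used here only so the
// example runs without the Hive/Hadoop jars.
public class MyFunSketch {
    public static String evaluate(String argA, String argB) {
        if (argA == null) {
            return null;        // illustrative choice: propagate the null
        }
        if (argB == null) {
            return argA;        // illustrative fallback when col_b is null
        }
        return argA + argB;     // placeholder for the real myfun logic
    }
}
```

What to return for a null input (null, a default, or the other argument) is a per-UDF decision; the point is only that evaluate must expect nulls for any nullable column.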
On Tuesday, December 4, 2012, Søren <s...@syntonetic.com
<mailto:s...@syntonetic.com>> wrote:
> Hi Hive community
>
> I have a custom udf, say myfun, written in Java which I utilize like
this
>
> select myfun(col_a, col_b) from mytable where ....etc
>
> col_b is a string type and sometimes it is null.
>
> When that happens, my query crashes with
> ---------------
> java.lang.RuntimeException:
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error
while processing row
> {"col_a":"val","col_b":null}
> ...
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable
to execute method public org.apache.hadoop.io.Text
> ---------------
>
> public final class myfun extends UDF {
> public Text evaluate(final Text argA, final Text argB) {
>
> I'm unsure how to fix this properly. Is the framework looking for an
overload of evaluate that accepts the null argument?
>
> I should mention that the table is declared using my own JSON serde
reading from S3. I'm not handling nulls in my serde in any special way,
because Hive seems to handle null correctly when it is not passed to my
own UDF.
>
> Is there anyone out there with ideas or experience on this issue?
>
> thanks in advance
> Søren
>
>