[ https://issues.apache.org/jira/browse/HIVE-13423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15589113#comment-15589113 ]

Aihua Xu commented on HIVE-13423:
---------------------------------

There doesn't seem to be a standard for that.

I feel we can just return null if it's out of range as well. That seems clean 
and also matches what we have for other data types in Hive.

The attached patch returns null if the sum is out of range. Can we commit 
this change?
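
For illustration only, here is a minimal sketch of the null-on-overflow idea using plain java.math.BigDecimal. The class and method names (DecimalSumSketch, sumOrNull) are hypothetical and this is not the actual GenericUDAFSum code; it also assumes the running sum's scale already matches the declared decimal type.

{noformat}
import java.math.BigDecimal;

public class DecimalSumSketch {
    // Hive's maximum decimal precision: 38 total digits.
    private static final int MAX_PRECISION = 38;

    /**
     * Adds 'value' to the running sum and returns the new sum,
     * or null once the result no longer fits in 38 digits --
     * mirroring the behavior proposed in the attached patch.
     */
    static BigDecimal sumOrNull(BigDecimal runningSum, BigDecimal value) {
        if (runningSum == null) {
            return null;              // overflow already happened; stay null
        }
        BigDecimal next = runningSum.add(value);
        // precision() is the number of digits in the unscaled value;
        // once it exceeds 38, the sum cannot be represented as a Hive decimal.
        return next.precision() > MAX_PRECISION ? null : next;
    }
}
{noformat}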

> Handle the overflow case for decimal datatype for sum()
> -------------------------------------------------------
>
>                 Key: HIVE-13423
>                 URL: https://issues.apache.org/jira/browse/HIVE-13423
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Processor
>    Affects Versions: 2.0.0
>            Reporter: Aihua Xu
>            Assignee: Aihua Xu
>         Attachments: HIVE-13423.1.patch
>
>
> When a column col1 is defined as decimal and the sum of the column overflows, 
> we try to increase the decimal precision by 10. But if the precision has already 
> reached 38 (the max), the overflow can still happen. Right now, when that happens, 
> the following exception is thrown because Hive writes incorrect data.
> {noformat}
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:219) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> {noformat}
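
As a note on the widening rule described above, here is a small sketch of how the result precision for sum() over a decimal(p, s) column is bumped by 10 and capped at 38, after which overflow is still possible. The class and method names are hypothetical; this is not the actual GenericUDAFSum code.

{noformat}
public class SumTypeSketch {
    private static final int MAX_PRECISION = 38;

    /**
     * Result precision for sum() over decimal(p, s): allow 10 extra
     * digits for the accumulation, but never exceed the 38-digit max.
     * Once the cap is hit, a sufficiently large sum can still overflow,
     * which is the case this issue is about.
     */
    static int sumResultPrecision(int inputPrecision) {
        return Math.min(MAX_PRECISION, inputPrecision + 10);
    }

    public static void main(String[] args) {
        System.out.println(sumResultPrecision(18)); // 28
        System.out.println(sumResultPrecision(35)); // 38 (capped at the max)
    }
}
{noformat}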


