[ https://issues.apache.org/jira/browse/HIVE-13423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15589243#comment-15589243 ]

Xuefu Zhang commented on HIVE-13423:
------------------------------------

This sounds good to me. I'd +1 on this unless [~ctang.ma] has any objections.

> Handle the overflow case for decimal datatype for sum()
> -------------------------------------------------------
>
>                 Key: HIVE-13423
>                 URL: https://issues.apache.org/jira/browse/HIVE-13423
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Processor
>    Affects Versions: 2.0.0
>            Reporter: Aihua Xu
>            Assignee: Aihua Xu
>         Attachments: HIVE-13423.1.patch
>
>
> When a column col1 is defined as decimal and the sum over that column overflows, 
> Hive tries to widen the result's decimal precision by 10. But once the precision 
> has already reached 38 (the maximum), the overflow can still occur. Currently, 
> when that happens, the following exception is thrown because Hive writes out 
> incorrect data.
> Follow these steps to reproduce:
> {noformat}
> CREATE TABLE DECIMAL_PRECISION(dec decimal(38,18));
> INSERT INTO DECIMAL_PRECISION VALUES(98765432109876543210.12345), 
> (98765432109876543210.12345);
> SELECT SUM(dec) FROM DECIMAL_PRECISION;
> {noformat}
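>
> Roughly, the arithmetic behind the overflow (assuming the usual Hive rule that 
> sum() over decimal(p,s) yields decimal(min(p+10, 38), s)): dec is decimal(38,18), 
> so the result type is still decimal(38,18), leaving 38-18 = 20 integer digits. 
> Each inserted value already uses all 20, so the sum needs 21 and cannot fit:
> {noformat}
>   98765432109876543210.12345
> + 98765432109876543210.12345
> = 197530864219753086420.24690   <- 21 integer digits > 20 available
> {noformat}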
> {noformat}
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:219) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> {noformat}
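>
> A minimal sketch (not the attached patch) of the kind of guard sum() needs: 
> HiveDecimal.enforcePrecisionScale() returns null when a value no longer fits the 
> declared precision/scale, so the aggregate can emit NULL instead of serializing 
> bytes that LazyBinary later fails to read. The class and variable names here are 
> illustrative only:
> {noformat}
> import org.apache.hadoop.hive.common.type.HiveDecimal;
>
> public class DecimalSumOverflowSketch {
>   public static void main(String[] args) {
>     HiveDecimal dec = HiveDecimal.create("98765432109876543210.12345");
>     // Two rows from the repro above; the sum needs 21 integer digits.
>     HiveDecimal sum = dec.add(dec);
>     // Null means the sum no longer fits decimal(38,18): treat as overflow.
>     HiveDecimal checked = HiveDecimal.enforcePrecisionScale(sum, 38, 18);
>     System.out.println(checked == null
>         ? "overflow -> emit NULL"
>         : checked.toString());
>   }
> }
> {noformat}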


