[ https://issues.apache.org/jira/browse/KYLIN-1345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15168528#comment-15168528 ]

Edward Zhang commented on KYLIN-1345:
-------------------------------------

This is reproduced on my local machine. The assumption below is actually wrong:
        this.maxLength = 1 + 1 + (type.getPrecision() + 1) / 2;
For decimal(6,2), even a value with 7 digits will not trigger the exception 
guarded by this maxLength. This is because a 7-digit value such as 1234567 may 
need only 3 bytes from 
        value.unscaledValue().toByteArray();
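
As a minimal standalone sketch of that point (not Kylin's serializer code; 
decimal(6,2) and the 7-digit sample value are just the numbers from this 
discussion), the snippet below compares the maxLength formula with the actual 
byte count produced by unscaledValue().toByteArray():

    import java.math.BigDecimal;

    public class DecimalLengthCheck {
        public static void main(String[] args) {
            int precision = 6;                              // decimal(6,2)
            int maxLength = 1 + 1 + (precision + 1) / 2;    // the questioned formula: 5

            BigDecimal value = new BigDecimal("12345.67");  // 7 digits, beyond precision 6
            byte[] unscaled = value.unscaledValue().toByteArray();

            // 1234567 fits in 3 bytes, so 1 (length) + 1 (scale) + 3 = 5 <= maxLength:
            // the length check never fires even though the declared precision is exceeded.
            System.out.println("maxLength = " + maxLength);
            System.out.println("unscaled bytes = " + unscaled.length);
        }
    }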

To solve this problem, we should relax maxLength, but keep the scale the same 
as whatever is predefined.
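
One possible way to relax maxLength in that direction (a sketch of the idea 
only, not the actual patch; the relaxed precision floor of 19 digits is an 
assumption picked for illustration) is to size the buffer for an expanded 
precision while leaving the declared scale untouched:

    import java.math.BigInteger;

    public class RelaxedDecimalLength {

        // Bytes needed by the two's-complement form of the largest n-digit unscaled value.
        static int unscaledBytes(int digits) {
            BigInteger largest = BigInteger.TEN.pow(digits).subtract(BigInteger.ONE);
            return largest.toByteArray().length;
        }

        // 1 byte length prefix + 1 byte scale + room for `digits` decimal digits.
        static int maxLengthFor(int digits) {
            return 1 + 1 + unscaledBytes(digits);
        }

        public static void main(String[] args) {
            int declaredPrecision = 6;                                // decimal(6,2) from the issue
            int relaxedPrecision = Math.max(declaredPrecision, 19);   // assumed floor, for illustration

            System.out.println("declared maxLength = " + maxLengthFor(declaredPrecision)); // 5
            System.out.println("relaxed maxLength  = " + maxLengthFor(relaxedPrecision));  // 11
        }
    }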

> measure type expansion when dealing with sum of decimal metric
> --------------------------------------------------------------
>
>                 Key: KYLIN-1345
>                 URL: https://issues.apache.org/jira/browse/KYLIN-1345
>             Project: Kylin
>          Issue Type: Improvement
>            Reporter: hongbin ma
>            Assignee: Edward Zhang
>              Labels: newbie
>
> suppose a metric column price is of type decimal(6,2), the sum aggregator of 
> it might exceed the maximum of decimal(6,2). Currently for metric 
> aggregators we inherit the column's type from Hive. We should consider 
> auto-expanding the decimal type to decimal(18,4) (just an example) for such cases
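
A small illustration of why the aggregate needs a wider type than the source 
column (just a demonstration of the arithmetic, with a made-up row count):

    import java.math.BigDecimal;

    public class SumOverflowDemo {
        public static void main(String[] args) {
            BigDecimal price = new BigDecimal("9999.99");   // largest decimal(6,2) value
            BigDecimal sum = BigDecimal.ZERO;
            for (int i = 0; i < 1000; i++) {                // 1000 rows, made up for illustration
                sum = sum.add(price);
            }
            // 9999990.00 has 9 significant digits, so it no longer fits decimal(6,2),
            // but fits comfortably in a widened type such as decimal(18,4).
            System.out.println(sum + " (precision " + sum.precision() + ")");
        }
    }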


