[ https://issues.apache.org/jira/browse/KYLIN-1345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15171704#comment-15171704 ]
hongbin ma commented on KYLIN-1345:
-----------------------------------

hi [~yonzhang2012], I suggest moving your return type conversion logic into FunctionDesc itself. Otherwise it is hard to make sure you have covered the other cases; one example is https://github.com/binmahone/kylin/blob/daa294b679cf857422acce9cf2c86b6c950a5b67/engine-spark/src/main/java/org/apache/kylin/engine/spark/SparkCubing.java#L376-L376

> measure type expansion when dealing with sum of decimal metric
> --------------------------------------------------------------
>
>                 Key: KYLIN-1345
>                 URL: https://issues.apache.org/jira/browse/KYLIN-1345
>             Project: Kylin
>          Issue Type: Improvement
>            Reporter: hongbin ma
>            Assignee: Edward Zhang
>              Labels: newbie
>
> Suppose a metric column "price" is of type decimal(6,2); the sum aggregate over
> it might exceed the maximum value representable by decimal(6,2). Currently, metric
> aggregators inherit the column's type in Hive. We should consider automatically
> expanding the decimal type, e.g. to decimal(18,4), for such cases.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
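As an illustration of the widening the issue asks for, here is a minimal standalone sketch of a helper that expands a `decimal(p,s)` column type before it is used as a SUM aggregator's return type. The class name `DecimalTypeWidening`, the method `widenForSum`, and the specific widening rule (grow precision by at least 10 digits, floor the scale at 4, cap precision at 38) are all hypothetical choices for this example, not Kylin's actual FunctionDesc logic:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: widen a decimal(p,s) type so that SUM over many
// rows is less likely to overflow. Not Kylin's real implementation.
public class DecimalTypeWidening {

    private static final Pattern DECIMAL = Pattern.compile("decimal\\((\\d+)\\s*,\\s*(\\d+)\\)");

    // Expand e.g. decimal(6,2) to a wider type such as decimal(19,4);
    // non-decimal types pass through unchanged.
    static String widenForSum(String columnType) {
        Matcher m = DECIMAL.matcher(columnType);
        if (!m.matches()) {
            return columnType;
        }
        int precision = Integer.parseInt(m.group(1));
        int scale = Integer.parseInt(m.group(2));
        // Grow precision by 10 digits (at least up to 19) and the scale
        // up to at least 4, capped at a typical engine maximum of 38.
        int newScale = Math.max(scale, 4);
        int newPrecision = Math.min(38, Math.max(precision + 10, 19));
        return "decimal(" + newPrecision + "," + newScale + ")";
    }

    public static void main(String[] args) {
        System.out.println(widenForSum("decimal(6,2)")); // decimal(19,4)
        System.out.println(widenForSum("bigint"));       // bigint
    }
}
```

Centralizing such a rule in one place (the issue suggests FunctionDesc) means every engine, including the Spark cubing path linked above, picks up the same widened return type instead of re-deriving it locally.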