[ 
https://issues.apache.org/jira/browse/KYLIN-6045?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17930140#comment-17930140
 ] 

ASF subversion and git services commented on KYLIN-6045:
--------------------------------------------------------

Commit 21630ba2bd85d2e1441ad8554939c6a12e5fddcf in kylin's branch 
refs/heads/kylin5 from Yinghao Lin
[ https://gitbox.apache.org/repos/asf?p=kylin.git;h=21630ba2bd ]

KYLIN-6045 Fix sum decimal precision

* Replacement for calcite AggregateMergeRule


> SUM Query Decimal Precision Anomaly
> -----------------------------------
>
>                 Key: KYLIN-6045
>                 URL: https://issues.apache.org/jira/browse/KYLIN-6045
>             Project: Kylin
>          Issue Type: Bug
>    Affects Versions: 5.0.0
>            Reporter: Guoliang Sun
>            Priority: Major
>
> When generating the Spark plan for a query, `AggregatePlan.buildAgg` adds a 
> `cast` around the `sum` aggregation. The cast's target type is derived from 
> the input column type rather than the measure type, so the cast precision is 
> too small and the query returns `null`.
> h3. Example
> - The column precision in the Hive table is `decimal(19,6)`.  
> - The model measure precision is `decimal(29,6)`.  
> - When querying, the result will be `null`.  
> In the Spark event log for the query, the `cast` precision is 
> `decimal(19,6)`. Directly retrieving data from the Parquet file yields the 
> following:  
> - When the `cast` precision is `DECIMAL(19,6)`, the result is `null`.  
> - When the `cast` precision is `DECIMAL(29,6)`, the result is correct.
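The behavior above follows from Spark's non-ANSI cast semantics: casting a decimal to a DECIMAL(p,s) that cannot hold it yields SQL NULL instead of an error. A minimal sketch of that rule (the function `spark_cast_decimal` is a hypothetical stand-in, not Spark code):

```python
from decimal import Decimal

def spark_cast_decimal(value: Decimal, precision: int, scale: int):
    """Mimic Spark's non-ANSI cast to DECIMAL(precision, scale):
    return None (SQL NULL) when the value does not fit."""
    # Round to the target scale first, as Spark does.
    quantized = value.quantize(Decimal(1).scaleb(-scale))
    digits = quantized.as_tuple()
    # Digits left of the decimal point must fit in (precision - scale).
    if len(digits.digits) + digits.exponent > precision - scale:
        return None
    return quantized

# A SUM over a decimal(19,6) column can need up to decimal(29,6):
# 17 integer digits exceed the 13 allowed by decimal(19,6).
total = Decimal("12345678901234567.890123")

spark_cast_decimal(total, 19, 6)  # None: the reported NULL result
spark_cast_decimal(total, 29, 6)  # fits: the correct sum
```

This is why keeping the measure precision `decimal(29,6)` in the generated cast, as the fix does, restores correct results.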



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
