[ https://issues.apache.org/jira/browse/SPARK-40351?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17630199#comment-17630199 ]
Tymofii commented on SPARK-40351:
---------------------------------

[~dwsmith1983] Makes sense. Thank you for pointing out this doc.

> Spark Sum increases the precision of DecimalType arguments by 10
> ----------------------------------------------------------------
>
>                 Key: SPARK-40351
>                 URL: https://issues.apache.org/jira/browse/SPARK-40351
>             Project: Spark
>          Issue Type: Question
>          Components: Optimizer
>    Affects Versions: 3.2.0
>            Reporter: Tymofii
>            Priority: Minor
>
> Currently Spark automatically increases the precision of a DecimalType field by 10 (a hard-coded value) after a SUM aggregate operation; see https://github.com/apache/spark/blob/branch-3.2/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L1877
> There are a couple of questions:
> # Why was 10 chosen as the default?
> # Does it make sense to allow the user to override this value via configuration?
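For reference, a minimal sketch of the behavior in question: summing a decimal(10,2) column yields a decimal(20,2) result, i.e. the precision grows by the hard-coded 10. This assumes a local SparkSession; the "amount" column name and its values are purely illustrative.

{code:scala}
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.functions.sum
import org.apache.spark.sql.types.{DecimalType, StructField, StructType}

// Local session purely for illustration.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("decimal-sum-precision")
  .getOrCreate()

// A single decimal(10,2) column; the column name and data are made up.
val schema = StructType(Seq(StructField("amount", DecimalType(10, 2))))
val rows = Seq(Row(BigDecimal("12345678.90")), Row(BigDecimal("0.10")))
val df = spark.createDataFrame(spark.sparkContext.parallelize(rows), schema)

df.printSchema()                     // amount: decimal(10,2)
df.agg(sum("amount")).printSchema()  // sum(amount): decimal(20,2)
{code}

If I read the code right, the +10 itself comes from Sum's resultType when the aggregate is resolved; the DecimalAggregates rule linked above then rewrites such decimal sums for efficiency.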