[ https://issues.apache.org/jira/browse/SPARK-33678?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-33678.
----------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 30745
[https://github.com/apache/spark/pull/30745]

> Numerical product aggregation
> -----------------------------
>
>                 Key: SPARK-33678
>                 URL: https://issues.apache.org/jira/browse/SPARK-33678
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.7, 3.0.0, 3.1.0
>            Reporter: Richard Penney
>            Assignee: Richard Penney
>            Priority: Minor
>             Fix For: 3.2.0
>
> There is currently no facility in {{spark.sql.functions}} for computing the
> product of all numbers in a grouping expression. Such a facility would be
> useful when computing statistical quantities such as the combined
> probability of a set of independent events, or in financial applications
> when calculating a cumulative interest rate.
> Although it is certainly possible to emulate this with an expression of the
> form {{exp(sum(log(column)))}}, that approach has a number of significant
> drawbacks:
> * It involves computationally costly functions (exp, log)
> * It is more verbose than something like {{product(column)}}
> * It is more prone to numerical inaccuracy when handling quantities close
> to one than directly multiplying the numbers
> * It will not handle zeros or negative numbers cleanly
> I am currently developing an addition to {{sql.functions}}, which involves [a
> new Catalyst aggregation
> expression|https://github.com/rwpenney/spark/blob/feature/agg-product/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Product.scala].
> This needs some additional testing, and I hope to issue a pull-request soon.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
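The drawbacks of the {{exp(sum(log(column)))}} workaround listed above can be demonstrated outside Spark. The following is a minimal Python sketch using only the standard {{math}} module (not Spark's own API): it compares the log/exp round-trip against direct multiplication for values close to one, and shows that the workaround fails outright on zeros and negative inputs.

```python
import math

# 1000 factors, each very close to one -- the regime where the
# exp(sum(log(x))) emulation is most prone to rounding error.
values = [1.0000001] * 1000

# Direct multiplication, i.e. what a native product() aggregate would do.
direct = 1.0
for v in values:
    direct *= v

# The exp(sum(log(x))) emulation described in the issue.
via_logs = math.exp(sum(math.log(v) for v in values))

# Here the two agree closely, but only because every input is positive;
# each log/exp call still costs far more than a single multiplication.
assert abs(direct - via_logs) < 1e-9

# Zeros and negatives break the emulation entirely: a true product of
# [2.0, 0.0, -3.0] is simply 0.0, but math.log rejects both 0.0 and -3.0.
for bad in (0.0, -3.0):
    try:
        math.log(bad)
    except ValueError:
        pass  # log is undefined here, so the emulated product cannot proceed
```

A native aggregate sidesteps all three problems by multiplying running partial products directly, carrying sign and zero through the ordinary semantics of floating-point multiplication.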