maropu commented on issue #25137: [SPARK-28348][SQL] Decimal precision promotion for binary arithmetic with casted decimal type
URL: https://github.com/apache/spark/pull/25137#issuecomment-511744139

Which one follows the SQL standard? IIUC the current Spark behaviour depends on the Hive one. On the other hand, PostgreSQL officially says [they follow the standard for "Implicit casting among the numeric data types"](https://www.postgresql.org/docs/11/features-sql-standard.html), and the result is:

```
postgres=# select cast(c1 * cast(-34338492.215397047 as decimal(38, 18)) as decimal(38, 18)) as c1 from spark_28348;
                 c1
-------------------------------------
 1179132047626883.596862135856320209
(1 row)

postgres=# explain verbose select cast(c1 * cast(-34338492.215397047 as decimal(38, 18)) as decimal(38, 18)) as c1 from spark_28348;
                                     QUERY PLAN
-------------------------------------------------------------------------------------
 Seq Scan on public.spark_28348  (cost=0.00..31.00 rows=1400 width=30)
   Output: ((c1 * '-34338492.215397047000000000'::numeric(38,18)))::numeric(38,18)
(2 rows)
```

MySQL has the same result:

```
mysql> select cast(c1 * cast(-34338492.215397047 as decimal(38, 18)) as decimal(38, 18)) as c1 from spark_28348;
+-------------------------------------+
| c1                                  |
+-------------------------------------+
| 1179132047626883.596862135856320209 |
+-------------------------------------+
```
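For comparison, here is a minimal sketch (not part of the original comment) of how the same query could be reproduced against Spark from a spark-shell session. The table name `spark_28348` and the multiplier come from the comment above; the column type of `c1` and the row inserted into the table are assumptions, and Spark's own result is intentionally not shown since it is exactly what this PR changes.

```scala
// Sketch only: assumed setup for reproducing the PostgreSQL/MySQL comparison in Spark.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SPARK-28348-repro")
  .master("local[*]")
  .getOrCreate()

// Assumption: a single-column decimal table mirroring the one queried above.
spark.sql("CREATE TABLE spark_28348 (c1 DECIMAL(38, 18)) USING parquet")
spark.sql("INSERT INTO spark_28348 VALUES (CAST(-34338492.215397047 AS DECIMAL(38, 18)))")

// The query under discussion; whether the result matches the DECIMAL(38, 18) value
// returned by PostgreSQL/MySQL depends on Spark's decimal precision promotion rules.
spark.sql(
  """SELECT CAST(c1 * CAST(-34338492.215397047 AS DECIMAL(38, 18)) AS DECIMAL(38, 18)) AS c1
    |FROM spark_28348""".stripMargin).show(truncate = false)
```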