Gengliang Wang created SPARK-40389:
--------------------------------------

             Summary: Decimals can't upcast as integral types if the cast can overflow
                 Key: SPARK-40389
                 URL: https://issues.apache.org/jira/browse/SPARK-40389
             Project: Spark
          Issue Type: Task
          Components: SQL
    Affects Versions: 3.4.0
            Reporter: Gengliang Wang
            Assignee: Gengliang Wang


In Spark SQL, the method "canUpCast" returns true iff we can safely up-cast the 
`from` type to the `to` type without any truncation, precision loss, or possible 
runtime failures.

However, DecimalType(10, 0) is currently considered "canUpCast" to the Integer type. 
This is wrong, since casting 9000000000BD to Integer will overflow.
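
A minimal sketch of the problematic check, assuming the Catalyst Cast.canUpCast API 
(package paths and method names may differ slightly across versions):

{code:scala}
import org.apache.spark.sql.catalyst.expressions.Cast
import org.apache.spark.sql.types.{DecimalType, IntegerType}

// On affected versions this returns true, even though a DECIMAL(10, 0) value
// such as 9000000000BD does not fit into a 32-bit integer.
val allowed = Cast.canUpCast(DecimalType(10, 0), IntegerType)
println(allowed) // true before the fix
{code}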

As a result:
 * The optimizer rule SimplifyCasts relies on the method "canUpCast" and will 
mistakenly simplify "cast(cast(9000000000BD as int) as long)" to 
"cast(9000000000BD as long)" (see the sketch after this list)
 * The STRICT store assignment policy relies on this method too. With the 
policy enabled, inserting 9000000000BD into an integer column will pass the 
compile-time check and unexpectedly cause a runtime error.
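
A hedged reproduction sketch of the SimplifyCasts symptom, assuming ANSI mode is 
enabled so that the inner overflowing cast should raise a runtime error:

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("SPARK-40389-repro")
  .getOrCreate()

// Under ANSI mode, CAST(9000000000BD AS INT) should fail with an overflow error.
// Because canUpCast wrongly treats DECIMAL(10, 0) -> INT as safe, SimplifyCasts
// collapses the nested cast to CAST(9000000000BD AS LONG), which silently
// succeeds and returns 9000000000 instead of raising the expected error.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT CAST(CAST(9000000000BD AS INT) AS LONG)").show()
{code}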


