gengliangwang commented on a change in pull request #24806: 
[WIP][SPARK-27856][SQL] Only allow type upcasting when inserting table
URL: https://github.com/apache/spark/pull/24806#discussion_r292770658
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/DecimalType.scala
 ##########
 @@ -89,6 +90,7 @@ case class DecimalType(precision: Int, scale: Int) extends FractionalType {
       (precision - scale) <= (dt.precision - dt.scale) && scale <= dt.scale
     case dt: IntegralType =>
       isTighterThan(DecimalType.forType(dt))
+    // For DoubleType/FloatType, the value can be NaN, PositiveInfinity or NegativeInfinity.
 
 Review comment:
   > For isTighterThan, I think it's safe to cast decimal to float/double if the precision doesn't exceed?
   
   Yes, it is. I was about to push a commit to fix the tests.
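   
   A minimal standalone Scala sketch of both points, assuming the usual IEEE 754 digit bounds (roughly 7 significant decimal digits for a Float, 15-16 for a Double); the object and helper names below are illustrative and not the Spark code path:

```scala
// Standalone sketch, not Spark's DecimalType/Cast implementation.
object DecimalUpcastSketch {
  // True when converting to Double (via its string form) and back
  // recovers the original decimal value.
  def survivesDouble(d: BigDecimal): Boolean = BigDecimal(d.toDouble.toString) == d

  def main(args: Array[String]): Unit = {
    // Precision argument: a Double keeps roughly 15-16 significant decimal
    // digits, so a narrow decimal survives the cast while a wide one does not.
    println(survivesDouble(BigDecimal("1234567.89")))            // true  (precision 9)
    println(survivesDouble(BigDecimal("12345678901234567890")))  // false (precision 20)

    // The other direction stays unsafe: Float/Double can hold NaN and the
    // infinities, which have no decimal representation at all.
    println(scala.util.Try(BigDecimal(Double.NaN)))              // Failure(NumberFormatException ...)
  }
}
```

   Whatever exact precision bound the PR settles on, the sketch is only meant to show why the check matters in one direction but not the other.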
