Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20023#discussion_r161767060
  
    --- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -1048,6 +1048,16 @@ object SQLConf {
         .booleanConf
         .createWithDefault(true)
     
    +  val DECIMAL_OPERATIONS_ALLOW_PREC_LOSS =
    +    buildConf("spark.sql.decimalOperations.allowPrecisionLoss")
    +      .internal()
    +      .doc("When true, establishing the result type of an arithmetic operation happens " +
    +        "according to Hive behavior and the SQL ANSI 2011 specification, i.e. rounding " +
    +        "the decimal part of the result if an exact representation is not possible. " +
    +        "Otherwise, NULL is returned in those cases, as previously (default).")
    +      .booleanConf
    +      .createWithDefault(false)
    --- End diff --
    
    We should make it true by default, as it's more reasonable behavior and 
follows the Hive/SQL standard.
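
    The precision-loss rule under discussion can be sketched as follows. This 
is an illustrative model only, not Spark's actual `DecimalPrecision` code: the 
helper name `multiplyResultType` and the minimum retained scale of 6 are 
assumptions made for the sketch.

    ```scala
    // Sketch: deriving the result type of Decimal(p1,s1) * Decimal(p2,s2)
    // when precision loss is allowed. The exact result would need precision
    // p1 + p2 + 1 and scale s1 + s2; when that exceeds the maximum precision
    // of 38, the scale is reduced (rounding the decimal part) rather than
    // returning NULL.
    object DecimalTypeSketch {
      val MaxPrecision = 38
      val MinAdjustedScale = 6  // assumed minimum scale kept after adjustment

      def multiplyResultType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
        val p = p1 + p2 + 1
        val s = s1 + s2
        if (p <= MaxPrecision) {
          // Exact representation fits: no precision loss.
          (p, s)
        } else {
          // Keep all integral digits; give the remaining precision to the
          // scale, but never go below the assumed minimum scale.
          val intDigits = p - s
          val adjustedScale = math.max(MaxPrecision - intDigits, MinAdjustedScale)
          (MaxPrecision, adjustedScale)
        }
      }
    }
    ```

    For example, multiplying two Decimal(38, 18) values would nominally need 
Decimal(77, 36); under this sketch the result is capped at Decimal(38, 6), 
whereas with the flag off the exact-representation failure yields NULL.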


---
