GitHub user mgaido91 opened a pull request:

    https://github.com/apache/spark/pull/22450

    [SPARK-25454][SQL] Avoid precision loss in division with decimal with 
negative scale

    ## What changes were proposed in this pull request?
    
    Our rules for determining decimal precision and scale mirror Hive's and 
SQL Server's. The problem is that Spark allows a negative scale, whereas in 
those other systems this is not possible, so our rule for division doesn't 
take into account the case where the scale is negative.
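    
    To make the problem concrete, below is a minimal sketch of the division 
rule in plain Scala (no Spark dependency), following the formula documented 
in Spark's `DecimalPrecision` for `e1 / e2`. The object and method names and 
the sample operands are illustrative only, not taken from this PR:
    
        // Hive/SQL Server-inspired rule for Decimal(p1, s1) / Decimal(p2, s2):
        //   scale     = max(6, s1 + p2 + 1)
        //   precision = p1 - s1 + s2 + scale
        object DecimalDivisionRule {
          val MaxPrecision = 38 // DecimalType.MAX_PRECISION in Spark
        
          def divisionResultType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
            val scale = math.max(6, s1 + p2 + 1)
            val precision = p1 - s1 + s2 + scale
            (math.min(precision, MaxPrecision), scale)
          }
        
          def main(args: Array[String]): Unit = {
            // Divisor 1000000000 written as Decimal(10, 0): the quotient is
            // typed Decimal(22, 11), leaving plenty of fractional digits.
            println(divisionResultType(11, 0, 10, 0))  // (22,11)
            // The same value carried as Decimal(4, -6): the negative s2
            // shrinks both terms, so the quotient is forced into
            // Decimal(11, 6) and fractional digits are truncated.
            println(divisionResultType(11, 0, 4, -6))  // (11,6)
          }
        }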
    
    The PR makes our rule compatible with decimals having a negative scale too.
    
    ## How was this patch tested?
    
    Added unit tests.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/mgaido91/spark SPARK-25454

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22450.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22450
    
----
commit 7c4b454c863b4e760a3c7df9f0d17f94e86a5a47
Author: Marco Gaido <marcogaido91@...>
Date:   2018-09-18T13:47:10Z

    [SPARK-25454][SQL] Avoid precision loss in division with decimal with 
negative scale

----

