GitHub user dilipbiswal opened a pull request:

    https://github.com/apache/spark/pull/22448

    [SPARK-25417][SQL] Improve findTightestCommonType to coerce Integral and decimal types

    ## What changes were proposed in this pull request?
    Currently `findTightestCommonType` is not able to properly coerce between
    integral and decimal types. For example, when looking for a common type
    between (int, decimal), it finds one only when the decimal has at least 10
    digits to the left of the decimal point. This PR enhances the logic to
    correctly find a wider decimal type between the integral and decimal types.
    
    Here are some example results of `findTightestCommonType` after this change:
    ```
    int, decimal(3, 2) => decimal(12, 2)
    int, decimal(4, 3) => decimal(13, 3)
    int, decimal(11, 3) => decimal(14, 3)
    int, decimal(38, 18) => decimal(38, 18)
    int, decimal(38, 29) => None 
    ```
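    
    To make the widening rule concrete, below is a minimal, hypothetical Scala
    sketch of one rule that reproduces the table above. It is not the PR's actual
    implementation: the object and method names are made up for illustration, the
    integral-to-digits mapping follows the values Spark uses when converting
    integral types to decimals (3/5/10/20 for byte/short/int/long), and 38 is
    Spark's maximum decimal precision.
    
    ```
    import org.apache.spark.sql.types._
    
    // Hypothetical illustration only -- not the PR's actual code.
    object WidenIntegralAndDecimalExample {
    
      // Whole-number digits needed to represent each integral type exactly.
      private def integralDigits(t: DataType): Option[Int] = t match {
        case ByteType    => Some(3)   // Spark maps byte to decimal(3, 0)
        case ShortType   => Some(5)   // short -> decimal(5, 0)
        case IntegerType => Some(10)  // int -> decimal(10, 0)
        case LongType    => Some(20)  // long -> decimal(20, 0)
        case _           => None
      }
    
      /** Tightest decimal that can hold both `integral` and `dec` without loss, if any. */
      def tightestCommonDecimal(integral: DataType, dec: DecimalType): Option[DecimalType] = {
        integralDigits(integral).flatMap { digits =>
          if (dec.precision - dec.scale >= digits) {
            // The decimal already has enough whole-number digits for the integral type.
            Some(dec)
          } else {
            // Keep the decimal's scale and grow the precision; give up past precision 38.
            val precision = math.max(digits, dec.precision) + dec.scale
            if (precision <= 38) Some(DecimalType(precision, dec.scale)) else None
          }
        }
      }
    }
    ```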
    
    ## How was this patch tested?
    Added tests to TypeCoercionSuite.
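    
    As an illustration of the kind of assertions such tests can make (these are
    not the actual TypeCoercionSuite additions), the sketch above can be checked
    against the examples from the description:
    
    ```
    import org.apache.spark.sql.types._
    import WidenIntegralAndDecimalExample.tightestCommonDecimal
    
    // Mirrors the example results listed in the PR description.
    assert(tightestCommonDecimal(IntegerType, DecimalType(3, 2))   == Some(DecimalType(12, 2)))
    assert(tightestCommonDecimal(IntegerType, DecimalType(4, 3))   == Some(DecimalType(13, 3)))
    assert(tightestCommonDecimal(IntegerType, DecimalType(11, 3))  == Some(DecimalType(14, 3)))
    assert(tightestCommonDecimal(IntegerType, DecimalType(38, 18)) == Some(DecimalType(38, 18)))
    assert(tightestCommonDecimal(IntegerType, DecimalType(38, 29)) == None)
    ```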

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dilipbiswal/spark find_tightest

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22448.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22448
    
----
commit ea0707bea4744a5217baeb82219bed414458afeb
Author: Dilip Biswal <dbiswal@...>
Date:   2018-09-18T05:28:54Z

    Improve findTightestCommonType to coerce Integral and decimal types

commit 8946034e1b25fcf2f4595d27943c9df87c12e096
Author: Dilip Biswal <dbiswal@...>
Date:   2018-09-18T05:52:14Z

    remove space

----

