GitHub user gatorsmile opened a pull request:

    https://github.com/apache/spark/pull/11100

    [SPARK-13221] [SQL] Fixing GroupingSets when Aggregate Functions Contain GroupBy Columns

    Using GroupingSets generates wrong results when aggregate functions contain GroupBy columns.
    
    This PR fixes the issue. Since the code changes are very small, maybe we can also merge it into 1.6.
    
    Thanks!
    
    For example, the following query returns a wrong result:
    ```scala
    sql("select course, sum(earnings) as sum from courseSales group by course, earnings" +
        " grouping sets((), (course), (course, earnings))" +
        " order by course, sum").show()
    ```
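    For reproducibility, here is a minimal sketch of the kind of test data the query assumes; the schema (course, year, earnings) and the rows are an assumption, chosen so that the sums match the corrected output shown below:
    ```scala
    // Hypothetical setup (not part of the PR): register a courseSales temp table.
    // Rows are an assumption, picked so that Java sums to 50000.0, dotNET to 63000.0,
    // and the grand total to 113000.0, matching the "after the fix" output below.
    import sqlContext.implicits._
    
    Seq(
      ("dotNET", 2012, 10000.0),
      ("Java",   2012, 20000.0),
      ("dotNET", 2012, 5000.0),
      ("dotNET", 2013, 48000.0),
      ("Java",   2013, 30000.0)
    ).toDF("course", "year", "earnings").registerTempTable("courseSales")
    ```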
    Before the fix, the results look like this:
    ```
    [null,null]
    [Java,null]
    [Java,20000.0]
    [Java,30000.0]
    [dotNET,null]
    [dotNET,5000.0]
    [dotNET,10000.0]
    [dotNET,48000.0]
    ```
    After the fix, the results are correct:
    ```
    [null,113000.0]
    [Java,20000.0]
    [Java,30000.0]
    [Java,50000.0]
    [dotNET,5000.0]
    [dotNET,10000.0]
    [dotNET,48000.0]
    [dotNET,63000.0]
    ```
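    As a sanity check (not part of the PR), the grouping-sets query should behave like a union of one GROUP BY per grouping set, which is what the corrected output reflects. A sketch, assuming the same courseSales data as above:
    ```scala
    // Sketch only: expand GROUPING SETS ((), (course), (course, earnings)) by hand.
    // Each branch corresponds to one grouping set; the unioned result should
    // match the corrected output shown above.
    sql("""
      select cast(null as string) as course, sum(earnings) as sum from courseSales
      union all
      select course, sum(earnings) as sum from courseSales group by course
      union all
      select course, sum(earnings) as sum from courseSales group by course, earnings
    """).orderBy("course", "sum").show()
    ```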

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark groupingSets

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/11100.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #11100
    
----
commit 2f9eeb9d40e653119372a6d3b38106035f4fece9
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2016-02-06T01:48:20Z

    fixing grouping sets

----

