GitHub user dilipbiswal opened a pull request:

    https://github.com/apache/spark/pull/21857

    [SPARK-21274] Implement EXCEPT ALL clause.

    ## What changes were proposed in this pull request?
    Implements the EXCEPT ALL clause through a query rewrite that uses existing 
operators in Spark. This PR adds an internal UDTF (`replicate_rows`) to 
help preserve duplicate rows. Please refer to 
[the design document](https://drive.google.com/open?id=1nyW0T0b_ajUduQoPgZLAsyHK8s3_dko3ulQuxaLpUXE)
 for details.
    
    **Note:** The proposed UDTF is kept as an internal function used purely 
to aid this particular rewrite; this gives us the flexibility to switch to a 
more generalized UDTF in the future.
    
    Input Query
    ```SQL
    SELECT c1 FROM ut1 EXCEPT ALL SELECT c1 FROM ut2
    ```
    Rewritten Query
    ```SQL
    SELECT c1
        FROM (
         SELECT replicate_rows(sum_val, c1) AS (sum_val, c1)
           FROM (
             SELECT c1, sum_val
               FROM (
                 SELECT c1, sum(vcol) AS sum_val
                   FROM (
                     SELECT 1L as vcol, c1 FROM ut1
                     UNION ALL
                     SELECT -1L as vcol, c1 FROM ut2
                  ) AS union_all
                GROUP BY union_all.c1
              )
            WHERE sum_val > 0
           )
       )
    ```
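
    The net effect of the rewrite is ordinary multiset difference: each row of the left input survives `count_left - count_right` times, floored at zero. A minimal sketch of that semantics (not Spark's implementation; plain Python over lists standing in for the two tables):

```python
from collections import Counter

def except_all(left, right):
    """Multiset difference, mirroring the rewrite above:
    vcol = +1 for left-side rows, -1 for right-side rows,
    keep groups where sum(vcol) > 0, and replicate each
    surviving row sum(vcol) times."""
    counts = Counter(left)
    counts.subtract(Counter(right))          # sum(vcol) per distinct row
    result = []
    for row, n in counts.items():
        result.extend([row] * max(n, 0))     # the replicate_rows step
    return result

ut1 = [1, 1, 2, 3]
ut2 = [1, 2, 2]
print(sorted(except_all(ut1, ut2)))  # [1, 3]
```

Note that rows appearing more often on the right than on the left (like `2` above) are dropped entirely, which is why the rewrite filters on `sum_val > 0`.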
    
    ## How was this patch tested?
    Added test cases to SQLQueryTestSuite and DataFrameSuite.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dilipbiswal/spark dkb_except_all_final

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21857.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21857
    
----
commit 5cf8c4caa8bce874c5336498a6dc805f0bec1681
Author: Dilip Biswal <dbiswal@...>
Date:   2018-05-07T08:18:17Z

    [SPARK-21274] Implement EXCEPT ALL clause.

----


---
