GitHub user liancheng opened a pull request:

    https://github.com/apache/spark/pull/12734

    Add PARTITION BY and BUCKET BY clauses for "CREATE TABLE ... USING ..." syntax

    ## What changes were proposed in this pull request?
    
    Currently, persisted partitioned and/or bucketed data source tables can only be created through the Dataset API, not through SQL DDL. This PR implements the following syntax to add partitioning and bucketing support to the SQL DDL:
    
    ```sql
    CREATE TABLE <table-name>
    USING <provider> [OPTIONS (<key1> <value1>, <key2> <value2>, ...)]
    [PARTITIONED BY (col1, col2, ...)]
    [CLUSTERED BY (col1, col2, ...) [SORTED BY (col1, col2, ...)] INTO <n> BUCKETS]
    ```
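    
    For illustration, here is one concrete statement this grammar would accept (a minimal sketch; the table name, data path, and column names are hypothetical):
    
    ```sql
    -- The table schema is inferred from the Parquet files at the given
    -- path. Rows are partitioned by (year, month) and divided into 8
    -- buckets on customer_id, sorted by order_id within each bucket.
    CREATE TABLE sales
    USING parquet
    OPTIONS (path '/data/sales')
    PARTITIONED BY (year, month)
    CLUSTERED BY (customer_id) SORTED BY (order_id) INTO 8 BUCKETS
    ```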
    
    ## How was this patch tested?
    
    Test cases are added in `MetastoreDataSourcesSuite` to check the newly added syntax.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/liancheng/spark spark-14954

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/12734.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #12734
    
----
commit f51300c6ed22a54ff1dc49262cd046774166d957
Author: Cheng Lian <l...@databricks.com>
Date:   2016-04-27T13:06:53Z

    Add PARTITION BY and BUCKET BY clauses for "CREATE TABLE ... USING ..." syntax

commit af973d64cf3e1079e6c8a185d826e2e43cb06532
Author: Cheng Lian <l...@databricks.com>
Date:   2016-04-27T13:33:13Z

    Moves test case to MetastoreDataSourcesSuite
    
    Also checks for metastore table properties

----

