GitHub user gengliangwang opened a pull request:

    https://github.com/apache/spark/pull/21667

    [SPARK-24691][SQL] Add new API `supportDataType` in FileFormat

    ## What changes were proposed in this pull request?
    
    In https://github.com/apache/spark/pull/21389, the data source schema is validated before the actual read/write. However:
    
    1. Putting all the validations together in `DataSourceUtils` is tricky and hard to maintain. On a second review, I found that the `OrcFileFormat` in the hive package is not matched there, so its validation is wrong.
    2. `DataSourceUtils.verifyWriteSchema` and `DataSourceUtils.verifyReadSchema` are not supposed to be called in every file format; we can move them to some upper entry point.
    
    So I propose adding a new API `supportDataType` in `FileFormat`. Each file format can override the method to specify its supported/unsupported data types.
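    The proposed hook could look roughly like the sketch below. Note this is illustrative only: `DataType`, the example type objects, and `CsvLikeFormat` are simplified stand-ins defined here for the sketch, not Spark's actual classes; only the `supportDataType` method name mirrors the proposal.

```scala
// Simplified stand-in types for illustration (not Spark's real DataType hierarchy).
sealed trait DataType
case object IntegerType extends DataType
case object StringType extends DataType
case class ArrayType(elementType: DataType) extends DataType

trait FileFormat {
  // Proposed hook: each format declares which data types it can handle.
  // A read/write flag would let a format accept a type on one path only.
  def supportDataType(dataType: DataType, isReadPath: Boolean): Boolean = true
}

// Hypothetical override: a CSV-like format that only handles atomic types.
object CsvLikeFormat extends FileFormat {
  override def supportDataType(dataType: DataType, isReadPath: Boolean): Boolean =
    dataType match {
      case IntegerType | StringType => true  // atomic types supported
      case _                        => false // complex types rejected
    }
}
```

    With this shape, the caller side (one upper entry point, rather than each format) can walk the schema and reject unsupported fields before any read/write starts.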
    
    ## How was this patch tested?
    
    Unit test


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gengliangwang/spark refactorSchemaValidate

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21667.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21667
    
----
commit 7fdf6033b6778d06850e6ae5a0fd6e3fde76a5c2
Author: Gengliang Wang <gengliang.wang@...>
Date:   2018-06-28T16:32:44Z

    refactor schema validation

----


---
