GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/10502

    [SPARK-12355][SQL] Implement unhandledFilter interface for Parquet

    https://issues.apache.org/jira/browse/SPARK-12355
    This is similar to https://github.com/apache/spark/pull/10427.
    
    As discussed in https://github.com/apache/spark/pull/10221, this PR implements `unhandledFilters` so that filters already evaluated by the Parquet data source are not applied again on the Spark side.
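    
    To illustrate the idea, here is a minimal sketch (not the PR's actual code): a relation can override `unhandledFilters` to hand back only the filters it cannot evaluate itself, so Spark re-applies just those. The `canHandle` check below is purely hypothetical and pretends the source handles only equality and `IS NOT NULL` predicates.
    
        import org.apache.spark.sql.sources._
        
        // Sketch only: report the filters this source cannot evaluate, so
        // Spark does not duplicate filtering that was already pushed down.
        object UnhandledFiltersSketch {
          // Hypothetical capability check: assume the source evaluates
          // equality, IS NOT NULL, and conjunctions of those.
          private def canHandle(filter: Filter): Boolean = filter match {
            case _: EqualTo | _: IsNotNull => true
            case And(left, right)          => canHandle(left) && canHandle(right)
            case _                         => false
          }
        
          def unhandledFilters(filters: Array[Filter]): Array[Filter] =
            filters.filterNot(canHandle)
        }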
    
    In the case of Parquet, the columns referenced in pushed-down filters must be included in `org.apache.spark.sql.parquet.row.requested_schema`, whereas general data sources such as JDBC do not need those columns to be requested.
    
    However, the current `DataSourceStrategy` interface prunes columns that are referenced only in pushed-down filters and not in the projection. This PR resolves the problem by explicitly generating the columns referenced in the pushed-down filters and including them in the requested schema.
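    
    A rough sketch of that approach (not the PR's code; the object and method names below are made up): walk each pushed-down filter, collect the column names it references, and union them with the projected columns before building the schema requested from the Parquet reader.
    
        import org.apache.spark.sql.sources._
        import org.apache.spark.sql.types.StructType
        
        // Sketch only: collect the columns referenced by pushed-down filters and
        // make sure they end up in the requested schema, even when the query
        // itself does not project them.
        object FilterColumnsSketch {
          def referencedColumns(filter: Filter): Set[String] = filter match {
            case EqualTo(attr, _)            => Set(attr)
            case EqualNullSafe(attr, _)      => Set(attr)
            case GreaterThan(attr, _)        => Set(attr)
            case GreaterThanOrEqual(attr, _) => Set(attr)
            case LessThan(attr, _)           => Set(attr)
            case LessThanOrEqual(attr, _)    => Set(attr)
            case In(attr, _)                 => Set(attr)
            case IsNull(attr)                => Set(attr)
            case IsNotNull(attr)             => Set(attr)
            case StringStartsWith(attr, _)   => Set(attr)
            case StringEndsWith(attr, _)     => Set(attr)
            case StringContains(attr, _)     => Set(attr)
            case And(left, right)            => referencedColumns(left) ++ referencedColumns(right)
            case Or(left, right)             => referencedColumns(left) ++ referencedColumns(right)
            case Not(child)                  => referencedColumns(child)
            case _                           => Set.empty
          }
        
          // Union the projected columns with the filter-referenced ones, then
          // prune the full schema down to exactly those fields.
          def requestedSchema(full: StructType, projected: Seq[String], filters: Seq[Filter]): StructType = {
            val needed = projected.toSet ++ filters.flatMap(referencedColumns)
            StructType(full.fields.filter(f => needed.contains(f.name)))
          }
        }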
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark SPARK-12355

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10502.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #10502
    
----
commit a55ad54a820f095e5116df05979f169c0fe8e0cf
Author: hyukjinkwon <gurwls...@gmail.com>
Date:   2015-12-29T04:34:53Z

    Implement unhandled filters for Parquet

commit cf331a453c3f99ee40ee5ca6f5029dadee3d07f6
Author: hyukjinkwon <gurwls...@gmail.com>
Date:   2015-12-29T04:36:42Z

    Correct existing tests

----


