aokolnychyi commented on a change in pull request #26751: [SPARK-30107][SQL] Expose nested schema pruning to all V2 sources
URL: https://github.com/apache/spark/pull/26751#discussion_r357102082
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
 ##########
 @@ -76,28 +78,46 @@ object PushDownUtils extends PredicateHelper {
   * @return the created `ScanConfig` (since column pruning is the last step of operator pushdown),
    *         and new output attributes after column pruning.
    */
-  // TODO: nested column pruning.
   def pruneColumns(
       scanBuilder: ScanBuilder,
       relation: DataSourceV2Relation,
-      exprs: Seq[Expression]): (Scan, Seq[AttributeReference]) = {
+      projects: Seq[NamedExpression],
+      filters: Seq[Expression]): (Scan, Seq[AttributeReference]) = {
     scanBuilder match {
      case r: SupportsPushDownRequiredColumns if SQLConf.get.nestedSchemaPruningEnabled =>
 
 Review comment:
   I don't think there is an API to find out whether a particular `ScanBuilder` supports nested schema pruning. Instead, we have `SupportsPushDownRequiredColumns`, and data sources should use the passed schema as a reference and prune whatever they can. The flag added in this PR is specific to `FileScanBuilder`, and I believe treating `FileScanBuilder` differently with special branches or if conditions would complicate the overall logic. Also, we might actually get rid of that flag soon, as mentioned in DB's [comment](https://github.com/apache/spark/pull/26751#discussion_r356771927).
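   The contract described above — the engine passes the schema it needs and the source prunes whatever it can — can be sketched with a toy mirror of the idea. Note that `StructType`, `StructField`, `SupportsPushDownRequiredColumnsLike`, and `ExampleScanBuilder` below are hypothetical simplifications for illustration, not Spark's actual classes:

```scala
// Hypothetical, simplified stand-ins for Spark's schema types -- not the real API.
sealed trait DataType
case object IntType extends DataType
case object StringType extends DataType
case class StructField(name: String, dataType: DataType)
case class StructType(fields: Seq[StructField]) extends DataType

// Toy mirror of the SupportsPushDownRequiredColumns contract: the engine
// passes the schema it actually needs; the source prunes what it can and
// is free to ignore the rest.
trait SupportsPushDownRequiredColumnsLike {
  def pruneColumns(requiredSchema: StructType): Unit
}

class ExampleScanBuilder(fullSchema: StructType) extends SupportsPushDownRequiredColumnsLike {
  private var prunedSchema: StructType = fullSchema

  override def pruneColumns(requiredSchema: StructType): Unit = {
    prunedSchema = prune(fullSchema, requiredSchema)
  }

  def readSchema(): StructType = prunedSchema

  // Keep only the (possibly nested) fields present in the required schema.
  private def prune(full: StructType, required: StructType): StructType = {
    val requiredByName = required.fields.map(f => f.name -> f).toMap
    StructType(full.fields.flatMap { f =>
      requiredByName.get(f.name).map { r =>
        (f.dataType, r.dataType) match {
          case (fs: StructType, rs: StructType) => f.copy(dataType = prune(fs, rs))
          case _                                => f
        }
      }
    })
  }
}
```

   Under this shape there is no capability flag to check: any builder implementing the trait receives the required schema (nested fields included) and decides itself how much pruning it can honor.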

