cloud-fan commented on a change in pull request #26231: [SPARK-29572][SQL] add v1 read fallback API in DS v2
URL: https://github.com/apache/spark/pull/26231#discussion_r341575272
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2ScanRelationPushDown.scala
##########
@@ -51,6 +54,7 @@ object V2ScanRelationPushDown extends Rule[LogicalPlan] {
        """.stripMargin)
     val scanRelation = DataSourceV2ScanRelation(relation.table, scan, output)
+    scanRelation.setTagValue(PUSHED_FILTERS_TAG, pushedFilters)

 Review comment:
   It would be convenient if `Scan` could report its pushed filters itself, but I'm not sure how to design such an API. For now I just store the pushed filters as a tag on the `DataSourceV2ScanRelation`, so that I can use them later when creating the v1 physical scan node, which needs `pushedFilters` for its equality check.
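   For context, here is a minimal sketch of the tag mechanism this relies on. The helper object, method names, and the tag's string name are illustrative assumptions, not the exact PR code; only `TreeNodeTag`, `setTagValue`, and `getTagValue` are existing Spark APIs.

```scala
// Hedged sketch: attach pushed filters to a logical plan node via a TreeNodeTag,
// then read them back when planning the physical scan. Names here are assumed.
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.trees.TreeNodeTag
import org.apache.spark.sql.sources.Filter

object PushedFiltersTagSketch {
  // Tag key used to carry the filters that were pushed into the Scan during optimization.
  val PUSHED_FILTERS_TAG: TreeNodeTag[Seq[Filter]] = TreeNodeTag[Seq[Filter]]("pushedFilters")

  // Called from the push-down rule, right after the scan relation is built.
  def attach(scanRelation: LogicalPlan, pushedFilters: Seq[Filter]): Unit =
    scanRelation.setTagValue(PUSHED_FILTERS_TAG, pushedFilters)

  // Called later when planning the v1 fallback physical scan, which needs the
  // pushed filters so its equality check can include them.
  def retrieve(scanRelation: LogicalPlan): Seq[Filter] =
    scanRelation.getTagValue(PUSHED_FILTERS_TAG).getOrElse(Seq.empty)
}
```

   The upside of this approach is that the `Scan` interface stays untouched while the strategy that plans the v1 fallback can still recover the pushed filters from the logical relation.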