gengliangwang commented on a change in pull request #24598: [SPARK-27699][SQL] Partially push down disjunctive predicates in Parquet/ORC URL: https://github.com/apache/spark/pull/24598#discussion_r284119542
########## File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFilters.scala ##########

@@ -527,11 +527,22 @@ private[parquet] class ParquetFilters(
       }

     case sources.Or(lhs, rhs) =>
+      // The Or predicate is convertible when both of its children can be pushed down.
+      // That is to say, if one or both of its children can be partially pushed down,
+      // the Or predicate can be partially pushed down as well.
+      //
+      // Here is an example to explain the reason.
+      // Let's say we have
+      //   (a1 AND a2) OR (b1 AND b2),
+      // where a1 and b1 are convertible, while a2 and b2 are not.
+      // The predicate can be converted as
+      //   (a1 OR b1) AND (a1 OR b2) AND (a2 OR b1) AND (a2 OR b2)

Review comment:
@dongjoon-hyun nice catch. However, here the `Not` predicate won't have an `Or` or `And` predicate as a child, because of the `BooleanSimplification` optimization rule. I created a PR before for pushing down the `Not` operator (as extra insurance), but it seemed to make things too complex: https://github.com/apache/spark/pull/22687

> Also, it would be great to add a higher-level test case in SQLQuerySuite.scala to show the benefit of this additional predicate pushdown a1 OR b1. Could you add that, too?

How can we verify that the predicate is pushed down? Match the `OrcScan` and check its `pushedFilters`? Currently, only ORC V2 can be checked this way.
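The rewrite above is sound because, by the distributive law, `(a1 AND a2) OR (b1 AND b2)` implies `(a1 OR b1)`, so pushing down only the convertible conjuncts still returns a superset of the matching rows. Here is a minimal, self-contained sketch of that idea (a toy predicate model, not Spark's actual `ParquetFilters` API; the names `Pred`, `Leaf`, and `pushable` are invented for illustration):

```scala
// Toy predicate tree: a Leaf is either convertible to a data-source
// filter or not; And/Or combine subtrees.
sealed trait Pred
case class Leaf(name: String, convertible: Boolean) extends Pred
case class And(l: Pred, r: Pred) extends Pred
case class Or(l: Pred, r: Pred) extends Pred

// Returns the convertible (possibly weaker) part of a predicate, if any.
// - And: either side alone is a valid weaker filter, so keep whatever
//   converts and drop the rest.
// - Or: BOTH sides must yield something, otherwise nothing is pushable,
//   since dropping one branch of an Or would wrongly filter rows out.
def pushable(p: Pred): Option[Pred] = p match {
  case l @ Leaf(_, c) => if (c) Some(l) else None
  case And(l, r) =>
    (pushable(l), pushable(r)) match {
      case (Some(a), Some(b)) => Some(And(a, b))
      case (Some(a), None)    => Some(a)
      case (None, Some(b))    => Some(b)
      case (None, None)       => None
    }
  case Or(l, r) =>
    for { a <- pushable(l); b <- pushable(r) } yield Or(a, b)
}

// The example from the comment: (a1 AND a2) OR (b1 AND b2),
// where a2 and b2 are not convertible.
val pred = Or(
  And(Leaf("a1", convertible = true), Leaf("a2", convertible = false)),
  And(Leaf("b1", convertible = true), Leaf("b2", convertible = false)))

// Only the weaker filter (a1 OR b1) is pushed down.
println(pushable(pred)) // Some(Or(Leaf(a1,true),Leaf(b1,true)))
```

Note how the `And` case is what enables the *partial* pushdown: the non-convertible conjuncts `a2` and `b2` are simply dropped from the pushed filter (Spark still re-evaluates the full predicate on the returned rows), while the `Or` case refuses to push anything unless both branches contribute.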