[ https://issues.apache.org/jira/browse/SPARK-36889?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-36889.
----------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Fixed in https://github.com/apache/spark/pull/34140

> Respect `spark.sql.parquet.filterPushdown` by explain() for DSv2
> ----------------------------------------------------------------
>
>                 Key: SPARK-36889
>                 URL: https://issues.apache.org/jira/browse/SPARK-36889
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>             Fix For: 3.3.0
>
>
> When filter pushdown for Parquet is disabled via the SQL config
> spark.sql.parquet.filterPushdown, explain() still outputs pushed-down filters:
> {code}
> == Parsed Logical Plan ==
> 'Filter ('c0 = 1)
> +- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
>
> == Analyzed Logical Plan ==
> c0: int
> Filter (c0#7 = 1)
> +- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
>
> == Optimized Logical Plan ==
> Filter (isnotnull(c0#7) AND (c0#7 = 1))
> +- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
>
> == Physical Plan ==
> *(1) Filter (isnotnull(c0#7) AND (c0#7 = 1))
> +- *(1) ColumnarToRow
>    +- BatchScan[c0#7] ParquetScan DataFilters: [isnotnull(c0#7), (c0#7 = 1)], Format: parquet, Location: InMemoryFileIndex(1 paths)[file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff..., PartitionFilters: [], PushedFilters: [IsNotNull(c0), EqualTo(c0,1)], ReadSchema: struct<c0:int> RuntimeFilters: []
> {code}
> See PushedFilters: [IsNotNull(c0), EqualTo(c0,1)] in the BatchScan node.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
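For reference, the reported behavior can be reproduced with a short spark-shell sketch along these lines. The column name `c0` and the two SQL configs come from the description above; the temp path, dataset size, and setting `spark.sql.sources.useV1SourceList` at runtime to force the DSv2 read path are assumptions, not part of the original report.

```scala
// Sketch of a reproducer for SPARK-36889, assuming a spark-shell session
// where `spark` is the SparkSession provided by the shell.
val path = java.nio.file.Files.createTempDirectory("spark-test").toString

// Assumption: route the Parquet read through DSv2 by clearing the V1
// source list (in some builds this must be set before session creation).
spark.conf.set("spark.sql.sources.useV1SourceList", "")

// Disable Parquet filter pushdown, as in the issue description.
spark.conf.set("spark.sql.parquet.filterPushdown", false)

// Write a small Parquet dataset and read it back with a filter on c0.
spark.range(10).toDF("c0").write.mode("overwrite").parquet(path)
val df = spark.read.parquet(path).filter("c0 = 1")

// Before the fix, the physical plan still printed
//   PushedFilters: [IsNotNull(c0), EqualTo(c0,1)]
// even though pushdown was disabled; after SPARK-36889 it no longer should.
df.explain(true)
```

The interesting part of the output is the `PushedFilters` entry in the `BatchScan`/`ParquetScan` node of the physical plan, which should be empty once the fix is applied and pushdown is disabled.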