[ https://issues.apache.org/jira/browse/SPARK-25579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-25579:
----------------------------------
    Description: 
This issue aims to fix an ORC performance regression in the Spark 2.4.0 RCs relative to Spark 2.3.2. For column names containing `.`, the pushed predicates are ignored.

*Spark 2.3.2*
{code:java}
scala> val df = spark.range(Int.MaxValue).sample(0.2).toDF("col.with.dot")
scala> df.write.mode("overwrite").orc("/tmp/orc")
scala> spark.sql("set spark.sql.orc.impl=native")
scala> spark.sql("set spark.sql.orc.filterPushdown=true")
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
Time taken: 803 ms
{code}

*Spark 2.4.0 RC2*
{code:java}
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
Time taken: 2405 ms
{code}

  was:
This issue aims to fix an ORC performance regression in the Spark 2.4.0 RCs relative to Spark 2.3.2. For column names containing `.`, the pushed predicates are ignored.

*Spark 2.3.2*
{code:java}
scala> val df = spark.range(Int.MaxValue).sample(0.2).toDF("col.with.dot")
scala> df.write.mode("overwrite").orc("/tmp/orc")
scala> df.write.mode("overwrite").parquet("/tmp/parquet")
scala> spark.sql("set spark.sql.orc.impl=native")
scala> spark.sql("set spark.sql.orc.filterPushdown=true")
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
Time taken: 803 ms

scala> spark.time(spark.read.parquet("/tmp/parquet").where("`col.with.dot` = 50000").count)
Time taken: 5573 ms
{code}

*Spark 2.4.0 RC2*
{code:java}
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
Time taken: 2405 ms
{code}


> Use quoted attribute names if needed in pushed ORC predicates
> -------------------------------------------------------------
>
>                 Key: SPARK-25579
>                 URL: https://issues.apache.org/jira/browse/SPARK-25579
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Critical
>
> This issue aims to fix an ORC performance regression in the Spark 2.4.0 RCs relative to Spark 2.3.2. For column names containing `.`, the pushed predicates are ignored.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
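The fix named in the issue title (use quoted attribute names if needed in pushed ORC predicates) can be sketched in Scala as follows. This is an illustrative standalone sketch, not Spark's actual `OrcFilters` code, and `quoteIfNeeded` is a hypothetical helper name. The idea: an unquoted `col.with.dot` inside a pushed-down predicate is read as a nested field reference (`col` → `with` → `dot`), so the pushdown silently fails to match; wrapping such names in backticks before building the predicate preserves the literal column name.

```scala
// Illustrative sketch only: quote an attribute name before embedding it in a
// pushed ORC predicate, so a literal `.` in the column name is not mistaken
// for a struct-field separator. `quoteIfNeeded` is a hypothetical helper name,
// not Spark's actual API.
object OrcPredicateQuoting {
  def quoteIfNeeded(name: String): String =
    if (name.contains("`")) {
      // Escape embedded backticks by doubling them, then wrap the whole name.
      s"`${name.replace("`", "``")}`"
    } else if (name.contains(".")) {
      // A dot would otherwise be parsed as a field separator: quote the name.
      s"`$name`"
    } else {
      // Plain identifiers pass through unchanged.
      name
    }

  def main(args: Array[String]): Unit = {
    println(quoteIfNeeded("col.with.dot")) // `col.with.dot`
    println(quoteIfNeeded("id"))           // id
  }
}
```

With quoting applied, the predicate handed to the ORC reader refers to the single top-level column `col.with.dot`, so the filter can be pushed down instead of being dropped, which is what restores the 2.3.2-level timing shown above.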