[ https://issues.apache.org/jira/browse/SPARK-36290?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17612017#comment-17612017 ]
Apache Spark commented on SPARK-36290:
--------------------------------------

User 'wangyum' has created a pull request for this issue:
https://github.com/apache/spark/pull/38071

> Push down join condition evaluation
> -----------------------------------
>
>                 Key: SPARK-36290
>                 URL: https://issues.apache.org/jira/browse/SPARK-36290
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> {code:scala}
> val numRows = 1024 * 1024 * 15
> spark.sql(s"CREATE TABLE t1 using parquet AS select id as a, id as b from range(${numRows}L)")
> spark.sql(s"CREATE TABLE t2 using parquet AS select id as a, id as b from range(${numRows}L)")
> val benchmark = new Benchmark("Benchmark push down join condition evaluation", numRows, minNumIters = 5)
> Seq(false, true).foreach { pushDownEnabled =>
>   val name = s"Join Condition Evaluation ${if (pushDownEnabled) "(Pushdown)" else ""}"
>   benchmark.addCase(name) { _ =>
>     withSQLConf("spark.sql.pushDownJoinConditionEvaluation" -> s"$pushDownEnabled") {
>       spark.sql("SELECT t1.* FROM t1 JOIN t2 ON translate(t1.a, '123', 'abc') = translate(t2.a, '123', 'abc')")
>         .write.format("noop").mode("Overwrite").save()
>     }
>   }
> }
> benchmark.run()
> {code}
> {noformat}
> Java HotSpot(TM) 64-Bit Server VM 1.8.0_251-b08 on Mac OS X 10.15.7
> Intel(R) Core(TM) i9-9980HK CPU @ 2.40GHz
> Benchmark push down join condition evaluation:  Best Time(ms)   Avg Time(ms)   Stdev(ms)   Rate(M/s)   Per Row(ns)   Relative
> -----------------------------------------------------------------------------------------------------------------------------
> Join Condition Evaluation                               32459          34521        1465         0.5        2063.7       1.0X
> Join Condition Evaluation (Pushdown)                    19483          20350         812         0.8        1238.7       1.7X
> {noformat}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
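The ~1.7x speedup comes from evaluating the expensive `translate` expression once per input row instead of once for every compared pair inside the join condition. A minimal, Spark-free Scala sketch of that idea follows; the `expensiveKey` helper and the evaluation counter are illustrative stand-ins, not part of the proposed optimizer rule:

```scala
object PushDownSketch {
  var evals = 0

  // Stand-in for an expensive expression like translate(a, '123', 'abc');
  // the counter makes the number of evaluations observable.
  def expensiveKey(s: String): String = {
    evals += 1
    s.map {
      case '1' => 'a'
      case '2' => 'b'
      case '3' => 'c'
      case c   => c
    }
  }

  def main(args: Array[String]): Unit = {
    val t1 = Seq("123", "456", "103")
    val t2 = Seq("123", "789", "103")

    // Expression evaluated inside the join condition:
    // a nested-loop join runs it 2 times per row pair (3 * 3 * 2 = 18).
    evals = 0
    val naive = for (x <- t1; y <- t2 if expensiveKey(x) == expensiveKey(y)) yield (x, y)
    val naiveEvals = evals

    // Pushed below the join: evaluate once per row (3 + 3 = 6),
    // then join on the precomputed key.
    evals = 0
    val left   = t1.map(x => (expensiveKey(x), x))
    val right  = t2.map(y => (expensiveKey(y), y)).groupBy(_._1)
    val pushed = for ((k, x) <- left; (_, y) <- right.getOrElse(k, Nil)) yield (x, y)
    val pushedEvals = evals

    assert(naive.toSet == pushed.toSet) // same join result
    assert(pushedEvals < naiveEvals)    // 6 vs 18 evaluations here
    println(s"naive=$naiveEvals pushed=$pushedEvals matches=${pushed.size}")
  }
}
```

The same row-count and key distribution yield identical join output either way; only the number of expression evaluations changes, which is what the benchmark above measures at scale.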