viirya commented on a change in pull request #29950:
URL: https://github.com/apache/spark/pull/29950#discussion_r503666979



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1926,6 +1926,19 @@ object SQLConf {
     .booleanConf
     .createWithDefault(true)
 
+  val MAX_COMMON_EXPRS_IN_COLLAPSE_PROJECT =
+    buildConf("spark.sql.optimizer.maxCommonExprsInCollapseProject")
+      .doc("An integer number indicates the maximum allowed number of a common 
expression " +

Review comment:
       Not exactly the same, but I revised the doc.
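
       For context, a minimal sketch (a hypothetical example, not code from this PR) of the duplication that `CollapseProject` can introduce and that this config is meant to bound. The UDF, column names, and SparkSession setup below are illustrative assumptions; the plan-level inlining shown is the substitution described in the config doc, where merging adjacent Projects can duplicate an expensive expression many times.

    // Hypothetical illustration of duplicated common expressions after CollapseProject.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.udf

    val spark = SparkSession.builder().master("local[*]").appName("collapse-demo").getOrCreate()
    import spark.implicits._

    // A deterministic but expensive function (assumed for illustration).
    val expensive = udf((i: Int) => { Thread.sleep(1); i * 2 })

    val df = Seq(1, 2, 3).toDF("col")
      .select(expensive($"col").as("e"))                                      // lower Project: one expensive call
      .select(($"e" + 1).as("a"), ($"e" + 2).as("b"), ($"e" + 3).as("c"))     // upper Project: three references

    // When the two Projects are collapsed, the alias `e` is substituted into each
    // reference, so the optimized plan carries three copies of the expensive call,
    // roughly:
    //   Project [(expensive(col) + 1) AS a, (expensive(col) + 2) AS b, (expensive(col) + 3) AS c]
    df.explain(true)

       With the proposed config, once an expression would be duplicated more than the configured number of times, the two Projects are left unmerged.
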

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1926,6 +1926,19 @@ object SQLConf {
     .booleanConf
     .createWithDefault(true)
 
+  val MAX_COMMON_EXPRS_IN_COLLAPSE_PROJECT =
+    buildConf("spark.sql.optimizer.maxCommonExprsInCollapseProject")
+      .doc("An integer number indicates the maximum allowed number of a common 
expression " +
+        "can be collapsed into upper Project from lower Project by optimizer 
rule " +
+        "`CollapseProject`. Normally `CollapseProject` will collapse adjacent 
Project " +
+        "and merge expressions. But in some edge cases, expensive expressions 
might be " +
+        "duplicated many times in merged Project by this optimization. This 
config sets " +
+        "a maximum number. Once an expression is duplicated more than this 
number " +
+        "if merging two Project, Spark SQL will skip the merging.")
+      .version("3.1.0")
+      .intConf

Review comment:
       Added, thanks.
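
       For readers following the thread, a rough sketch of how the quoted entry might read once completed below `.intConf`. The `.checkValue` guard and the default value of 20 are assumptions for illustration (the quoted diff stops at `.intConf`, and the exact addition referenced above is not shown); the doc text is paraphrased from the wording quoted in the diff.

    // Sketch of a completed config entry inside object SQLConf (check and default are assumed).
    val MAX_COMMON_EXPRS_IN_COLLAPSE_PROJECT =
      buildConf("spark.sql.optimizer.maxCommonExprsInCollapseProject")
        .doc("The maximum number of times a common expression from a lower Project may be " +
          "duplicated into the upper Project when the `CollapseProject` rule merges two " +
          "adjacent Projects. If merging would duplicate an expression more than this " +
          "many times, Spark SQL skips the merge.")
        .version("3.1.0")
        .intConf
        .checkValue(_ > 0, "The maximum must be a positive integer.")  // assumed validation
        .createWithDefault(20)                                         // assumed default
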



