Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22847#discussion_r228483780

    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -812,6 +812,17 @@ object SQLConf {
           .intConf
           .createWithDefault(65535)

    +    val CODEGEN_METHOD_SPLIT_THRESHOLD = buildConf("spark.sql.codegen.methodSplitThreshold")
    +      .internal()
    +      .doc("The maximum source code length of a single Java function by codegen. When the " +
    +        "generated Java function source code exceeds this threshold, it will be split into " +
    +        "multiple small functions, each function length is spark.sql.codegen.methodSplitThreshold." +
    --- End diff --

    `each function length is spark.sql.codegen.methodSplitThreshold`: this is not true; the method size is always larger than the threshold.

    cc @kiszk any idea about the naming and description of this config?
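The reviewer's objection can be illustrated with a simplified splitting loop (a sketch of the general technique, not Spark's actual codegen splitter): statements are accumulated until the running body length *exceeds* the threshold, and only then is a new function started. So every emitted function body (except possibly the last) is larger than the threshold, never exactly equal to it.

```python
def split_into_functions(statements, threshold):
    """Greedily group statements into function bodies.

    A new body is started only after the current one's length exceeds
    `threshold`, so each emitted body (except possibly the last) is
    strictly larger than the threshold. Simplified illustration only;
    this is not Spark's real code-splitting implementation.
    """
    functions, current, length = [], [], 0
    for stmt in statements:
        current.append(stmt)
        length += len(stmt)
        if length > threshold:
            functions.append("".join(current))
            current, length = [], 0
    if current:
        functions.append("".join(current))
    return functions


bodies = split_into_functions(["int a = 1;"] * 10, threshold=25)
# Every non-final body exceeds the threshold, contradicting the doc's
# claim that "each function length is [the threshold]".
assert all(len(b) > 25 for b in bodies[:-1])
```

This is why a more accurate doc string would say that splitting is *triggered* when the accumulated code exceeds the threshold, rather than claiming each resulting function has exactly the threshold length.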