Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20612#discussion_r169224985
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/codegen/CodeGenerator.scala ---
    @@ -1226,14 +1226,24 @@ class CodegenContext {
     
       /**
        * Register a comment and return the corresponding place holder
    +   *
    +   * @param placeholderId an optionally specified identifier for the comment's placeholder.
    +   *                      The caller should make sure this identifier is unique within the
    +   *                      compilation unit. If this argument is not specified, a fresh identifier
    +   *                      will be automatically created and used as the placeholder.
    +   * @param force whether to force registering the comments
        */
    -  def registerComment(text: => String): String = {
    +   def registerComment(
    +       text: => String,
    +       placeholderId: String = "",
    +       force: Boolean = false): String = {
         // By default, disable comments in generated code because computing the comments themselves can
         // be extremely expensive in certain cases, such as deeply-nested expressions which operate over
         // inputs with wide schemas. For more details on the performance issues that motivated this
         // flag, see SPARK-15680.
    -    if (SparkEnv.get != null && SparkEnv.get.conf.getBoolean("spark.sql.codegen.comments", false)) {
    -      val name = freshName("c")
    +    if (force ||
    +      SparkEnv.get != null && SparkEnv.get.conf.getBoolean("spark.sql.codegen.comments", false)) {
    +      val name = if (placeholderId != "") placeholderId else freshName("c")
    --- End diff ---
    
    although the caller should guarantee `placeholderId` is unique, shall we add a check here to be safe? e.g. `assert(!placeHolderToComments.contains(placeholderId))`


---
