Kris Mok created SPARK-30795:
--------------------------------

             Summary: Spark SQL codegen's code() interpolator should treat escapes like Scala's StringContext.s()
                 Key: SPARK-30795
                 URL: https://issues.apache.org/jira/browse/SPARK-30795
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.4.5, 2.4.4, 2.4.3, 2.4.2, 2.4.1, 2.4.0, 3.0.0
            Reporter: Kris Mok


The {{code()}} string interpolator in Spark SQL's code generator should handle escapes the same way Scala's built-in {{StringContext.s()}} interpolator does: escape sequences in the literal code parts should be processed, while escapes in the interpolated arguments should be left untouched.

For example,
{code}
val arg = "This is an argument."
val str = s"This is string part 1. $arg This is string part 2."
val code = code"This is string part 1. $arg This is string part 2."
assert(code.toString == str)
{code}
We should expect the {{code()}} interpolator to produce the same result as the 
{{StringContext.s()}} interpolator: only escapes in the string parts are 
processed, while the args are kept verbatim.
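To make that concrete, here's an illustrative snippet (not from Spark's code base; {{code()}} here stands for the intended behavior):
{code}
// Illustrative only: escapes in the string part are processed,
// while escapes inside the arg pass through verbatim.
val arg = "\\n"              // two characters: backslash + 'n'
val s1 = s"a\n$arg"          // the part's \n becomes a real newline; arg is untouched
assert(s1 == "a\n" + "\\n")
// code"a\n$arg" should behave the same way: a real newline from the
// code part, followed by the arg's literal backslash-n.
{code}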

But in the current implementation, because the code parts and literal input 
args are eagerly folded together before escape processing, escapes are 
incorrectly processed in both the code parts and the literal args.
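As a standalone illustration of that failure mode (using {{StringContext.treatEscapes}} as a stand-in for the interpolator's escape handling, not Spark's actual code):
{code}
// Hypothetical: folding the arg into the string first, then processing
// escapes, rewrites escape sequences that came from the arg.
val arg = """\\"""                              // two backslashes
val folded = s"x = '$arg';"                     // arg folded into the code part
val buggy = StringContext.treatEscapes(folded)  // escapes applied to everything
assert(buggy == "x = '\\';")                    // the arg's two backslashes collapsed to one
{code}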
That is a problem whenever an arg contains escape sequences that must be 
preserved verbatim in the generated code string. For example, {{Like}} 
expression's codegen contains an ugly workaround for this bug:
{code}
      // We need double escape to avoid org.codehaus.commons.compiler.CompileException.
      // '\\' will cause exception 'Single quote must be backslash-escaped in character literal'.
      // '\"' will cause exception 'Line break in literal not allowed'.
      val newEscapeChar = if (escapeChar == '\"' || escapeChar == '\\') {
        s"""\\\\\\$escapeChar"""
      } else {
        escapeChar
      }
{code}
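A minimal sketch of the intended semantics (simplified to return a plain {{String}}; Spark's real interpolator builds a {{Block}} and validates arg types) would process escapes in each code part and splice the args verbatim:
{code}
// Sketch only, mirroring StringContext.s(): process escapes in each
// literal part, keep args verbatim. Not Spark's actual implementation.
object CodeInterpolator {
  implicit class CodeHelper(private val sc: StringContext) extends AnyVal {
    def code(args: Any*): String = {
      sc.checkLengths(args)
      val parts = sc.parts.iterator
      val sb = new StringBuilder(StringContext.treatEscapes(parts.next()))
      args.foreach { arg =>
        sb.append(arg.toString)                             // arg spliced verbatim
        sb.append(StringContext.treatEscapes(parts.next())) // escapes only in parts
      }
      sb.toString
    }
  }
}
{code}
With part-only escape processing along these lines, the arg would presumably only need the single level of escaping that the generated Java source itself requires, instead of the double escaping shown in the workaround above.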


