Github user maropu commented on the issue:

    https://github.com/apache/spark/pull/19977
  
    I found that the optimizer rule can't combine nested `concat` calls in a case like this:
    ```
    scala> :paste
    val df = sql("""
    SELECT ((col1 || col2) || (col3 || col4)) col
    FROM (
      SELECT
        encode(string(id), 'utf-8') col1,
        encode(string(id + 1), 'utf-8') col2,
        string(id + 2) col3,
        string(id + 3) col4
      FROM range(10)
    )
    """)
    
    scala> df.explain(true)
    == Parsed Logical Plan ==
    'Project [concat(concat('col1, 'col2), concat('col3, 'col4)) AS col#4]
    +- 'SubqueryAlias __auto_generated_subquery_name
       +- 'Project ['encode('string('id), utf-8) AS col1#0, 'encode('string(('id + 1)), utf-8) AS col2#1, 'string(('id + 2)) AS col3#2, 'string(('id + 3)) AS col4#3]
          +- 'UnresolvedTableValuedFunction range, [10]
    
    == Analyzed Logical Plan ==
    col: string
    Project [concat(cast(concat(col1#0, col2#1) as string), concat(col3#2, col4#3)) AS col#4]
    +- SubqueryAlias __auto_generated_subquery_name
       +- Project [encode(cast(id#9L as string), utf-8) AS col1#0, encode(cast((id#9L + cast(1 as bigint)) as string), utf-8) AS col2#1, cast((id#9L + cast(2 as bigint)) as string) AS col3#2, cast((id#9L + cast(3 as bigint)) as string) AS col4#3]
          +- Range (0, 10, step=1, splits=None)
    
    == Optimized Logical Plan ==
    Project [concat(cast(concat(encode(cast(id#9L as string), utf-8), encode(cast((id#9L + 1) as string), utf-8)) as string), cast((id#9L + 2) as string), cast((id#9L + 3) as string)) AS col#4]
    +- Range (0, 10, step=1, splits=None)
    
    == Physical Plan ==
    *Project [concat(cast(concat(encode(cast(id#9L as string), utf-8), encode(cast((id#9L + 1) as string), utf-8)) as string), cast((id#9L + 2) as string), cast((id#9L + 3) as string)) AS col#4]
    +- *Range (0, 10, step=1, splits=4)
    ```
    Since the inner `concat` is binary, the analyzer wraps it in a cast to string, so the rule no longer sees directly nested `Concat` nodes. Should we support the optimization in this case, too?
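    
    Just to illustrate the idea, here is a minimal, self-contained sketch in plain Scala (a toy expression tree, not the actual Catalyst API; all names here are hypothetical) of a flattening that also looks through a string cast around an inner concat:
    ```scala
    // Toy model of an expression tree; not Spark's Catalyst classes.
    sealed trait Expr
    case class Attr(name: String) extends Expr
    case class CastToString(child: Expr) extends Expr
    case class Concat(children: Seq[Expr]) extends Expr

    object FlattenConcat {
      // Collapse nested Concats. The analyzed plan above shows the blocker:
      // the inner (binary) concat is wrapped in a cast to string, so matching
      // only Concat(Concat(...)) misses it. Pushing the cast down onto the
      // inner children (valid for UTF-8 binary concatenation) lets the two
      // levels merge.
      def flatten(e: Expr): Expr = e match {
        case Concat(children) =>
          Concat(children.flatMap {
            case c: Concat               => asList(flatten(c))
            case CastToString(c: Concat) => asList(flatten(Concat(c.children.map(CastToString(_)))))
            case other                   => Seq(other)
          })
        case other => other
      }

      private def asList(e: Expr): Seq[Expr] = e match {
        case Concat(cs) => cs
        case single     => Seq(single)
      }
    }

    // Mirrors the plan above: concat(cast(concat(col1, col2) as string), concat(col3, col4))
    val nested = Concat(Seq(
      CastToString(Concat(Seq(Attr("col1"), Attr("col2")))),
      Concat(Seq(Attr("col3"), Attr("col4")))))
    println(FlattenConcat.flatten(nested))
    // Concat(List(CastToString(Attr(col1)), CastToString(Attr(col2)), Attr(col3), Attr(col4)))
    ```
    If the rule handled the `Cast(Concat(...), StringType)` case along these lines, the optimized plan above could become a single four-child `concat`.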

