[ https://issues.apache.org/jira/browse/SPARK-34763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-34763.
---------------------------------
    Fix Version/s: 3.0.3
                   3.1.2
                   3.2.0
       Resolution: Fixed

> col(), $"<name>" and df("name") should handle quoted column names properly.
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-34763
>                 URL: https://issues.apache.org/jira/browse/SPARK-34763
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.7, 3.0.2, 3.2.0, 3.1.1
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>            Priority: Major
>             Fix For: 3.2.0, 3.1.2, 3.0.3
>
>
> Quoted column names like `a``b.c` cannot be represented with col(), $"<name>"
> and df("name") because these methods don't handle such column names properly.
> For example, suppose we have the following DataFrame:
> {code}
> val df1 = spark.sql("SELECT 'col1' AS `a``b.c`")
> {code}
> Against this DataFrame, the following query executes successfully:
> {code}
> scala> df1.selectExpr("`a``b.c`").show
> +-----+
> |a`b.c|
> +-----+
> | col1|
> +-----+
> {code}
> But the following query fails because df1("`a``b.c`") throws an exception:
> {code}
> scala> df1.select(df1("`a``b.c`")).show
> org.apache.spark.sql.AnalysisException: syntax error in attribute name: `a``b.c`;
>   at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.e$1(unresolved.scala:152)
>   at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.parseAttributeName(unresolved.scala:162)
>   at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveQuoted(LogicalPlan.scala:121)
>   at org.apache.spark.sql.Dataset.resolve(Dataset.scala:221)
>   at org.apache.spark.sql.Dataset.col(Dataset.scala:1274)
>   at org.apache.spark.sql.Dataset.apply(Dataset.scala:1241)
>   ... 49 elided
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
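For context on the escaping rule involved: inside a backtick-quoted identifier, a doubled backtick (``` `` ```) denotes a literal backtick, and dots inside the quotes are part of the name rather than part separators. Below is a minimal sketch of that rule in Python; `parse_attribute_name` is a hypothetical, simplified illustration, not Spark's actual `UnresolvedAttribute.parseAttributeName` implementation.

```python
def parse_attribute_name(name):
    """Split an attribute name like `a``b.c` into name parts.

    Simplified sketch of the quoting rules: backticks quote a part,
    a doubled backtick inside quotes is a literal backtick, and dots
    separate parts only outside of backticks.
    """
    parts, buf = [], []
    in_backtick = False
    i = 0
    while i < len(name):
        c = name[i]
        if in_backtick:
            if c == "`":
                if i + 1 < len(name) and name[i + 1] == "`":
                    buf.append("`")  # `` -> literal backtick
                    i += 1
                else:
                    in_backtick = False  # closing backtick
            else:
                buf.append(c)  # dots inside quotes are literal
        else:
            if c == "`":
                in_backtick = True
            elif c == ".":
                parts.append("".join(buf))  # part separator
                buf = []
            else:
                buf.append(c)
        i += 1
    if in_backtick:
        raise ValueError(f"syntax error in attribute name: {name}")
    parts.append("".join(buf))
    return parts


print(parse_attribute_name("`a``b.c`"))  # ['a`b.c'] - one part, literal backtick and dot
print(parse_attribute_name("a.b"))       # ['a', 'b'] - dot splits unquoted names
```

Under this rule, `` `a``b.c` `` resolves to the single column name ``a`b.c``, which matches the `selectExpr` output in the report above.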