[ https://issues.apache.org/jira/browse/SPARK-7551?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin updated SPARK-7551:
-------------------------------
    Description: 
DataFrame's resolve:
{code}
protected[sql] def resolve(colName: String): NamedExpression = {
  queryExecution.analyzed.resolve(
      colName.split("\\."), sqlContext.analyzer.resolver).getOrElse {
    throw new AnalysisException(
      s"""Cannot resolve column name "$colName" among (${schema.fieldNames.mkString(", ")})""")
  }
}
{code}
We should not split the parts quoted by backticks (`). For example, `ab.cd`.`efg` should be split into two parts: "ab.cd" and "efg".

  was:
DataFrame's resolve:
{code}
protected[sql] def resolve(colName: String): NamedExpression = {
  queryExecution.analyzed.resolve(
      colName.split("\\."), sqlContext.analyzer.resolver).getOrElse {
    throw new AnalysisException(
      s"""Cannot resolve column name "$colName" among (${schema.fieldNames.mkString(", ")})""")
  }
}
{code}
We should not split the parts quoted by backticks (`).


> Don't split by dot if within backticks for DataFrame attribute resolution
> --------------------------------------------------------------------------
>
>                 Key: SPARK-7551
>                 URL: https://issues.apache.org/jira/browse/SPARK-7551
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Reynold Xin
>            Assignee: Wenchen Fan
>            Priority: Critical
>
> DataFrame's resolve:
> {code}
> protected[sql] def resolve(colName: String): NamedExpression = {
>   queryExecution.analyzed.resolve(
>       colName.split("\\."), sqlContext.analyzer.resolver).getOrElse {
>     throw new AnalysisException(
>       s"""Cannot resolve column name "$colName" among (${schema.fieldNames.mkString(", ")})""")
>   }
> }
> {code}
> We should not split the parts quoted by backticks (`).
> For example, `ab.cd`.`efg` should be split into two parts: "ab.cd" and "efg".
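For illustration only, a minimal standalone sketch of backtick-aware splitting (this is not the actual Spark patch; ColumnNameParser and parseAttributeName here are hypothetical names, and escaped backticks such as ``...`` are not handled):

{code}
import scala.collection.mutable.ArrayBuffer

// Hypothetical sketch: split a column name on dots, but never inside backticks,
// so "`ab.cd`.`efg`" becomes the two parts "ab.cd" and "efg".
object ColumnNameParser {
  def parseAttributeName(name: String): Seq[String] = {
    val parts = ArrayBuffer.empty[String]
    val current = new StringBuilder
    var inBacktick = false
    for (c <- name) {
      if (c == '`') {
        // Toggle quoted mode; the backticks themselves are not kept in the part.
        inBacktick = !inBacktick
      } else if (c == '.' && !inBacktick) {
        // An unquoted dot ends the current part.
        parts += current.toString
        current.clear()
      } else {
        current.append(c)
      }
    }
    parts += current.toString
    parts.toSeq
  }

  def main(args: Array[String]): Unit = {
    println(parseAttributeName("`ab.cd`.`efg`"))  // two parts: ab.cd, efg
    println(parseAttributeName("a.b.c"))          // three parts: a, b, c
  }
}
{code}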