[ https://issues.apache.org/jira/browse/SPARK-33338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-33338.
-----------------------------------
    Fix Version/s: 2.4.8
                   3.0.2
                   3.1.0
       Resolution: Fixed

Issue resolved by pull request 30246
[https://github.com/apache/spark/pull/30246]

> GROUP BY using literal map should not fail
> ------------------------------------------
>
>                 Key: SPARK-33338
>                 URL: https://issues.apache.org/jira/browse/SPARK-33338
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.3, 2.2.3, 2.3.4, 2.4.7, 3.0.1, 3.1.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>             Fix For: 3.1.0, 3.0.2, 2.4.8
>
> Apache Spark 2.x through 3.0.1 raises a `RuntimeException` for the following queries.
> *SQL*
> {code}
> CREATE TABLE t USING ORC AS SELECT map('k1', 'v1') m, 'k1' k
> SELECT map('k1', 'v1')[k] FROM t GROUP BY 1
> SELECT map('k1', 'v1')[k] FROM t GROUP BY map('k1', 'v1')[k]
> SELECT map('k1', 'v1')[k] a FROM t GROUP BY a
> {code}
> *ERROR*
> {code}
> Caused by: java.lang.RuntimeException: Couldn't find k#3 in [keys: [k1], values: [v1][k#3]#6]
>   at scala.sys.package$.error(package.scala:27)
>   at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1$$anonfun$applyOrElse$1.apply(BoundAttribute.scala:85)
>   at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1$$anonfun$applyOrElse$1.apply(BoundAttribute.scala:79)
>   at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
> {code}
> This is a regression from Apache Spark 1.6.x.
> {code}
> scala> sc.version
> res1: String = 1.6.3
>
> scala> sqlContext.sql("SELECT map('k1', 'v1')[k] FROM t GROUP BY map('k1', 'v1')[k]").show
> +---+
> |_c0|
> +---+
> | v1|
> +---+
> {code}
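Until you can upgrade to a fixed release (2.4.8, 3.0.2, or 3.1.0), the sketch below shows the reproduction in spark-shell plus a possible workaround. It assumes the usual `spark` session is in scope and reuses the table `t` created above; the subquery rewrite is a suggested workaround based on the failure mode (the aggregate failing to bind the key reference inside the literal map), not something verified in this ticket.

{code}
// Minimal sketch, assuming a spark-shell on an affected version
// (2.0.x - 3.0.1) with the usual `spark` session in scope.
// Reuses the table `t` from the reproduction above.
spark.sql("CREATE TABLE t USING ORC AS SELECT map('k1', 'v1') m, 'k1' k")

// On affected versions this fails with the RuntimeException shown above:
//   spark.sql("SELECT map('k1', 'v1')[k] FROM t GROUP BY 1").show()

// Suggested workaround (an assumption, not from this ticket): evaluate the
// map lookup inside an aliased subquery first, so the outer GROUP BY only
// references a plain column instead of the literal map expression.
spark.sql("""
  SELECT v
  FROM (SELECT map('k1', 'v1')[k] AS v FROM t) tmp
  GROUP BY v
""").show()
// Expected result (matching the Spark 1.6.3 output above):
// +---+
// |  v|
// +---+
// | v1|
// +---+
{code}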