[ https://issues.apache.org/jira/browse/SPARK-33229?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-33229:
------------------------------------

    Assignee: Apache Spark

> UnsupportedOperationException when group by with cube
> -----------------------------------------------------
>
>                 Key: SPARK-33229
>                 URL: https://issues.apache.org/jira/browse/SPARK-33229
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0, 3.0.1, 3.1.0
>            Reporter: Yuming Wang
>            Assignee: Apache Spark
>            Priority: Major
>
> How to reproduce this issue:
> {code:sql}
> create table test_cube using parquet as select id as a, id as b, id as c from range(10);
> select a, b, c, count(*) from test_cube group by 1, cube(2, 3);
> {code}
> {noformat}
> spark-sql> select a, b, c, count(*) from test_cube group by 1, cube(2, 3);
> 20/10/23 06:31:51 ERROR SparkSQLDriver: Failed in [select a, b, c, count(*) from test_cube group by 1, cube(2, 3)]
> java.lang.UnsupportedOperationException
>       at org.apache.spark.sql.catalyst.expressions.GroupingSet.dataType(grouping.scala:35)
>       at org.apache.spark.sql.catalyst.expressions.GroupingSet.dataType$(grouping.scala:35)
>       at org.apache.spark.sql.catalyst.expressions.Cube.dataType(grouping.scala:60)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkValidGroupingExprs$1(CheckAnalysis.scala:268)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$12(CheckAnalysis.scala:284)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$12$adapted(CheckAnalysis.scala:284)
>       at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
>       at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
>       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:284)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1$adapted(CheckAnalysis.scala:92)
>       at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:177)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:92)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:89)
>       at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:130)
>       at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:156)
>       at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
>       at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:153)
>       at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:68)
>       at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
>       at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:133)
>       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
>       at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:133)
>       at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:68)
>       at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:66)
>       at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:58)
>       at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
> {noformat}
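>
> The stack trace shows the analyzer calling dataType on the unevaluable Cube expression (GroupingSet.dataType, grouping.scala:35) while checking the grouping expressions, which is what raises the UnsupportedOperationException when a regular grouping expression is mixed with cube(...) in the same GROUP BY. With the default spark.sql.groupByOrdinal=true the ordinals resolve against the select list, so the query is equivalent to "group by a, cube(b, c)". A possible workaround on the affected versions (a sketch, assuming that is the intended grouping) is to expand the cube by hand with GROUPING SETS:
> {code:sql}
> -- Workaround sketch: expand "group by a, cube(b, c)" manually.
> -- cube(b, c) enumerates every subset of {b, c}; with the fixed key a the
> -- grouping sets are (a, b, c), (a, b), (a, c) and (a).
> select a, b, c, count(*) from test_cube
> group by grouping sets ((a, b, c), (a, b), (a, c), (a));
> {code}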


