[ https://issues.apache.org/jira/browse/SPARK-34252?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17272437#comment-17272437 ]

Dongjoon Hyun commented on SPARK-34252:
---------------------------------------

Hi, [~imback82]. I cannot reproduce the error.

Could you double-check the example again?

*Spark 3.1.1 RC1*
{code:java}
scala> sql("create temporary view ta(a, b) as select 1, 2")

scala> sql("create temporary view tc(c, d) as select 1, 2")

scala> sql("select a, (select sum(d) from tc where a = c) sum_d from ta group 
by 1, 2").show
+---+-----+
|  a|sum_d|
+---+-----+
|  1|    2|
+---+-----+

scala> spark.version
res3: String = 3.1.1{code}
*Spark 3.2.0*
{code:java}
scala> sql("create temporary view ta(a, b) as select 1, 2")

scala> sql("create temporary view tc(c, d) as select 1, 2")

scala> sql("select a, (select sum(d) from tc where a = c) sum_d from ta group 
by 1, 2").show
+---+-----+
|  a|sum_d|
+---+-----+
|  1|    2|
+---+-----+

scala> spark.version
res3: String = 3.2.0-SNAPSHOT {code}
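
For anyone verifying locally, here is the same repro packaged as a self-contained application (my own sketch, not from the ticket; the local-mode master and app name are assumptions):
{code:java}
// Self-contained packaging of the snippet from the ticket. The local[1] master
// and the app name are illustrative choices, not part of the original report.
import org.apache.spark.sql.SparkSession

object Spark34252Repro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("SPARK-34252 repro")
      .getOrCreate()

    spark.sql("create temporary view ta(a, b) as select 1, 2")
    spark.sql("create temporary view tc(c, d) as select 1, 2")

    // Reported to fail on 3.1.0 with "This method should not be called in the
    // analyzer"; succeeds on 3.1.1 RC1 and 3.2.0-SNAPSHOT in the runs above.
    spark.sql("select a, (select sum(d) from tc where a = c) sum_d from ta group by 1, 2").show()

    println(s"Spark version: ${spark.version}")
    spark.stop()
  }
}
{code}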

> View subqueries in aggregate's grouping expression fail during the analysis check
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-34252
>                 URL: https://issues.apache.org/jira/browse/SPARK-34252
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Terry Kim
>            Priority: Major
>
> To repro:
> {code:java}
> sql("create temporary view ta(a, b) as select 1, 2")
> sql("create temporary view tc(c, d) as select 1, 2")
> sql("select a, (select sum(d) from tc where a = c) sum_d from ta group by 1, 
> 2").show
> {code}
> fails with:
> {code:java}
> This method should not be called in the analyzer
> java.lang.RuntimeException: This method should not be called in the analyzer
>       at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.assertNotAnalysisRule(AnalysisHelper.scala:159)
>       at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.assertNotAnalysisRule$(AnalysisHelper.scala:155)
>       at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.assertNotAnalysisRule(LogicalPlan.scala:29)
>       at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformUp(AnalysisHelper.scala:179)
>       at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformUp$(AnalysisHelper.scala:178)
>       at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformUp(LogicalPlan.scala:29)
>       at org.apache.spark.sql.catalyst.analysis.EliminateView$.apply(view.scala:56)
>       at org.apache.spark.sql.catalyst.plans.logical.View.doCanonicalize(basicLogicalOperators.scala:485)
>       at org.apache.spark.sql.catalyst.plans.logical.View.doCanonicalize(basicLogicalOperators.scala:458)
>       at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized$lzycompute(QueryPlan.scala:373)
>       at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized(QueryPlan.scala:372)
>       at org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias.doCanonicalize(basicLogicalOperators.scala:953)
>       at org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias.doCanonicalize(basicLogicalOperators.scala:936)
>       at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized$lzycompute(QueryPlan.scala:373)
>       at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized(QueryPlan.scala:372)
>       at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$doCanonicalize$1(QueryPlan.scala:387)
>       at scala.collection.immutable.List.map(List.scala:293)
> ...
> {code}
> This works fine in Spark 3.0; the issue appears to have been introduced by https://github.com/apache/spark/pull/30567, which added View.doCanonicalize().


