[ https://issues.apache.org/jira/browse/SPARK-34260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-34260:
------------------------------------

    Assignee:     (was: Apache Spark)

> UnresolvedException when creating temp view twice
> -------------------------------------------------
>
>                 Key: SPARK-34260
>                 URL: https://issues.apache.org/jira/browse/SPARK-34260
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.2, 3.1.2
>            Reporter: Linhong Liu
>            Priority: Major
>
> When creating a temp view twice, an UnresolvedException is thrown. Queries to reproduce:
> {code:java}
> sql("create or replace temp view v as select * from (select * from 
> range(10))")
> sql("create or replace temp view v as select * from (select * from 
> range(10))")
> {code}
> error message:
> {noformat}
> org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to toAttribute on unresolved object, tree: *
>         at org.apache.spark.sql.catalyst.analysis.Star.toAttribute(unresolved.scala:295)
>         at org.apache.spark.sql.catalyst.plans.logical.Project.$anonfun$output$1(basicLogicalOperators.scala:62)
>         at scala.collection.immutable.List.map(List.scala:293)
>         at org.apache.spark.sql.catalyst.plans.logical.Project.output(basicLogicalOperators.scala:62)
>         at org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias.output(basicLogicalOperators.scala:945)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$allAttributes$1(QueryPlan.scala:431)
>         at scala.collection.immutable.List.flatMap(List.scala:366)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.allAttributes$lzycompute(QueryPlan.scala:431)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.allAttributes(QueryPlan.scala:431)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$doCanonicalize$2(QueryPlan.scala:404)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:116)
>         at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:73)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:116)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:127)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$3(QueryPlan.scala:132)
>         at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
>         at scala.collection.immutable.List.foreach(List.scala:431)
>         at scala.collection.TraversableLike.map(TraversableLike.scala:286)
>         at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
>         at scala.collection.immutable.List.map(List.scala:305)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:132)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$4(QueryPlan.scala:137)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:137)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.doCanonicalize(QueryPlan.scala:389)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized$lzycompute(QueryPlan.scala:373)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized(QueryPlan.scala:372)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.sameResult(QueryPlan.scala:420)
>         at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:118)
>         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
>         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
>         at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
>         at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
>         at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3699)
>         at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
>         at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>         at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
>         at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
>         at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
>         at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3697)
>         at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
>         at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
>         at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
>         at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
>         at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
>         at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
>         at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
> {noformat}
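
From the trace, the second CREATE fails inside CreateViewCommand.run while sameResult canonicalizes a plan that still contains an unresolved star (Star.toAttribute). Below is a minimal, self-contained sketch of the repro; the intermediate DROP is only an untested workaround guess (an assumption, not taken from this report), included to show how one might avoid the comparison against the existing view definition.

{code:scala}
// Minimal sketch, assuming a local SparkSession. The "drop view" step is a
// hypothetical workaround, not a fix from this report: it removes the existing
// temp view so the second CREATE has no stored plan to compare against.
import org.apache.spark.sql.SparkSession

object Spark34260Repro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("SPARK-34260 repro")
      .getOrCreate()

    // First creation succeeds.
    spark.sql("create or replace temp view v as select * from (select * from range(10))")

    // Hypothetical workaround (assumption): drop the view before recreating it.
    spark.sql("drop view if exists v")

    // Without the DROP above, this second call hits the UnresolvedException shown in the trace.
    spark.sql("create or replace temp view v as select * from (select * from range(10))")

    spark.stop()
  }
}
{code}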


