cloud-fan commented on a change in pull request #31273:
URL: https://github.com/apache/spark/pull/31273#discussion_r580824030



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
##########
@@ -541,40 +552,31 @@ object ViewHelper {
   }
 
   /**
-   * Collect all temporary views and functions and return the identifiers separately
-   * This func traverses the unresolved plan `child`. Below are the reasons:
-   * 1) Analyzer replaces unresolved temporary views by a SubqueryAlias with the corresponding
-   * logical plan. After replacement, it is impossible to detect whether the SubqueryAlias is
-   * added/generated from a temporary view.
-   * 2) The temp functions are represented by multiple classes. Most are inaccessible from this
-   * package (e.g., HiveGenericUDF).
+   * Collect all temporary views and functions and return the identifiers separately.
    */
   private def collectTemporaryObjects(
       catalog: SessionCatalog, child: LogicalPlan): (Seq[Seq[String]], Seq[String]) = {
     def collectTempViews(child: LogicalPlan): Seq[Seq[String]] = {
       child.flatMap {
-        case UnresolvedRelation(nameParts, _, _) if catalog.isTempView(nameParts) =>
-          Seq(nameParts)
-        case w: With if !w.resolved => w.innerChildren.flatMap(collectTempViews)
-        case plan if !plan.resolved => plan.expressions.flatMap(_.flatMap {
+        case s @ SubqueryAlias(_, view: View) if view.isTempView =>

Review comment:
       BTW, does a temp view come wrapped in a `SubqueryAlias` here? What's the qualifier in that case?
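
For context, a minimal standalone sketch of the collection pattern the new match arm relies on. These are simplified stand-in classes, not the real Catalyst nodes (`nameParts`, `Project`, `LeafRelation`, and the mock `SubqueryAlias`/`View` shapes are illustrative only), and it assumes the analyzer leaves a resolved temp view wrapped as a `SubqueryAlias` over a `View` with `isTempView = true`, which is what the added `+` line matches on:

```scala
// Simplified stand-ins for Catalyst plan nodes; mocks for illustration only.
sealed trait LogicalPlan {
  def children: Seq[LogicalPlan]
  // Depth-first traversal, analogous to TreeNode.flatMap in Catalyst.
  def flatMap[A](f: LogicalPlan => Seq[A]): Seq[A] =
    f(this) ++ children.flatMap(_.flatMap(f))
}

// Hypothetical `nameParts` field standing in for the view's multipart identifier.
case class View(nameParts: Seq[String], isTempView: Boolean, child: LogicalPlan)
  extends LogicalPlan {
  def children: Seq[LogicalPlan] = Seq(child)
}

case class SubqueryAlias(identifier: Seq[String], child: LogicalPlan) extends LogicalPlan {
  def children: Seq[LogicalPlan] = Seq(child)
}

case class Project(child: LogicalPlan) extends LogicalPlan {
  def children: Seq[LogicalPlan] = Seq(child)
}

case object LeafRelation extends LogicalPlan {
  def children: Seq[LogicalPlan] = Nil
}

object CollectTempViewsSketch {
  // Mirrors the new match arm in the diff: gather identifiers of temp views that
  // the analyzer left wrapped as a SubqueryAlias over a View node.
  def collectTempViews(plan: LogicalPlan): Seq[Seq[String]] =
    plan.flatMap {
      case SubqueryAlias(_, view: View) if view.isTempView => Seq(view.nameParts)
      case _ => Nil
    }.distinct

  def main(args: Array[String]): Unit = {
    val analyzed = Project(
      SubqueryAlias(Seq("tmp_v"), View(Seq("tmp_v"), isTempView = true, LeafRelation)))
    println(collectTempViews(analyzed)) // List(List(tmp_v))
  }
}
```

The point being that the collector walks the analyzed plan and only keys on the `SubqueryAlias`/`View` wrapper, so whatever qualifier the analyzer puts on that alias is what determines whether the match fires.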



