Github user jiangxb1987 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16233#discussion_r95422458
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala ---
    @@ -125,11 +132,16 @@ private[hive] class HiveMetastoreCatalog(sparkSession: SparkSession) extends Log
           // Otherwise, wrap the table with a Subquery using the table name.
           alias.map(a => SubqueryAlias(a, qualifiedTable, None)).getOrElse(qualifiedTable)
         } else if (table.tableType == CatalogTableType.VIEW) {
    +      val tableIdentifier = table.identifier
           val viewText = table.viewText.getOrElse(sys.error("Invalid view without text."))
    -      SubqueryAlias(
    -        alias.getOrElse(table.identifier.table),
    -        sparkSession.sessionState.sqlParser.parsePlan(viewText),
    -        Option(table.identifier))
    +      // The relation is a view, so we wrap the relation by:
    +      // 1. Add a [[View]] operator over the relation to keep track of the view desc;
    +      // 2. Wrap the logical plan in a [[SubqueryAlias]] which tracks the name of the view.
    +      val child = View(
    +        desc = table,
    +        output = table.schema.toAttributes,
    +        child = sparkSession.sessionState.sqlParser.parsePlan(viewText))
    --- End diff --
    
    It may look a little over-engineered for now, but it lets us decouple the planning of the query from the planning of the view, which will allow us to cache resolved views in the future. The decoupling also makes it possible to handle some forms of schema evolution (columns being reordered, or columns being added to the underlying data).
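
    To illustrate the point about schema evolution, here is a minimal, self-contained sketch (using hypothetical stand-in classes, not the real Spark `LogicalPlan`/`View` nodes): because the `View` node records the output schema the view was created with, separately from the re-parsed child plan, resolution can project the child's possibly evolved output back onto the recorded view schema.

    ```scala
    // Hypothetical stand-ins for Spark's attribute and plan classes.
    case class Attribute(name: String, dataType: String)

    sealed trait Plan { def output: Seq[Attribute] }

    // A scan over the underlying data, whose schema may have evolved
    // (columns reordered or added) since the view was created.
    case class Scan(output: Seq[Attribute]) extends Plan

    // The View node keeps the schema recorded at view-creation time,
    // independent of whatever `child` currently resolves to.
    case class View(viewOutput: Seq[Attribute], child: Plan) extends Plan {
      def output: Seq[Attribute] = viewOutput
    }

    case class Project(output: Seq[Attribute], child: Plan) extends Plan

    // Resolving the view projects the child's evolved output back onto the
    // recorded view schema, by matching attribute names.
    def resolveView(v: View): Plan =
      Project(
        v.viewOutput.map(a => v.child.output.find(_.name == a.name).getOrElse(a)),
        v.child)

    // The underlying table was (a int, b string) when the view was created;
    // it has since been reordered and gained a column c.
    val evolved = Scan(Seq(
      Attribute("b", "string"), Attribute("c", "int"), Attribute("a", "int")))
    val view = View(Seq(Attribute("a", "int"), Attribute("b", "string")), evolved)
    val resolved = resolveView(view)
    ```

    The view still presents exactly `(a, b)` in the original order, even though the scan now produces `(b, c, a)`; without the recorded `viewOutput`, the re-parsed plan would silently pick up the evolved schema.
    
    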

