Github user haiyangsea commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4929#discussion_r27186638

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala ---
```
@@ -170,21 +170,35 @@ class Analyzer(catalog: Catalog,
    * Replaces [[UnresolvedRelation]]s with concrete relations from the catalog.
    */
   object ResolveRelations extends Rule[LogicalPlan] {
-    def getTable(u: UnresolvedRelation) = {
+    def getTable(u: UnresolvedRelation, cteRelations: Map[String, LogicalPlan]) = {
       try {
-        catalog.lookupRelation(u.tableIdentifier, u.alias)
+        // In Hive, if the same table name exists in both a database and a CTE
+        // definition, Hive uses the table in the database, not the CTE one.
+        // Weighing reasonableness against implementation complexity, we check the
+        // CTE definitions first, matching the table name only and ignoring the database name.
+        cteRelations.get(u.tableIdentifier.last)
```
--- End diff --

I agree that ignoring the database name is not a good approach. But there is a problem if we look up CTEs only when `u.tableIdentifier.size == 1`. For example, a user may use a CTE to create a view:

```sql
create view v1 as
with q1 as (select key from src where key = '5')
select * from q1
```

The expanded text of this view will be:

> with q1 as (select key from src where key = '5')
> select \`q1\`.\`key\` from \`default\`.\`q1\`

Note that the CTE alias has been qualified as `default.q1`, which will cause a `NoSuchTableException`. I have tried to solve this problem but didn't find a good way. Since the user defined the CTEs, they presumably want to use them; maybe the best way is for users themselves to avoid this problem :).
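To make the trade-off concrete, below is a minimal, self-contained Scala sketch of the two lookup orders discussed here. It is not Spark's actual `Analyzer` code: `CteResolutionSketch`, the toy `catalog` map, and the string-based `LogicalPlan` stand-in are all hypothetical simplifications, used only to illustrate why Hive qualifying the CTE alias as `default.q1` breaks the `size == 1` rule.

```scala
// Hypothetical sketch; none of these names come from Spark itself.
object CteResolutionSketch {

  // Stand-in for a resolved logical plan.
  type LogicalPlan = String

  // Toy catalog keyed by fully qualified identifiers, e.g. Seq("default", "src").
  val catalog: Map[Seq[String], LogicalPlan] =
    Map(Seq("default", "src") -> "MetastoreRelation(default.src)")

  def lookupRelation(id: Seq[String]): LogicalPlan =
    catalog.getOrElse(
      id, throw new RuntimeException(s"NoSuchTable: ${id.mkString(".")}"))

  // Behaviour in this PR: CTE definitions win, matched on the last name part,
  // so even a qualified reference like default.q1 still hits the CTE.
  def getTableCteFirst(
      id: Seq[String], ctes: Map[String, LogicalPlan]): LogicalPlan =
    ctes.getOrElse(id.last, lookupRelation(id))

  // Alternative under discussion: only unqualified (single-part) names may
  // resolve to a CTE; qualified names always go to the catalog.
  def getTableUnqualifiedOnly(
      id: Seq[String], ctes: Map[String, LogicalPlan]): LogicalPlan =
    if (id.size == 1) ctes.getOrElse(id.head, lookupRelation(id))
    else lookupRelation(id)

  def main(args: Array[String]): Unit = {
    val ctes = Map("q1" -> "Project(key, Filter(key = '5', Scan(src)))")

    // Hive's expanded view text qualifies the CTE alias as `default`.`q1`:
    val expandedViewRef = Seq("default", "q1")

    println(getTableCteFirst(expandedViewRef, ctes))        // resolves via the CTE
    println(getTableUnqualifiedOnly(expandedViewRef, ctes)) // throws: no table default.q1
  }
}
```

Running the sketch, the first call resolves through the CTE while the second throws on `default.q1`, which is the failure mode described above for expanded view text.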