Ryan Johnson created SPARK-45319:
------------------------------------

             Summary: SHOW COLUMNS namespace is not catalog-aware
                 Key: SPARK-45319
                 URL: https://issues.apache.org/jira/browse/SPARK-45319
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.5.0
            Reporter: Ryan Johnson
Today, the namespace argument of the {{SHOW COLUMNS}} command is only partly catalog-aware. [AstBuilder.scala|https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala#L4693] correctly parses and applies namespaces with more than one element, but [ResolveSessionCatalog.scala|https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala#L316] does not correctly validate a multi-element namespace against the qualified table name:

{code:java}
case ShowColumns(ResolvedV1TableOrViewIdentifier(ident), ns, output) =>
  val v1TableName = ident
  val resolver = conf.resolver
  val db = ns match {
    case Some(db) if v1TableName.database.exists(!resolver(_, db.head)) =>
      throw QueryCompilationErrors.showColumnsWithConflictDatabasesError(db, v1TableName)
    case _ => ns.map(_.head)
  }
{code}

By only ever checking {{db.head}} against {{v1TableName.database}}, it mishandles cases such as:

{code:java}
SHOW COLUMNS FROM a.b.table IN a -- incorrectly succeeds
SHOW COLUMNS FROM a.b.table IN a.b -- incorrectly fails{code}

A better solution would use an actual {{Identifier}} and compare its {{namespace}} against the user-provided namespace.

Tangentially: it's arguably a bit strange for the validity check to reside in the {{ResolveSessionCatalog}} rule at all, given that the check has nothing specifically to do with session catalogs. This seems more an artifact of an implementation that searches specifically for V1 tables, with {{ResolveSessionCatalog}} providing matchers that make that job easier.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
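The element-by-element comparison suggested above could look roughly like the following. This is a minimal standalone sketch, not actual Spark code: the object name {{NamespaceCheck}}, the method {{namespaceMatches}}, and the inlined case-insensitive resolver are all hypothetical stand-ins for the real {{Identifier.namespace}} and {{conf.resolver}} machinery.

{code:java}
// Hypothetical sketch -- NamespaceCheck and namespaceMatches are
// illustrative names, not Spark APIs.
object NamespaceCheck {
  // Stand-in for conf.resolver (case-insensitive by default in Spark).
  val resolver: (String, String) => Boolean = (a, b) => a.equalsIgnoreCase(b)

  // Validate the user-supplied namespace against the resolved identifier's
  // full namespace, element by element, rather than only against the head.
  def namespaceMatches(identNamespace: Seq[String],
                       userNamespace: Seq[String]): Boolean =
    userNamespace.length == identNamespace.length &&
      identNamespace.zip(userNamespace).forall {
        case (a, b) => resolver(a, b)
      }

  def main(args: Array[String]): Unit = {
    // SHOW COLUMNS FROM a.b.table IN a.b -- should succeed
    assert(namespaceMatches(Seq("a", "b"), Seq("a", "b")))
    // SHOW COLUMNS FROM a.b.table IN a -- should fail
    assert(!namespaceMatches(Seq("a", "b"), Seq("a")))
  }
}
{code}

Unlike the {{db.head}} check, this rejects {{IN a}} for a table resolved under namespace {{a.b}} and accepts {{IN a.b}}, which is the behavior the examples above call for.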