amaliujia commented on code in PR #36586: URL: https://github.com/apache/spark/pull/36586#discussion_r875478230
########## sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala: ##########
```
@@ -204,8 +211,12 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
    * table/view. This throws an `AnalysisException` when no `Table` can be found.
    */
   override def getTable(tableName: String): Table = {
-    val tableIdent = sparkSession.sessionState.sqlParser.parseTableIdentifier(tableName)
-    getTable(tableIdent.database.orNull, tableIdent.table)
+//    val tableIdent = sparkSession.sessionState.sqlParser.parseTableIdentifier(tableName)
+//    getTable(tableIdent.database.orNull, tableIdent.table)
+    val multiPart = sparkSession.sessionState.sqlParser.parseMultipartIdentifier(tableName)
+    val plan = ShowTables(UnresolvedNamespace(multiPart), None)
```

Review Comment:
It is a bit weird to use `DESC TABLE`'s logical plan here.
```
case class DescribeTableExec(
    output: Seq[Attribute],
    table: Table,
    isExtended: Boolean)
```
It asks for a `Table` as a parameter. `DescribeTableExec` is more for the case where, after we already have a `Table`, we further fetch that table's metadata, for example its schema.

So `DescribeTableExec` might be overkill for the `getTable` call here, which only needs to return a `Table`:
```
class Table(
    val name: String,
    @Nullable val database: String,
    @Nullable val description: String,
    val tableType: String,
    val isTemporary: Boolean)
```
I think using the `ShowTables` logical plan to get a list of tables, and then looking the table up in that list, may better fit this interface.
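For illustration, a rough, untested sketch of that direction (assuming `ShowTables` outputs the columns `namespace`, `tableName`, `isTemporary` in that order; the `description`/`tableType` values below are placeholders, since `SHOW TABLES` does not produce them):

```scala
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalog.Table
import org.apache.spark.sql.catalyst.analysis.UnresolvedNamespace
import org.apache.spark.sql.catalyst.plans.logical.ShowTables

override def getTable(tableName: String): Table = {
  val multiPart = sparkSession.sessionState.sqlParser.parseMultipartIdentifier(tableName)
  // Run SHOW TABLES over the parent namespace (an empty namespace falls back
  // to the current one) and look the table up by its last name part.
  val plan = ShowTables(UnresolvedNamespace(multiPart.init), None)
  val rows = sparkSession.sessionState.executePlan(plan).toRdd.collect()
  rows.find(_.getString(1) == multiPart.last)
    .map { row =>
      new Table(
        name = row.getString(1),
        database = row.getString(0),
        description = null, // not part of the SHOW TABLES output
        tableType = null,   // placeholder; would need a separate lookup
        isTemporary = row.getBoolean(2))
    }
    .getOrElse(throw new AnalysisException(
      s"Table or view '$tableName' not found"))
}
```

This keeps `getTable` on the same code path as `listTables`, at the cost of fetching the whole table list for a single lookup.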