cloud-fan commented on code in PR #36586:
URL: https://github.com/apache/spark/pull/36586#discussion_r875383944


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/ShowTablesExec.scala:
##########
@@ -38,12 +38,14 @@ case class ShowTablesExec(
     val rows = new ArrayBuffer[InternalRow]()
 
     val tables = catalog.listTables(namespace.toArray)
+    // how to make sure the table is matched here? What is pattern and how to use it?
     tables.map { table =>
       if (pattern.map(StringUtils.filterPattern(Seq(table.name()), _).nonEmpty).getOrElse(true)) {
         rows += toCatalystRow(table.namespace().quoted, table.name(), isTempView(table))
       }
     }

+    // these rows are CatalystRow (which is InternalRow). How to convert it back to table in CatalogImpl?

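For context on the first question in the diff: `SHOW TABLES` patterns conventionally treat `*` as a wildcard and `|` as a separator between alternatives, matched case-insensitively against each table name, which is what `StringUtils.filterPattern` implements. A standalone approximation of that behavior (a sketch, not Spark's actual utility) might look like:

```scala
// Standalone approximation of SHOW TABLES pattern matching: '*' is a
// wildcard, '|' separates alternative patterns, and matching is
// case-insensitive. This mirrors the conventional behavior of Spark's
// StringUtils.filterPattern but is a sketch, not the real implementation.
object PatternFilter {
  def filterByPattern(names: Seq[String], pattern: String): Seq[String] = {
    val regexes = pattern
      .split("\\|")                // '|' separates alternatives
      .map(_.trim)
      .filter(_.nonEmpty)
      .map(p => ("(?i)" + p.replaceAll("\\*", ".*")).r)  // '*' -> '.*', case-insensitive
    names.filter(n => regexes.exists(_.pattern.matcher(n).matches()))
  }
}
```

With that reading, `pattern.map(...).getOrElse(true)` in the diff means: if a pattern was supplied, keep the table only when its name survives the filter; if no pattern was supplied, keep everything.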
Review Comment:
   This can only be done in `CatalogImpl`; we probably need to issue a `getTable` request for each table name to get the detailed table information.
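A rough sketch of the loop being suggested, using simplified stand-in types rather than Spark's actual `TableCatalog`/`Table` interfaces: since the `SHOW TABLES` rows carry only namespace and name, `CatalogImpl` would issue one getTable-style request per name to recover the detailed metadata.

```scala
// Simplified stand-ins for illustration only; the real interfaces are
// Spark's TableCatalog and Table, which carry schema, partitioning, etc.
case class TableInfo(namespace: Seq[String], name: String, description: String)

trait SimpleCatalog {
  def listTables(namespace: Seq[String]): Seq[String]             // names only
  def getTable(namespace: Seq[String], name: String): TableInfo   // detailed lookup
}

object DetailedListing {
  // One getTable request per listed name, as the review suggests.
  def listDetailedTables(catalog: SimpleCatalog, namespace: Seq[String]): Seq[TableInfo] =
    catalog.listTables(namespace).map(name => catalog.getTable(namespace, name))
}
```

Note the cost implication: this is one catalog round trip per table, which is why it belongs in `CatalogImpl` (where detailed metadata is actually needed) rather than in the `SHOW TABLES` physical plan itself.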



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

