Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15998#discussion_r89411482
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala ---
    @@ -189,6 +189,38 @@ abstract class ExternalCatalog {
           spec: TablePartitionSpec): Option[CatalogTablePartition]
     
       /**
    +   * List the names of all partitions that belong to the specified table, assuming it exists.
    +   *
    +   * A partial partition spec may optionally be provided to filter the partitions returned.
    +   * For instance, if there exist partitions (a='1', b='2'), (a='1', b='3') and (a='2', b='4'),
    +   * then a partial spec of (a='1') will return the first two only.
    +   *
    +   * We provide a default implementation here which simply delegates to the `listPartitions`
    +   * method. For efficiency's sake, overriding this method is recommended for external catalogs
    +   * that can list partition names directly.
    +   * @param db database name
    +   * @param table table name
    +   * @param partialSpec  an optional partial partition spec used to filter the returned partitions
    +   */
    +  def listPartitionNames(
    +      db: String,
    +      table: String,
    +      partialSpec: Option[TablePartitionSpec] = None): Seq[String] = {
    --- End diff ---
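    
    For context, a minimal sketch of the default delegation the Scaladoc above describes.
    The method body is cut off in this hunk, so the exact code is an assumption, written
    as it might appear inside `ExternalCatalog`:
    
        def listPartitionNames(
            db: String,
            table: String,
            partialSpec: Option[TablePartitionSpec] = None): Seq[String] = {
          // Fetch the full partition metadata, then render each spec as "k1=v1/k2=v2".
          // Correct, but it materializes every CatalogTablePartition just to build a name.
          listPartitions(db, table, partialSpec).map { partition =>
            partition.spec.map { case (key, value) => s"$key=$value" }.mkString("/")
          }
        }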
    
    My worry is that when we create a new implementation of the external catalog, we'd forget to override this, leading to bad performance.
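    
    To illustrate the kind of override the Scaladoc recommends, a catalog backed by a
    metastore that can return names directly might look like the sketch below.
    `metastoreClient.getPartitionNames` is a hypothetical call used for illustration,
    not an actual Spark or Hive API:
    
        override def listPartitionNames(
            db: String,
            table: String,
            partialSpec: Option[TablePartitionSpec] = None): Seq[String] = {
          // Ask the metastore for partition names only (hypothetical client method),
          // avoiding the per-partition metadata fetch done by the default implementation.
          metastoreClient.getPartitionNames(db, table, partialSpec)
        }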


