GitHub user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15024#discussion_r85808919
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -418,21 +424,41 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
         }
     
         if (DDLUtils.isDatasourceTable(withStatsProps)) {
    -      val oldDef = client.getTable(db, withStatsProps.identifier.table)
    -      // Sets the `schema`, `partitionColumnNames` and `bucketSpec` from the old table definition,
    -      // to retain the spark specific format if it is. Also add old data source properties to table
    -      // properties, to retain the data source table format.
    -      val oldDataSourceProps = oldDef.properties.filter(_._1.startsWith(SPARK_SQL_PREFIX))
    +      val oldTableDef = client.getTable(db, withStatsProps.identifier.table)
    +
    +      // Always update the location property w.r.t. the new table location.
    +      val locationProp = tableDefinition.storage.locationUri.map { location =>
    +        TABLE_LOCATION -> location
    +      }
    +      // Only update the `locationUri` field if the location is really changed, because this table
    +      // may be not Hive-compatible and can not set the `locationUri` field. We should respect the
    +      // old `locationUri` even it's None.
    +      val oldLocation = getLocationFromRawTable(oldTableDef)
    +      val locationUri = if (oldLocation == tableDefinition.storage.locationUri) {
    --- End diff --
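    
    The inline comment in the diff says the old `locationUri` should be respected unless the location has really changed. As a reading aid only (a sketch of that intent, not the PR's actual continuation of the `if` above), the check amounts to:
    
    ```Scala
      // Sketch of the intent described in the diff comment, not the PR's exact code:
      // keep the previous raw locationUri (even if it is None) unless the caller
      // really supplied a different location; otherwise take the new one.
      val locationUri =
        if (oldLocation == tableDefinition.storage.locationUri) {
          oldTableDef.storage.locationUri
        } else {
          tableDefinition.storage.locationUri
        }
    ```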
    
    ```Scala
      test("alter table - rename") {
        val tabName = "tab1"
        val newTabName = "tab2"
        withTable(tabName, newTabName) {
          spark.range(10).write.saveAsTable(tabName)
          val catalog = spark.sessionState.catalog
          sql(s"ALTER TABLE $tabName RENAME TO $newTabName")
          sql(s"DESC FORMATTED $newTabName").show(100, false)
          assert(!catalog.tableExists(TableIdentifier(tabName)))
          assert(catalog.tableExists(TableIdentifier(newTabName)))
        }
      }
    ```
    
    You can run the above test case in both `DDLSuite.scala` and `HiveDDLSuite.scala`. The resulting table locations are different: one uses the new table name, while the other still uses the old one.
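    
    If it helps to see the difference directly, the location can be read back from the session catalog after the rename. This is only a sketch reusing the names from the test above (`newTabName`, `TableIdentifier`); the path you see depends on which suite you run it in:
    
    ```Scala
      // Sketch only: inspect the renamed table's location after the ALTER TABLE.
      val renamedTable = spark.sessionState.catalog
        .getTableMetadata(TableIdentifier(newTabName))
      // In one suite the path still contains the old table name, in the other
      // the new one, which is the inconsistency described above.
      println(renamedTable.storage.locationUri)
    ```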

