Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/22466
  
    There is a distinction in Hive between managed and external tables; see
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-ManagedandExternalTables
    
    I think Spark conflates the two. It's rare (?) but possible to specify a
custom location for a managed table; that typically occurs for `EXTERNAL`
tables. So maybe this is OK.
    
    ```
      private def createTable(tableIdent: TableIdentifier): Unit = {
        val storage = DataSource.buildStorageFormatFromOptions(extraOptions.toMap)
        val tableType = if (storage.locationUri.isDefined) {
          CatalogTableType.EXTERNAL
        } else {
          CatalogTableType.MANAGED
        }
    ```
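    
    To make that first path concrete, here's a minimal sketch (the table
name, path, and data are made up, and it assumes a local build with Hive
support): passing a `path` option populates `storage.locationUri`, so the
`createTable` snippet above records the table as `EXTERNAL`.
    
    ```
    import org.apache.spark.sql.SparkSession
    
    val spark = SparkSession.builder()
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._
    
    // "path" sets storage.locationUri, so createTable above picks EXTERNAL.
    Seq(1, 2, 3).toDF("id").write.option("path", "/tmp/t_df").saveAsTable("t_df")
    ```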
    
    And in `SqlParser`:
    
    ```
        // If location is defined, we'll assume this is an external table.
        // Otherwise, we may accidentally delete existing data.
        val tableType = if (external || location.isDefined) {
          CatalogTableType.EXTERNAL
        } else {
          CatalogTableType.MANAGED
        }
    ```
    
    So if `LOCATION` implies `EXTERNAL` in Spark, then this follows: the data
of `EXTERNAL` tables shouldn't be deleted.
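    
    A quick way to check that behavior via SQL (continuing the session
sketched above; table names and paths are illustrative):
    
    ```
    // No LOCATION: recorded as MANAGED.
    spark.sql("CREATE TABLE t_managed (id INT)")
    // LOCATION given: recorded as EXTERNAL, per the parser rule quoted above.
    spark.sql("CREATE TABLE t_located (id INT) LOCATION '/tmp/t_located'")
    
    // The "Type" row of DESCRIBE EXTENDED shows MANAGED vs. EXTERNAL.
    spark.sql("DESCRIBE EXTENDED t_located").show(100, truncate = false)
    ```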
    
    I agree that the Hive impl doesn't seem to take this into account on the
code paths that call `dropDatabase`; a sketch of the concern follows below.
CC @andrewor14 in case he is available to comment on the original
implementation. WDYT @cloud-fan
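    
    To spell out the concern (continuing the session above; the database name
and path are made up): after a cascading database drop, files under an
`EXTERNAL` table's location should survive.
    
    ```
    spark.sql("CREATE DATABASE db_x")
    spark.sql("CREATE TABLE db_x.t_ext (id INT) LOCATION '/tmp/t_ext_data'")
    spark.sql("DROP DATABASE db_x CASCADE")
    // If EXTERNAL semantics are honored on this code path, the files under
    // /tmp/t_ext_data should still exist after the cascading drop.
    ```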

