GitHub user srowen commented on the issue:

    https://github.com/apache/spark/pull/22466
  
    We should look at the Spark documentation, and Hive's where applicable, to 
figure out what the right behavior is here. Spark generally follows Hive. See 
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-CreateTableCreate/Drop/TruncateTable
    I think this is, further, conflating what `LOCATION` and `EXTERNAL` do. I 
agree that an external table's files shouldn't be deleted, but that shouldn't 
apply simply to any files specified by `LOCATION`. At least that is my 
understanding.
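    For reference, a minimal sketch of the distinction under discussion, 
assuming a local `SparkSession` with Hive support; the table names and path 
are hypothetical. Dropping a managed table deletes its files under the 
warehouse directory, while dropping an external table leaves its files in 
place.

    ```scala
    import org.apache.spark.sql.SparkSession

    object ExternalVsLocationSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("external-vs-location")
          .master("local[*]")
          .enableHiveSupport()
          .getOrCreate()

        // Managed table: Spark owns both the metadata and the data files.
        // DROP TABLE removes the files under the warehouse directory.
        spark.sql("CREATE TABLE managed_t (id INT) USING parquet")

        // External table (Hive syntax): Spark owns only the metadata.
        // DROP TABLE leaves the files at the given path intact.
        spark.sql(
          "CREATE EXTERNAL TABLE external_t (id INT) " +
          "STORED AS PARQUET LOCATION '/tmp/external_t'")

        spark.sql("DROP TABLE managed_t")   // data files deleted
        spark.sql("DROP TABLE external_t")  // files at /tmp/external_t preserved

        spark.stop()
      }
    }
    ```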
    @yhuai or @cloud-fan or @clockfly might know more.

