MaxGekk commented on a change in pull request #31524:
URL: https://github.com/apache/spark/pull/31524#discussion_r579646193
########## File path: docs/sql-ref-syntax-ddl-drop-table.md ##########

```diff
@@ -26,6 +26,8 @@
 if the table is not `EXTERNAL` table. If the table is not present it throws an e
 In case of an external table, only the associated metadata information is removed from the metastore database.
+If the table is cached, the command uncaches the table and all its dependents.
```

Review comment:

```scala
scala> sql("CREATE TABLE tbl (c0 INT)")
scala> sql("INSERT INTO tbl SELECT 0")
scala> val tbl = spark.table("tbl")
scala> tbl.cache()
scala> tbl.show(false)
+---+
|c0 |
+---+
|0  |
+---+

scala> tbl.select(($"c0" + 1).as("c1")).createOrReplaceTempView("tmp_view0")
scala> val v = spark.table("tmp_view0")
v: org.apache.spark.sql.DataFrame = [c1: int]
scala> v.cache()
res6: v.type = [c1: int]
scala> v.show(false)
+---+
|c1 |
+---+
|1  |
+---+

scala> sql("DROP TABLE tbl")
res8: org.apache.spark.sql.DataFrame = []
scala> v.show(false)
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/Users/maximgekk/proj/doc-cmd-caching/spark-warehouse/tbl
```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
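The REPL session above observes the behavior indirectly, through the `InvalidInputException` raised when re-reading the dropped table's files. The same behavior described by the documented line ("the command uncaches the table and all its dependents") can also be checked directly with the Catalog API. The sketch below is not from the review; it assumes a local `SparkSession` and uses hypothetical names `tbl2` and `tmp_view1`:

```scala
// Sketch: verify that DROP TABLE uncaches a dependent temp view,
// using spark.catalog.isCached instead of triggering a read failure.
// Assumes Spark with the behavior this PR documents; names are illustrative.
import org.apache.spark.sql.SparkSession

object DropUncachesDependents extends App {
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("drop-table-uncaches-dependents")
    .getOrCreate()
  import spark.implicits._

  spark.sql("CREATE TABLE tbl2 (c0 INT)")
  spark.sql("INSERT INTO tbl2 SELECT 0")

  // Cache the base table and materialize the cache.
  spark.table("tbl2").cache()
  spark.table("tbl2").count()

  // Build and cache a dependent temp view.
  spark.table("tbl2").select(($"c0" + 1).as("c1"))
    .createOrReplaceTempView("tmp_view1")
  spark.sql("CACHE TABLE tmp_view1")

  println(spark.catalog.isCached("tbl2"))      // cached before the drop
  println(spark.catalog.isCached("tmp_view1")) // cached before the drop

  spark.sql("DROP TABLE tbl2")

  // The temp view still exists, but its cache entry should be gone:
  println(spark.catalog.tableExists("tmp_view1"))
  println(spark.catalog.isCached("tmp_view1"))

  spark.stop()
}
```

Checking `isCached` avoids depending on the warehouse path in the error message, which is machine-specific in the REPL transcript above.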