[ https://issues.apache.org/jira/browse/SPARK-32213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17166145#comment-17166145 ]
Hyukjin Kwon commented on SPARK-32213:
--------------------------------------

I think this is documented:

{code}
 * In this method, save mode is used to determine the behavior if the data source table exists in
 * Spark catalog. We will always overwrite the underlying data of data source (e.g. a table in
 * JDBC data source) if the table doesn't exist in Spark catalog, and will always append to the
 * underlying data of data source if the table already exists.
{code}

> saveAsTable deletes all files in path
> -------------------------------------
>
>                 Key: SPARK-32213
>                 URL: https://issues.apache.org/jira/browse/SPARK-32213
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.0.0
>            Reporter: Yuval Rochman
>          Priority: Major
>
> The problem is presented in the following link:
> [https://stackoverflow.com/questions/62782637/saveastable-can-delete-all-my-files-in-desktop?noredirect=1#comment111026138_62782637]
> Apparently, without any warning, all files on the desktop were deleted after writing a file.
> There is no warning in PySpark that the "path" parameter causes this problem.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
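The documented rule quoted above can be sketched as a tiny decision helper. This is only a plain-Python simulation of the stated semantics, not Spark's actual implementation; the function name is hypothetical:

```python
def save_as_table_behavior(table_exists_in_catalog: bool) -> str:
    """Mirror the documented saveAsTable rule: if the table is NOT in the
    Spark catalog, the underlying data at the target path is overwritten;
    if the table already exists, new data is appended to it."""
    if table_exists_in_catalog:
        return "append"
    return "overwrite"

# A table not yet registered in the catalog gets its path overwritten,
# which is how pre-existing files under a user-supplied "path" option
# can be deleted without an explicit warning.
print(save_as_table_behavior(False))  # overwrite
print(save_as_table_behavior(True))   # append
```

In PySpark terms, this corresponds to a call such as `df.write.option("path", "/some/dir").saveAsTable("t")`: if "t" is not in the catalog, the existing contents of "/some/dir" are replaced, per the documentation quoted above.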