Hi, I just ran into this today and wonder whether it's a bug or something I may have missed before.
scala> spark.version
res21: String = 2.3.2

// that's OK
scala> spark.range(1).write.saveAsTable("t1")
org.apache.spark.sql.AnalysisException: Table `t1` already exists.;
  at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:408)
  at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393)
  ... 51 elided

// Let's overwrite it then
// An exception?! Why?!
scala> spark.range(1).write.mode("overwrite").saveAsTable("t1")
org.apache.spark.sql.AnalysisException: Unable to infer schema for Parquet. It must be specified manually.;
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:208)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:208)
  at scala.Option.getOrElse(Option.scala:121)
  ...

// If the above is the expected behavior, why does the following work fine (and not throw an exception)?
scala> spark.range(1).write.saveAsTable("t10")

P.S. I was not sure whether I should send this question to dev or users, so please accept my apologies if I sent it to the wrong mailing list.

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Mastering Spark SQL https://bit.ly/mastering-spark-sql
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Kafka Streams https://bit.ly/mastering-kafka-streams
Follow me at https://twitter.com/jaceklaskowski
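P.P.S. In the meantime, a possible workaround sketch (a spark-shell session, assuming the table lives in the default database; this sidesteps rather than explains the overwrite exception) is to drop the table explicitly before writing:

```scala
// Hedged workaround sketch, not a fix: remove the existing table first,
// then let saveAsTable create it fresh instead of relying on "overwrite".
spark.sql("DROP TABLE IF EXISTS t1")
spark.range(1).write.saveAsTable("t1")
```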