Do you want to merge schemas when the incoming data's schema changes? You can enable Parquet schema merging globally:

spark.conf.set("spark.sql.parquet.mergeSchema", "true")

https://kontext.tech/column/spark/381/schema-merging-evolution-with-parquet-in-spark-and-hive


On Mon, Jul 20, 2020 at 3:48 PM fansparker <revincha...@gmail.com> wrote:

> Does anybody know if there is a way to get the persisted table's schema
> updated when the underlying custom data source schema is changed?
> Currently,
> we have to drop and re-create the table.
