The last time I looked into this, the answer was no. Since the Spark
session keeps an internal relation cache, the only way to update a
session's view of a table was a full drop and re-create. That was my
experience with a custom Hive metastore and the entries read from it: I
could change the entries in the metastore underneath the session, but
because the session cached the relation lookup, I couldn't get it to
reload the metadata.
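
For anyone hitting the same wall, the workaround looks roughly like the
sketch below (Scala; the table name, source class, and path are
placeholders, not from a real setup):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// Dropping the table discards the session's cached relation;
// re-creating it forces a fresh schema lookup from the source.
spark.sql("DROP TABLE IF EXISTS my_table")
spark.sql(
  """CREATE TABLE my_table
    |USING com.example.CustomSource
    |OPTIONS (path '/some/path')""".stripMargin)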

DataSourceV2 does make this easy, though.
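
Roughly, that's because a V2 connector hands Spark its schema at load
time instead of Spark caching a metastore relation. A bare-bones sketch
of that hook (the class and the fetchCurrentSchema helper are
hypothetical, just to show where the schema comes from):

import java.util

import org.apache.spark.sql.connector.catalog.{Table, TableCapability}
import org.apache.spark.sql.types.StructType

class LiveSchemaTable extends Table {
  override def name(): String = "live_schema_table"

  // Spark asks the connector for the schema when the table is
  // resolved, so returning a freshly fetched StructType means schema
  // changes in the source show up without a drop-and-recreate.
  override def schema(): StructType = fetchCurrentSchema()

  // A real connector would also implement SupportsRead and report
  // BATCH_READ here; omitted to keep the sketch short.
  override def capabilities(): util.Set[TableCapability] =
    util.Collections.emptySet[TableCapability]()

  // Hypothetical helper: read the current schema from the source.
  private def fetchCurrentSchema(): StructType = ???
}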

On Mon, Jul 20, 2020, 5:17 AM fansparker <revincha...@gmail.com> wrote:

> Does anybody know if there is a way to get the persisted table's schema
> updated when the underlying custom data source schema is changed?
> Currently, we have to drop and re-create the table.
