Re: schema changes of custom data source in persistent tables DataSourceV1

2020-07-20 Thread fansparker
Makes sense, Russell. I am trying to figure out whether there is a way to force a metadata reload in "createRelation" when the schema provided by the new SparkSession differs from the existing metadata schema.
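A minimal Scala sketch of that idea, assuming a hypothetical V1 provider. "MyProvider", "MyRelation" and "fetchCurrentSourceSchema" are placeholders, and forcing a refresh from inside "createRelation" is an untested approach, not something the V1 API documents:

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.sources.{BaseRelation, SchemaRelationProvider}
    import org.apache.spark.sql.types.StructType

    class MyProvider extends SchemaRelationProvider {
      override def createRelation(
          sqlContext: SQLContext,
          parameters: Map[String, String],
          schema: StructType): BaseRelation = {
        // Placeholder: ask the underlying source for its current schema.
        val current: StructType = fetchCurrentSourceSchema(parameters)
        if (current != schema) {
          // Schema drift detected: invalidate Spark's cached metadata so
          // later lookups re-read it. Refreshing while the relation is being
          // resolved may have side effects; that is the open question here.
          parameters.get("table").foreach(sqlContext.sparkSession.catalog.refreshTable)
        }
        new MyRelation(sqlContext, current)
      }

      private def fetchCurrentSourceSchema(parameters: Map[String, String]): StructType =
        ??? // placeholder: read the live schema from the source
    }

    // Placeholder relation; a real one would also mix in TableScan/buildScan.
    class MyRelation(
        override val sqlContext: SQLContext,
        override val schema: StructType) extends BaseRelation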

Re: schema changes of custom data source in persistent tables DataSourceV1

2020-07-20 Thread fansparker
Thanks, Russell. This shows that "refreshTable" and "invalidateTable" could be used to reload the metadata, but they do not work in our case. I have tried to in…
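For reference, the metadata-reload calls discussed above are normally invoked like this (the table name is illustrative; availability of "invalidateTable" depends on the Spark version):

    // Force Spark to drop cached metadata and plans for the table:
    spark.catalog.refreshTable("my_custom_table")
    // SQL equivalent:
    spark.sql("REFRESH TABLE my_custom_table")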

Re: schema changes of custom data source in persistent tables DataSourceV1

2020-07-20 Thread fansparker
Does anybody know if there is a way to get a persisted table's schema updated when the underlying custom data source's schema changes? Currently, we have to drop and re-create the table.
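The drop-and-recreate workaround mentioned above looks roughly like this; the table name, provider class and options are illustrative:

    spark.sql("DROP TABLE IF EXISTS my_custom_table")
    spark.sql(
      """CREATE TABLE my_custom_table
        |USING com.example.MyProvider
        |OPTIONS (path '/data/my_source')""".stripMargin)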

Persistent tables in DataSource API V2

2020-07-18 Thread fansparker
1. In DataSource API V1, we were able to create persistent tables over custom data sources via SQL DDL, using "createRelation", "buildScan", "schema", etc. Is there a way to achieve this in DataSource API V2? (A sketch of the V2 counterparts follows below.)
2. In DataSource API V1, any schema change in the underlying custom data source is not…
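For question 1, a minimal Spark 3.x sketch of the DataSource V2 entry points that correspond to V1's "schema"/"createRelation"/"buildScan". Everything prefixed "My" is a placeholder, not a real connector:

    import java.util
    import org.apache.spark.sql.connector.catalog.{Table, TableCapability, TableProvider}
    import org.apache.spark.sql.connector.expressions.Transform
    import org.apache.spark.sql.types.StructType
    import org.apache.spark.sql.util.CaseInsensitiveStringMap

    class MyProviderV2 extends TableProvider {
      // V2 counterpart of V1's "schema": called when Spark needs to
      // discover the source's schema (e.g. for CREATE TABLE ... USING).
      override def inferSchema(options: CaseInsensitiveStringMap): StructType =
        ??? // placeholder: read the current schema from the source

      // Returns the Table whose ScanBuilder plays the role of V1's buildScan.
      override def getTable(
          schema: StructType,
          partitioning: Array[Transform],
          properties: util.Map[String, String]): Table =
        new MyTable(schema)
    }

    // Placeholder Table; a real one would implement SupportsRead and
    // newScanBuilder to actually produce data.
    class MyTable(tableSchema: StructType) extends Table {
      override def name(): String = "my_table"
      override def schema(): StructType = tableSchema
      override def capabilities(): util.Set[TableCapability] =
        util.EnumSet.of(TableCapability.BATCH_READ)
    }

As in V1, the fully qualified provider class name given after USING in the DDL resolves to this provider.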