[ 
https://issues.apache.org/jira/browse/SPARK-34370?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Attila Zsolt Piros updated SPARK-34370:
---------------------------------------
    Description: 
This came up in 
https://github.com/apache/spark/pull/31133#issuecomment-773567152.


The use case is the following: there is a partitioned Hive table with Avro data,
whose schema is specified via "avro.schema.url".
Over time the schema evolves and a new schema is set for the table via
"avro.schema.url"; when data is read from an old partition, this new evolved
schema must be used.
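A minimal sketch of the scenario (the table name, schema file paths, and partition column below are hypothetical, not taken from the linked PR):

```sql
-- Partitioned Hive table backed by Avro; columns are derived from the
-- external schema file referenced by "avro.schema.url".
CREATE EXTERNAL TABLE events
PARTITIONED BY (dt STRING)
STORED AS AVRO
TBLPROPERTIES ('avro.schema.url'='/schemas/events_v1.avsc');

-- Data is written into a partition while the v1 schema is in effect.
-- Later the schema evolves (e.g. a new field with a default value is
-- added) and the table property is pointed at the new schema file:
ALTER TABLE events SET TBLPROPERTIES (
  'avro.schema.url'='/schemas/events_v2.avsc');

-- Reading the old partition should now resolve the v1 records against
-- the evolved v2 schema, with new fields filled from their defaults:
SELECT * FROM events WHERE dt = '2021-02-01';
```

The expectation is that Spark honors the table-level "avro.schema.url" when scanning partitions written under the older schema, as Avro schema resolution allows.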

  was:
This came up in 
https://github.com/apache/spark/pull/31133#issuecomment-773567152.


The use case is the following there is a partitioned Hive table with Avro data. 
The schema is specified via "avro.schema.url".
With time the schema is evolved and the new schema is set for the table 
"avro.schema.url" when data is read from the old p


> Supporting Avro schema evolution for partitioned Hive tables using 
> "avro.schema.url"
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-34370
>                 URL: https://issues.apache.org/jira/browse/SPARK-34370
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1, 2.4.0, 3.0.1, 3.1.0, 3.2.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>
> This came up in 
> https://github.com/apache/spark/pull/31133#issuecomment-773567152.
> The use case is the following: there is a partitioned Hive table with Avro 
> data, whose schema is specified via "avro.schema.url".
> Over time the schema evolves and a new schema is set for the table via 
> "avro.schema.url"; when data is read from an old partition, this new evolved 
> schema must be used.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
