Karen Feng created SPARK-37859:
----------------------------------

             Summary: SQL tables created with JDBC with Spark 3.1 are not readable with 3.2
                 Key: SPARK-37859
                 URL: https://issues.apache.org/jira/browse/SPARK-37859
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.2.0
            Reporter: Karen Feng
In https://github.com/apache/spark/blob/bd24b4884b804fc85a083f82b864823851d5980c/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L312, a new metadata field is added to the schema during reading. Because https://github.com/apache/spark/blob/bd24b4884b804fc85a083f82b864823851d5980c/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala#L356 does a full comparison of the user-provided schema and the actual schema, resolution fails when a table created with Spark 3.1 is read with Spark 3.2.
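A minimal sketch of the mismatch (not a reproduction against a real JDBC source), assuming the comparison behaves like ordinary StructType equality, which includes per-field metadata; the metadata key below is a placeholder for illustration:

{code:scala}
import org.apache.spark.sql.types._

object SchemaMismatchSketch {
  def main(args: Array[String]): Unit = {
    // Schema as recorded in the catalog when the table was created with Spark 3.1
    // (no extra metadata on the field).
    val userSpecifiedSchema = StructType(Seq(
      StructField("id", LongType, nullable = true)))

    // Schema as built by the Spark 3.2 JDBC read path, which now attaches extra
    // metadata to the field. The key "someJdbcFlag" is a placeholder.
    val actualSchema = StructType(Seq(
      StructField("id", LongType, nullable = true,
        new MetadataBuilder().putBoolean("someJdbcFlag", true).build())))

    // StructField is a case class whose equality includes metadata, so a full
    // comparison of the two schemas fails even though names and types match.
    println(userSpecifiedSchema == actualSchema) // prints: false
  }
}
{code}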