GitHub user gatorsmile reopened a pull request: https://github.com/apache/spark/pull/19003
[SPARK-21769] [SQL] Add a table-specific option for always respecting schemas inferred/controlled by Spark SQL

## What changes were proposed in this pull request?

For Hive-serde tables, we always respect the schema stored in the Hive metastore, because that schema could be altered by other engines sharing the same metastore. Thus, when the schemas differ (ignoring nullability and case), we trust the metastore-controlled schema for Hive-serde tables. However, in some scenarios the Hive metastore can also INCORRECTLY overwrite the schema, for example when the table's serde and Hive's built-in serde are different.

The proposed solution is to introduce a table-specific option for such scenarios. For a specific table, users can make Spark always respect the Spark-inferred/controlled schema instead of trusting the metastore-controlled schema. By default, we trust the Hive metastore-controlled schema.

## How was this patch tested?

Added a cross-version test case.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark respectSparkSchema

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19003.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #19003

----

commit 4c7349f5d7cef703e11d93e114c8361a940e8bfa
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2017-08-20T03:17:12Z

    fix.

commit 36339c809a086fb1bb94ec167bf2fa9e4169aca1
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2017-08-22T18:22:26Z

    fix.
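The precedence rule described above can be sketched as follows. This is an illustrative model, not the actual Spark implementation: the table-property name `respectSparkSchema` is a hypothetical placeholder taken from the branch name, and the real patch operates on Spark's catalog internals rather than plain dictionaries.

```python
def resolve_schema(metastore_schema, spark_inferred_schema, table_properties):
    """Pick which schema to trust for a Hive-serde table.

    By default the Hive metastore-controlled schema wins; a table-specific
    opt-in (hypothetical property name "respectSparkSchema") flips precedence
    to the Spark-inferred/controlled schema.
    """
    respect_spark = (
        table_properties.get("respectSparkSchema", "false").lower() == "true"
    )
    return spark_inferred_schema if respect_spark else metastore_schema


# Example: the metastore and Spark disagree on a column's type.
metastore = [("col1", "string")]
inferred = [("col1", "decimal(18,2)")]

# Default behavior: the metastore-controlled schema is trusted.
assert resolve_schema(metastore, inferred, {}) == metastore

# With the per-table opt-in: the Spark-inferred schema is trusted instead.
assert resolve_schema(metastore, inferred, {"respectSparkSchema": "true"}) == inferred
```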