wangyum opened a new pull request #23553: [SPARK-23710][SQL] Upgrade built-in Hive to 2.3.4
URL: https://github.com/apache/spark/pull/23553

This PR upgrades the built-in Hive from [1.2.1-spark2](https://github.com/JoshRosen/hive/tree/release-1.2.1-spark2) to [2.3.4](https://github.com/apache/hive/releases/tag/rel%2Frelease-2.3.4) to resolve several critical issues.

**Hive issues**:
- [[SPARK-26332]](https://issues.apache.org/jira/browse/SPARK-26332) [HIVE-10790] Spark sql write orc table on viewFS throws exception
- [[SPARK-25193]](https://issues.apache.org/jira/browse/SPARK-25193) [HIVE-12505] insert overwrite doesn't throw exception when drop old data fails
- [[SPARK-26437]](https://issues.apache.org/jira/browse/SPARK-26437) [HIVE-13083] Decimal data becomes bigint to query, unable to query
- [[SPARK-25919]](https://issues.apache.org/jira/browse/SPARK-25919) [HIVE-11771] Date value corrupts when tables are "ParquetHiveSerDe" formatted and target table is Partitioned
- [[SPARK-12014]](https://issues.apache.org/jira/browse/SPARK-12014) [HIVE-11100] Spark SQL query containing semicolon is broken in Beeline

**Spark issues**:
- [[SPARK-23534]](https://issues.apache.org/jira/browse/SPARK-23534) Spark run on Hadoop 3.0.0
- [[SPARK-20202]](https://issues.apache.org/jira/browse/SPARK-20202) Remove references to org.spark-project.hive
- [[SPARK-18673]](https://issues.apache.org/jira/browse/SPARK-18673) Dataframes doesn't work on Hadoop 3.x; Hive rejects Hadoop version
- [[SPARK-24766]](https://issues.apache.org/jira/browse/SPARK-24766) CreateHiveTableAsSelect and InsertIntoHiveDir won't generate decimal column stats in parquet

## How was this patch tested?

Unit tests and manual tests.
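As a quick sanity check (not described in this PR, just an illustrative sketch), one way to confirm which Hive client version ends up on the classpath after rebuilding with the Hive profile is to ask Hive's own version-info utility from `spark-shell`. `org.apache.hive.common.util.HiveVersionInfo` is an existing Hive class; the expected output below is an assumption based on this upgrade:

```scala
// Illustrative check in spark-shell (built with -Phive), not part of the patch.
// HiveVersionInfo reports the Hive version bundled on the classpath.
import org.apache.hive.common.util.HiveVersionInfo

println(HiveVersionInfo.getVersion)  // expected to print "2.3.4" once this upgrade is in (assumption)
```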
