dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2105258785
It turns out that Apache Spark is unable to support all legacy Hive UDF jar
files. Let me make a follow-up.
- https://github.com/apache/hive/pull/4892
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2105239656
I locally verified that the failure of `HiveUDFDynamicLoadSuite` is
consistent.
```
$ build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.sql.hive.HiveUDFDynamicLoadSuite
```
dongjoon-hyun commented on code in PR #46468:
URL: https://github.com/apache/spark/pull/46468#discussion_r1597143409
##
dev/deps/spark-deps-hadoop-3-hive-2.3:
##
@@ -46,7 +46,6 @@ commons-compress/1.26.1//commons-compress-1.26.1.jar
dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2103844998
Also, cc @cloud-fan and @HyukjinKwon
This fixes not only the Hive dependency but also a long-standing `libthrift`
library issue.
dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2103844347
Merged to master!
Thank you so much, @pan3793 and @sunchao .
From now on, many people will use Hive 2.3.10. I believe we can build more
confidence before Apache Spark
dongjoon-hyun closed pull request #46468: [SPARK-47018][BUILD][SQL] Bump
built-in Hive to 2.3.10
URL: https://github.com/apache/spark/pull/46468
dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2103781151
Thank you!
pan3793 commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2103709988
Hive 2.3.10 jars should be available on the Google Maven Central Mirror now;
re-triggered CI.
pan3793 commented on code in PR #46468:
URL: https://github.com/apache/spark/pull/46468#discussion_r1596161241
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala:
##
@@ -211,7 +211,7 @@ class HiveExternalCatalogVersionsSuite extends
dongjoon-hyun commented on PR #46468:
URL: https://github.com/apache/spark/pull/46468#issuecomment-2103271420
The Hive UT failure is because the `Google Maven Cache` seems to be a little
late syncing with `Maven Central`.
```
[info] - ADD JAR command 2 *** FAILED *** (154 milliseconds)
```
dongjoon-hyun commented on code in PR #46468:
URL: https://github.com/apache/spark/pull/46468#discussion_r1595872901
##
docs/sql-migration-guide.md:
##
@@ -1067,7 +1067,7 @@ Python UDF registration is unchanged.
Spark SQL is designed to be compatible with the Hive Metastore,
viirya commented on code in PR #46468:
URL: https://github.com/apache/spark/pull/46468#discussion_r1595798797
##
dev/deps/spark-deps-hadoop-3-hive-2.3:
##
@@ -184,7 +183,7 @@
kubernetes-model-storageclass/6.12.1//kubernetes-model-storageclass-6.12.1.jar
viirya commented on code in PR #46468:
URL: https://github.com/apache/spark/pull/46468#discussion_r1595793893
##
docs/sql-migration-guide.md:
##
@@ -1067,7 +1067,7 @@ Python UDF registration is unchanged.
Spark SQL is designed to be compatible with the Hive Metastore, SerDes