Thank you very much, Dongjoon and Xiao Li, for the feedback.
After carefully reading
https://lists.apache.org/thread/mrx0y078cf3ozs7czykvv864y6dr55xq, I have
decided to abandon the removal of HiveContext. As Xiao Li said, its
maintenance cost is not high, but removing it would increase the cost of
Thank you for raising it on the dev list. I do not think we should remove
HiveContext, given the cost of breakage relative to its maintenance.
FYI, when releasing Spark 3.0, we had many discussions about related
topics:
https://lists.apache.org/thread/mrx0y078cf3ozs7czykvv864y6dr55xq
Dongjoon Hyun
Thank you for the heads-up.
I agree with your intention and the fact that it's not useful in Apache
Spark 4.0.0.
However, as you know, historically it was removed once and then explicitly
added back to Apache Spark 3.0 via a vote.
SPARK-31088 Add back HiveContext and createExternalTable
(As a
Hi all,
In SPARK-46171 (apache/spark#44077 [1]), I’m trying to remove the
deprecated HiveContext from Apache Spark 4.0, since HiveContext has been
marked as deprecated since Spark 2.0. This long-deprecated API should be
replaced with SparkSession with enableHiveSupport now, so I think
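For anyone migrating, the replacement mentioned above can be sketched as follows. This is a minimal Scala sketch, not the exact change in the PR; the app name and table name are illustrative:

```scala
// Before (deprecated since Spark 2.0): HiveContext built on a SparkContext
// import org.apache.spark.sql.hive.HiveContext
// val hiveCtx = new HiveContext(sc)
// hiveCtx.sql("SELECT * FROM my_table")

// After: a SparkSession with Hive support enabled
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-migration-example") // illustrative app name
  .enableHiveSupport()               // takes over HiveContext's role
  .getOrCreate()

spark.sql("SELECT * FROM my_table").show() // my_table is illustrative
```

enableHiveSupport() requires Hive classes on the classpath and configures the session to use the Hive metastore, SerDes, and Hive UDFs, which is what HiveContext previously provided.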