Thank you for raising it in the dev list. Given the breakage that removing it
would cause, compared with the cost of keeping and maintaining it, I do not
think we should remove HiveContext.

FYI, when releasing Spark 3.0, we had a lot of discussion about related
topics:
https://lists.apache.org/thread/mrx0y078cf3ozs7czykvv864y6dr55xq
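
For anyone following along, the replacement path proposed downthread
(SparkSession with enableHiveSupport) looks roughly like the sketch below.
This is a minimal Scala example, not code from the PR; the app name and table
name are placeholders, and it assumes the Hive dependencies are on the
classpath.

  // Old API, deprecated since Spark 2.0:
  //   val hiveCtx = new org.apache.spark.sql.hive.HiveContext(sc)
  //   hiveCtx.sql("SELECT * FROM some_table")

  // Replacement: a SparkSession with Hive support enabled.
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("hive-context-migration")  // placeholder app name
    .enableHiveSupport()                // requires a Hive-enabled build / Hive classes on the classpath
    .getOrCreate()

  spark.sql("SELECT * FROM some_table").show()  // "some_table" is a placeholder

Existing jobs that still construct HiveContext directly would need a change
along these lines, which is part of the breakage cost mentioned above.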


Dongjoon Hyun <dongjoon.h...@gmail.com> wrote on Wed, Nov 29, 2023 at 08:43:

> Thank you for the heads-up.
>
> I agree with your intention and the fact that it's not useful in Apache
> Spark 4.0.0.
>
> However, as you know, it was historically removed once and then explicitly
> added back in Apache Spark 3.0 via a vote.
>
> SPARK-31088 Add back HiveContext and createExternalTable
> (As a subtask of SPARK-31085 Amend Spark's Semantic Versioning Policy)
>
> Like you, I'd love to remove it too, but that is a little hard to do in
> Apache Spark 4.0.0 under our AS-IS versioning policy and history.
>
> I believe a new specific vote could make it possible to remove HiveContext
> (if we need to remove it).
>
> So, do you want to delete it from Apache Spark 4.0.0 via the official
> community vote with this thread context?
>
> Thanks,
> Dongjoon.
>
>
> On Wed, Nov 29, 2023 at 3:03 AM 杨杰 <yangji...@apache.org> wrote:
>
>> Hi all,
>>
>> In SPARK-46171 (apache/spark#44077 [1]), I’m trying to remove the
>> deprecated HiveContext from Apache Spark 4.0; HiveContext has been marked
>> as deprecated since Spark 2.0. It is a long-deprecated API that should now
>> be replaced by SparkSession with enableHiveSupport, so I think it's time to
>> remove it.
>>
>> Feel free to comment if you have any concerns.
>>
>> [1] https://github.com/apache/spark/pull/44077
>>
>> Thanks,
>> Jie Yang
>>
>
