Is this related to https://github.com/apache/spark/pull/42428?

cc @Yang,Jie(INF) <yangji...@baidu.com>
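For context on the pruning question raised below: a hypothetical sketch of one possible pruning rule for a docs version switcher (keep only the latest maintenance release per minor line). The entry shape follows pydata-sphinx-theme's switcher.json convention ({"name", "version", "url"}); the rule itself and the URLs are illustrative assumptions, not anything Spark has decided.

```python
# Hypothetical sketch: one possible pruning criterion for a docs
# version switcher -- keep only the highest patch release per minor line.

def prune_versions(released):
    """Keep the highest patch release for each (major, minor) line."""
    latest = {}
    for v in released:
        major, minor, patch = (int(x) for x in v.split("."))
        key = (major, minor)
        if key not in latest or patch > latest[key]:
            latest[key] = patch
    return sorted(f"{ma}.{mi}.{pa}" for (ma, mi), pa in latest.items())

def switcher_entries(versions):
    """Render pruned versions as switcher.json-style entries
    (shape borrowed from pydata-sphinx-theme; URLs illustrative)."""
    return [
        {
            "name": v,
            "version": v,
            "url": f"https://spark.apache.org/docs/{v}/api/python/",
        }
        for v in versions
    ]

if __name__ == "__main__":
    released = ["3.3.4", "3.4.2", "3.4.3", "3.5.0", "3.5.1"]
    print(prune_versions(released))  # ['3.3.4', '3.4.3', '3.5.1']
```

With a rule like this, releasing 3.4.3 after 3.5.1 would simply replace the 3.4.2 entry rather than growing the list without bound.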

On Mon, 4 Mar 2024 at 22:21, Jungtaek Lim <kabhwan.opensou...@gmail.com>
wrote:

> Shall we revisit this functionality? The API doc is built per release,
> yet each release's doc now depends on the set of other released
> versions, which does not seem right to me. The functionality also
> exists only in the PySpark API doc, which is inconsistent.
>
> I don't think this is manageable with the current approach (listing
> versions inside version-specific docs). Say we release 3.4.3 after
> 3.5.1: should we go back and add 3.4.3 to the version switcher in the
> 3.5.1 docs? What happens once we have released ten more versions? What
> is the criterion for pruning old versions?
>
> Unless we have good answers to these questions, I think it's better to
> revert the functionality; it was added without these considerations.
>
> On Fri, Mar 1, 2024 at 2:44 PM Jungtaek Lim <kabhwan.opensou...@gmail.com>
> wrote:
>
>> Thanks for reporting - this is odd - the dropdown does not exist in
>> other recent releases.
>>
>> https://spark.apache.org/docs/3.5.0/api/python/index.html
>> https://spark.apache.org/docs/3.4.2/api/python/index.html
>> https://spark.apache.org/docs/3.3.4/api/python/index.html
>>
>> Looks like the dropdown feature was introduced recently but only
>> partially completed: the dropdown itself was added, but the procedure
>> for bumping the version was never documented. The contributor proposed
>> a way to update the version "automatically", but that PR wasn't
>> merged. As a result, we have neither instructions for bumping the
>> version manually nor an automatic bump.
>>
>> * PR for addition of dropdown: https://github.com/apache/spark/pull/42428
>> * PR for automatically bumping version:
>> https://github.com/apache/spark/pull/42881
>>
>> We will probably need to add a step to the release process to update
>> the version. (I don't have a good idea for automatic bumping.)
>> I'll look into it. Please expect some delay, as it's a holiday
>> weekend in South Korea.
>>
>> Thanks again.
>> Jungtaek Lim (HeartSaVioR)
>>
>>
>> On Fri, Mar 1, 2024 at 2:14 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>> wrote:
>>
>>> BTW, Jungtaek.
>>>
>>> The PySpark documentation seems to show the wrong branch. In this case, `master`.
>>>
>>>     https://spark.apache.org/docs/3.5.1/api/python/index.html
>>>
>>>     PySpark Overview
>>> <https://spark.apache.org/docs/3.5.1/api/python/index.html#pyspark-overview>
>>>
>>>        Date: Feb 24, 2024 Version: master
>>>
>>> [image: Screenshot 2024-02-29 at 21.12.24.png]
>>>
>>>
>>> Could you do the follow-up, please?
>>>
>>> Thank you in advance.
>>>
>>> Dongjoon.
>>>
>>>
>>> On Thu, Feb 29, 2024 at 2:48 PM John Zhuge <jzh...@apache.org> wrote:
>>>
>>>> Excellent work, congratulations!
>>>>
>>>> On Wed, Feb 28, 2024 at 10:12 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>>>> wrote:
>>>>
>>>>> Congratulations!
>>>>>
>>>>> Bests,
>>>>> Dongjoon.
>>>>>
>>>>> On Wed, Feb 28, 2024 at 11:43 AM beliefer <belie...@163.com> wrote:
>>>>>
>>>>>> Congratulations!
>>>>>>
>>>>>>
>>>>>>
>>>>>> At 2024-02-28 17:43:25, "Jungtaek Lim" <kabhwan.opensou...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> Hi everyone,
>>>>>>
>>>>>> We are happy to announce the availability of Spark 3.5.1!
>>>>>>
>>>>>> Spark 3.5.1 is a maintenance release containing stability fixes. This
>>>>>> release is based on the branch-3.5 maintenance branch of Spark. We
>>>>>> strongly
>>>>>> recommend that all 3.5 users upgrade to this stable release.
>>>>>>
>>>>>> To download Spark 3.5.1, head over to the download page:
>>>>>> https://spark.apache.org/downloads.html
>>>>>>
>>>>>> To view the release notes:
>>>>>> https://spark.apache.org/releases/spark-release-3-5-1.html
>>>>>>
>>>>>> We would like to acknowledge all community members for contributing
>>>>>> to this
>>>>>> release. This release would not have been possible without you.
>>>>>>
>>>>>> Jungtaek Lim
>>>>>>
>>>>>> P.S. Yikun is helping us release the official Docker image for
>>>>>> Spark 3.5.1 (thanks, Yikun!). It may take some time to become
>>>>>> generally available.
>>>>>>
>>>>>>
>>>>
>>>> --
>>>> John Zhuge
>>>>
>>>