Yeah, the approach seems OK to me - please double-check that the doc generation in the Spark repo won't fail after the move of the js file. Other than that, it would probably just be a matter of updating the release process.
On Tue, Mar 5, 2024 at 7:24 PM Pan,Bingkun <panbing...@baidu.com> wrote:

> Okay, I see.
>
> Perhaps we can solve this confusion by sharing the same `version.json`
> file across all versions in the Spark website repo, so that every version
> of the documentation displays the same data in the dropdown menu.
> ------------------------------
> *From:* Jungtaek Lim <kabhwan.opensou...@gmail.com>
> *Sent:* March 5, 2024 17:09:07
> *To:* Pan,Bingkun
> *Cc:* Dongjoon Hyun; dev; user
> *Subject:* Re: [ANNOUNCE] Apache Spark 3.5.1 released
>
> Let me be more specific.
>
> We have two active release version lines, 3.4.x and 3.5.x. We just
> released Spark 3.5.1, so its dropdown shows 3.5.1 and 3.4.2, given that
> the latest version of the 3.4.x line is 3.4.2. Suppose a month later we
> release Spark 3.4.3. The dropdown of Spark 3.4.3 will show 3.5.1 and
> 3.4.3. But if we stop there, 3.5.1 (still the latest) won't show 3.4.3 in
> its dropdown, giving the impression that 3.4.3 was never released.
>
> This is just the case of two active release version lines, keeping only
> the latest version of each line. If you expand this to EOLed version
> lines, and to versions which aren't the latest in their line, the problem
> gets much more complicated.
>
> On Tue, Mar 5, 2024 at 6:01 PM Pan,Bingkun <panbing...@baidu.com> wrote:
>
>> Based on my understanding, we should not update versions that have
>> already been released, such as the situation you mentioned: `But what
>> about the dropdown of version D? Should we add E in the dropdown?` We
>> only need to record the latest `version.json` file that has already been
>> published at the time of each new document release.
>>
>> Of course, if we need to keep every document's dropdown up to date, I
>> think that's also possible - it only requires sharing the same
>> `version.json` file across every version.
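[Editor's note: the single shared file Pan proposes maps naturally onto the version switcher of pydata-sphinx-theme, which the PySpark docs are built with; the switcher fetches a JSON list of entries. Below is a minimal sketch of what such a shared file could contain - the entries, URLs, and the assumption that Spark's theme is configured to read this exact format are illustrative, not the actual setup.]

```python
import json

# Hypothetical contents of a shared version.json, in the
# pydata-sphinx-theme switcher format: one entry per release,
# identical for every documentation version that loads it.
shared_versions = [
    {"name": "3.5.1 (stable)", "version": "3.5.1",
     "url": "https://spark.apache.org/docs/3.5.1/api/python/"},
    {"name": "3.4.2", "version": "3.4.2",
     "url": "https://spark.apache.org/docs/3.4.2/api/python/"},
]

# The switcher expects these keys on every entry.
for entry in shared_versions:
    assert {"name", "version", "url"} <= entry.keys()

print(json.dumps(shared_versions, indent=2))
```

Because every documentation version would fetch this one file at page-load time, adding 3.4.3 here would update the dropdown of already-published docs without rebuilding them - which is exactly the property the thread is after.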
>> ------------------------------
>> *From:* Jungtaek Lim <kabhwan.opensou...@gmail.com>
>> *Sent:* March 5, 2024 16:47:30
>> *To:* Pan,Bingkun
>> *Cc:* Dongjoon Hyun; dev; user
>> *Subject:* Re: [ANNOUNCE] Apache Spark 3.5.1 released
>>
>> But this does not answer my question about updating the dropdown for the
>> docs of "already released versions", right?
>>
>> Let's say we just released version D, and its dropdown has versions A,
>> B, C. We have another release tomorrow as version E, and it's easy
>> enough to put A, B, C, D in the dropdown of E. But what about the
>> dropdown of version D? Should we add E to it? How do we maintain this if
>> we have 10 more releases afterwards?
>>
>> On Tue, Mar 5, 2024 at 5:27 PM Pan,Bingkun <panbing...@baidu.com> wrote:
>>
>>> As I understand it, the original intention of this feature is that when
>>> users are in the pyspark documentation and find that the version they
>>> are currently viewing is not the one they want, they can easily jump to
>>> the desired version by clicking the dropdown box. Additionally, the PR
>>> that would have automated this mechanism was not merged:
>>>
>>> https://github.com/apache/spark/pull/42881
>>>
>>> So we need to update this file manually. I can submit a manual update
>>> first to get this feature working.
>>> ------------------------------
>>> *From:* Jungtaek Lim <kabhwan.opensou...@gmail.com>
>>> *Sent:* March 4, 2024 6:34:42
>>> *To:* Dongjoon Hyun
>>> *Cc:* dev; user
>>> *Subject:* Re: [ANNOUNCE] Apache Spark 3.5.1 released
>>>
>>> Shall we revisit this functionality? The API doc is built per
>>> individual version, yet for each individual version we depend on other
>>> released versions. This does not seem right to me. Also, the
>>> functionality exists only in the PySpark API doc, which is inconsistent
>>> as well.
>>>
>>> I don't think this is manageable with the current approach (listing
>>> versions in version-dependent docs). Let's say we release 3.4.3 after
>>> 3.5.1. Should we update the versions in 3.5.1 to add 3.4.3 to the
>>> version switcher? What about the time we release a new version after
>>> releasing 10 versions? What are the criteria for pruning versions?
>>>
>>> Unless we have good answers to these questions, I think it's better to
>>> revert the functionality - it missed various considerations.
>>>
>>> On Fri, Mar 1, 2024 at 2:44 PM Jungtaek Lim <
>>> kabhwan.opensou...@gmail.com> wrote:
>>>
>>>> Thanks for reporting - this is odd - the dropdown did not exist in
>>>> other recent releases.
>>>>
>>>> https://spark.apache.org/docs/3.5.0/api/python/index.html
>>>> https://spark.apache.org/docs/3.4.2/api/python/index.html
>>>> https://spark.apache.org/docs/3.3.4/api/python/index.html
>>>>
>>>> It looks like the dropdown feature was introduced recently but only
>>>> partially completed. The dropdown itself was added, but the procedure
>>>> for bumping the version was never documented. The contributor proposed
>>>> a way to update the version "automatically", but the PR wasn't merged.
>>>> As a result, we have neither instructions for bumping the version
>>>> manually nor an automatic bump.
>>>>
>>>> * PR for addition of dropdown:
>>>> https://github.com/apache/spark/pull/42428
>>>> * PR for automatically bumping version:
>>>> https://github.com/apache/spark/pull/42881
>>>>
>>>> We will probably need to add an instruction to the release process to
>>>> update the version. (For automatic bumping I don't have a good idea.)
>>>> I'll look into it. Please expect some delay during the holiday weekend
>>>> in South Korea.
>>>>
>>>> Thanks again.
>>>> Jungtaek Lim (HeartSaVioR)
>>>>
>>>>
>>>> On Fri, Mar 1, 2024 at 2:14 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>>>> wrote:
>>>>
>>>>> BTW, Jungtaek.
>>>>>
>>>>> The PySpark documentation seems to show the wrong branch - at the
>>>>> moment, `master`:
>>>>>
>>>>> https://spark.apache.org/docs/3.5.1/api/python/index.html
>>>>>
>>>>> PySpark Overview
>>>>>
>>>>> Date: Feb 24, 2024 Version: master
>>>>>
>>>>> [image: Screenshot 2024-02-29 at 21.12.24.png]
>>>>>
>>>>> Could you do the follow-up, please?
>>>>>
>>>>> Thank you in advance.
>>>>>
>>>>> Dongjoon.
>>>>>
>>>>>
>>>>> On Thu, Feb 29, 2024 at 2:48 PM John Zhuge <jzh...@apache.org> wrote:
>>>>>
>>>>>> Excellent work, congratulations!
>>>>>>
>>>>>> On Wed, Feb 28, 2024 at 10:12 PM Dongjoon Hyun <
>>>>>> dongjoon.h...@gmail.com> wrote:
>>>>>>
>>>>>>> Congratulations!
>>>>>>>
>>>>>>> Bests,
>>>>>>> Dongjoon.
>>>>>>>
>>>>>>> On Wed, Feb 28, 2024 at 11:43 AM beliefer <belie...@163.com> wrote:
>>>>>>>
>>>>>>>> Congratulations!
>>>>>>>>
>>>>>>>>
>>>>>>>> At 2024-02-28 17:43:25, "Jungtaek Lim" <
>>>>>>>> kabhwan.opensou...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> Hi everyone,
>>>>>>>>
>>>>>>>> We are happy to announce the availability of Spark 3.5.1!
>>>>>>>>
>>>>>>>> Spark 3.5.1 is a maintenance release containing stability fixes.
>>>>>>>> This release is based on the branch-3.5 maintenance branch of
>>>>>>>> Spark. We strongly recommend that all 3.5 users upgrade to this
>>>>>>>> stable release.
>>>>>>>>
>>>>>>>> To download Spark 3.5.1, head over to the download page:
>>>>>>>> https://spark.apache.org/downloads.html
>>>>>>>>
>>>>>>>> To view the release notes:
>>>>>>>> https://spark.apache.org/releases/spark-release-3-5-1.html
>>>>>>>>
>>>>>>>> We would like to acknowledge all community members for
>>>>>>>> contributing to this release. This release would not have been
>>>>>>>> possible without you.
>>>>>>>>
>>>>>>>> Jungtaek Lim
>>>>>>>>
>>>>>>>> ps. Yikun is helping us with releasing the official Docker image
>>>>>>>> for Spark 3.5.1 (thanks Yikun!). It may take some time to become
>>>>>>>> generally available.
>>>>>>>>
>>>>>>>>
>>>>>>
>>>>>> --
>>>>>> John Zhuge
>>>>>>
>>>>>
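[Editor's note: the manual release-process step discussed upthread could be as small as a script the release manager runs once per release against the shared file. The sketch below is illustrative only - the file path, URL layout, and the "highest version number is stable" policy are assumptions for discussion, not the actual Spark release process.]

```python
import json
from pathlib import Path

def add_release(version_json: Path, version: str) -> None:
    """Add a new release entry to a shared switcher file and re-mark
    the highest version as stable (illustrative logic only)."""
    entries = json.loads(version_json.read_text()) if version_json.exists() else []
    entries.append({
        "version": version,
        "url": f"https://spark.apache.org/docs/{version}/api/python/",
    })
    # The highest version number, not the newest release date, is "stable":
    # this handles a 3.4.x maintenance release landing after 3.5.x.
    entries.sort(key=lambda e: tuple(map(int, e["version"].split("."))),
                 reverse=True)
    for i, e in enumerate(entries):
        e["name"] = f'{e["version"]} (stable)' if i == 0 else e["version"]
    version_json.write_text(json.dumps(entries, indent=2) + "\n")
```

Running `add_release(path, "3.5.1")` and then `add_release(path, "3.4.3")` leaves 3.5.1 marked stable, which addresses the out-of-order maintenance-release scenario Jungtaek raises; pruning old version lines would still need a policy decision on top of this.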