That sounds like a great suggestion.

From: Jungtaek Lim <kabhwan.opensou...@gmail.com>
Date: Tuesday, March 5, 2024, 10:46
To: Hyukjin Kwon <gurwls...@apache.org>
Cc: yangjie01 <yangji...@baidu.com>, Dongjoon Hyun <dongjoon.h...@gmail.com>, 
dev <dev@spark.apache.org>, user <u...@spark.apache.org>
Subject: Re: [ANNOUNCE] Apache Spark 3.5.1 released

Yes, it's related to that PR. I wonder whether, if we want to expose a version 
switcher, it should live in the versionless docs (spark-website) rather than in 
docs pinned to a specific version.
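For illustration only: assuming the dropdown is pydata-sphinx-theme's version switcher (which supports loading its version list from a remote JSON file), a minimal sketch of the versionless approach might look like the following. The `versions.json` location on spark-website and the exact navbar layout are assumptions, not what the PR actually configured.

```python
# Hypothetical Sphinx conf.py fragment for the PySpark docs.
# The option names follow pydata-sphinx-theme's version switcher; the
# versions.json URL on the versionless spark-website is an assumption.
release = "3.5.1"

html_theme_options = {
    "switcher": {
        # Hosted once on spark-website, so docs that are already released
        # pick up newly added versions without being rebuilt.
        "json_url": "https://spark.apache.org/static/versions.json",
        # Highlights the entry matching the docs currently being viewed.
        "version_match": release,
    },
    "navbar_end": ["version-switcher", "navbar-icon-links"],
}

# versions.json, maintained in spark-website, would then be a flat list:
# [
#   {"name": "3.5.1 (stable)", "version": "3.5.1",
#    "url": "https://spark.apache.org/docs/3.5.1/api/python/"},
#   {"name": "3.4.2", "version": "3.4.2",
#    "url": "https://spark.apache.org/docs/3.4.2/api/python/"}
# ]
```

With this shape, releasing 3.4.3 after 3.5.1 would only require editing the single JSON file, which sidesteps the pruning question for already-published docs.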

On Tue, Mar 5, 2024 at 11:18 AM Hyukjin Kwon <gurwls...@apache.org> wrote:
Is this related to https://github.com/apache/spark/pull/42428?

cc @Yang,Jie(INF)

On Mon, 4 Mar 2024 at 22:21, Jungtaek Lim <kabhwan.opensou...@gmail.com> wrote:
Shall we revisit this functionality? The API docs are built per version, and 
each version's docs would have to depend on other released versions. That does 
not seem right to me. The functionality also exists only in the PySpark API 
docs, which is inconsistent as well.

I don't think this is manageable with the current approach (listing versions in 
version-dependent docs). Say we release 3.4.3 after 3.5.1: should we update the 
3.5.1 docs to add 3.4.3 to the version switcher? What happens once we have 
released ten more versions? What are the criteria for pruning the list?

Unless we have good answers to these questions, I think it's better to revert 
the functionality; it missed several considerations.

On Fri, Mar 1, 2024 at 2:44 PM Jungtaek Lim <kabhwan.opensou...@gmail.com> wrote:
Thanks for reporting. This is odd; the dropdown does not exist in other recent 
releases.

https://spark.apache.org/docs/3.5.0/api/python/index.html
https://spark.apache.org/docs/3.4.2/api/python/index.html
https://spark.apache.org/docs/3.3.4/api/python/index.html

It looks like the dropdown feature was introduced recently but only partially 
completed: the dropdown itself was added, but the procedure for bumping the 
version was never documented.
The contributor proposed a way to update the version "automatically", but that 
PR wasn't merged. As a result, we have neither instructions for bumping the 
version manually nor an automatic bump.

* PR for adding the dropdown: https://github.com/apache/spark/pull/42428
* PR for automatically bumping the version: 
https://github.com/apache/spark/pull/42881

We will probably need to add a step to the release process to update the 
version. (I don't have a good idea for automatic bumping.)
I'll look into it. Please expect some delay during the holiday weekend in 
South Korea.

Thanks again.
Jungtaek Lim (HeartSaVioR)


On Fri, Mar 1, 2024 at 2:14 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
BTW, Jungtaek.

The PySpark documentation seems to show the wrong branch, currently `master`:

    
https://spark.apache.org/docs/3.5.1/api/python/index.html

    PySpark Overview

       Date: Feb 24, 2024 Version: master



Could you do the follow-up, please?

Thank you in advance.

Dongjoon.


On Thu, Feb 29, 2024 at 2:48 PM John Zhuge <jzh...@apache.org> wrote:
Excellent work, congratulations!

On Wed, Feb 28, 2024 at 10:12 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
Congratulations!

Bests,
Dongjoon.

On Wed, Feb 28, 2024 at 11:43 AM beliefer <belie...@163.com> wrote:

Congratulations!





At 2024-02-28 17:43:25, "Jungtaek Lim" <kabhwan.opensou...@gmail.com> wrote:
Hi everyone,

We are happy to announce the availability of Spark 3.5.1!

Spark 3.5.1 is a maintenance release containing stability fixes. This
release is based on the branch-3.5 maintenance branch of Spark. We strongly
recommend that all 3.5 users upgrade to this stable release.

To download Spark 3.5.1, head over to the download page:
https://spark.apache.org/downloads.html

To view the release notes:
https://spark.apache.org/releases/spark-release-3-5-1.html

We would like to acknowledge all community members for contributing to this
release. This release would not have been possible without you.

Jungtaek Lim

ps. Yikun is helping us with releasing the official Docker image for Spark 
3.5.1 (thanks, Yikun!). It may take some time to become generally available.



--
John Zhuge
