https://spark.apache.org/downloads.html

The *2. Choose a package type:* menu shows "Pre-built for Hadoop 3.3",

but the download link is *spark-3.2.1-bin-hadoop3.2.tgz*.

Does this need an update?

L. C. Hsieh <vii...@gmail.com> wrote on Sat, Jan 29, 2022 at 14:26:

> Thanks Huaxin for the 3.2.1 release!
>
> On Fri, Jan 28, 2022 at 10:14 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
> wrote:
> >
> > Thank you again, Huaxin!
> >
> > Dongjoon.
> >
> > On Fri, Jan 28, 2022 at 6:23 PM DB Tsai <dbt...@dbtsai.com> wrote:
> >>
> >> Thank you, Huaxin for the 3.2.1 release!
> >>
> >> Sent from my iPhone
> >>
> >> On Jan 28, 2022, at 5:45 PM, Chao Sun <sunc...@apache.org> wrote:
> >>
> >> 
> >> Thanks Huaxin for driving the release!
> >>
> >> On Fri, Jan 28, 2022 at 5:37 PM Ruifeng Zheng <ruife...@foxmail.com>
> >> wrote:
> >>>
> >>> It's great!
> >>> Congrats and thanks, Huaxin!
> >>>
> >>>
> >>> ------------------ Original Message ------------------
> >>> From: "huaxin gao" <huaxin.ga...@gmail.com>;
> >>> Date: Saturday, Jan 29, 2022, 9:07 AM
> >>> To: "dev"<d...@spark.apache.org>;"user"<user@spark.apache.org>;
> >>> Subject: [ANNOUNCE] Apache Spark 3.2.1 released
> >>>
> >>> We are happy to announce the availability of Spark 3.2.1!
> >>>
> >>> Spark 3.2.1 is a maintenance release containing stability fixes. This
> >>> release is based on the branch-3.2 maintenance branch of Spark. We
> >>> strongly recommend that all 3.2 users upgrade to this stable release.
> >>>
> >>> To download Spark 3.2.1, head over to the download page:
> >>> https://spark.apache.org/downloads.html
> >>>
> >>> To view the release notes:
> >>> https://spark.apache.org/releases/spark-release-3-2-1.html
> >>>
> >>> We would like to acknowledge all community members for contributing to
> >>> this release. This release would not have been possible without you.
> >>>
> >>> Huaxin Gao
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 
*camper42 (曹丰宇)*
Douban, Inc.

Mobile: +86 15691996359
E-mail:  camper.x...@gmail.com
