Looks like it was an issue with wget not fetching all the artifacts, my bad!
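For anyone else re-checking staging artifacts by hand, a rough sketch of the per-file digest check (the file name here is a stand-in for illustration; Maven-style staging repos publish a .sha1 sidecar next to each jar, which you would fetch alongside the artifact):

```shell
# Stand-in "artifact" and sidecar; with the real staging repo you would
# wget the jar and its .sha1 file instead of generating them locally.
printf 'fake jar bytes' > artifact.jar
sha1sum artifact.jar | awk '{print $1}' > artifact.jar.sha1

# Verify: sha1sum -c expects "<digest>  <filename>" lines on stdin.
echo "$(cat artifact.jar.sha1)  artifact.jar" | sha1sum -c -
# prints "artifact.jar: OK" when the digest matches
```

The same pattern (swap in sha512sum and the .sha512 files) applies to the release binaries under dist.apache.org.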

Looks good to me, +1 for release - thanks!


Regards,
Mridul


On Sat, Feb 11, 2023 at 12:11 PM L. C. Hsieh <vii...@gmail.com> wrote:

> Hi Mridul,
>
> Thanks for testing it.
>
> I can see the artifact in
>
> https://repository.apache.org/content/repositories/orgapachespark-1433/org/apache/spark/spark-mllib-local_2.13/3.3.2/
> Did I miss something?
>
> Liang-Chi
>
> On Sat, Feb 11, 2023 at 10:08 AM Mridul Muralidharan <mri...@gmail.com>
> wrote:
> >
> >
> > Hi,
> >
> > The following file is missing in the staging repository - there is a
> > corresponding asc sig file, without the artifact.
> > * org/apache/spark/spark-mllib-local_2.13/3.3.2/spark-mllib-local_2.13-3.3.2-test-sources.jar
> > Can we have this fixed, please?
> >
> > The rest of the signatures, digests, etc. check out fine.
> >
> > Built and tested with "-Phive -Pyarn -Pmesos -Pkubernetes".
> >
> > Regards,
> > Mridul
> >
> >
> >
> >
> > On Fri, Feb 10, 2023 at 11:01 PM L. C. Hsieh <vii...@gmail.com> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> >> version 3.3.2.
> >>
> >> The vote is open until Feb 15th 9AM (PST) and passes if a majority +1
> >> PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.3.2
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see https://spark.apache.org/
> >>
> >> The tag to be voted on is v3.3.2-rc1 (commit
> >> 5103e00c4ce5fcc4264ca9c4df12295d42557af6):
> >> https://github.com/apache/spark/tree/v3.3.2-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.2-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1433/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.2-rc1-docs/
> >>
> >> The list of bug fixes going into 3.3.2 can be found at the following
> >> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352299
> >>
> >> This release is using the release script of the tag v3.3.2-rc1.
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >>
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload, running it on this release candidate, and
> >> reporting any regressions.
> >>
> >> If you're working in PySpark, you can set up a virtual env, install
> >> the current RC, and see if anything important breaks. In Java/Scala,
> >> you can add the staging repository to your project's resolvers and
> >> test with the RC (make sure to clean up the artifact cache
> >> before/after so you don't end up building with an out-of-date RC
> >> going forward).
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.3.2?
> >> ===========================================
> >>
> >> The current list of open tickets targeted at 3.3.2 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK and search for "Target
> >> Version/s" = 3.3.2
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >>
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That said, if there is a regression that has not been
> >> correctly targeted, please ping me or a committer to help target the
> >> issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>
