Let me give you the first one:

+1



On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pwend...@gmail.com> wrote:

> Hey all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
> exact commit and all other information is correct. (Thanks to Shivaram,
> who pointed this out.)
>
> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark version 1.4.0!
> >
> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=22596c534a38cfdda91aef18aa9037ab101e4251
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > [published as version: 1.4.0]
> > https://repository.apache.org/content/repositories/orgapachespark-1111/
> > [published as version: 1.4.0-rc4]
> > https://repository.apache.org/content/repositories/orgapachespark-1112/
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
> >
> > Please vote on releasing this package as Apache Spark 1.4.0!
> >
> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
> > if a majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.4.0
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see
> > http://spark.apache.org/
> >
> > == What has changed since RC3 ==
> > In addition to many smaller fixes, three blocker issues were fixed:
> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> > metadataHive get constructed too early
> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
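> >
> > A quick way to exercise the two PySpark fixes above is a small script
> > along these lines (an illustrative sketch, not an official test; submit
> > it with bin/spark-submit from the rc4 binaries - the sample data and
> > app name are made up):
> >
> >     from pyspark import SparkContext
> >     from pyspark.sql import SQLContext
> >     from pyspark.sql import functions as F
> >     from pyspark.sql.types import DecimalType
> >
> >     sc = SparkContext(appName="spark-1.4.0-rc4-smoke-test")
> >     sqlContext = SQLContext(sc)
> >
> >     # SPARK-8038: when()/otherwise() should build a working CASE expression
> >     df = sqlContext.createDataFrame([(1,), (2,), (3,)], ["x"])
> >     df.select(F.when(df.x > 2, "big").otherwise("small").alias("size")).show()
> >
> >     # SPARK-7978: DecimalType should no longer be a singleton, so two
> >     # instances with different precision/scale must be distinct objects
> >     assert DecimalType(10, 2) is not DecimalType(20, 4)
> >
> >     sc.stop()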
> >
> > == How can I help test this release? ==
> > If you are a Spark user, you can help us test this release by
> > taking an existing Spark 1.3 workload, running it on this release
> > candidate, and reporting any regressions.
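> >
> > For instance, a trivial stand-in for such a workload (the input path
> > and app name below are placeholders for whatever your real job uses),
> > submitted with the rc4 bin/spark-submit:
> >
> >     from pyspark import SparkContext
> >
> >     sc = SparkContext(appName="existing-1.3-workload")
> >
> >     # Classic word count standing in for a real 1.3 job; point the path
> >     # at data your existing workload already reads.
> >     counts = (sc.textFile("hdfs:///path/to/input")
> >                 .flatMap(lambda line: line.split())
> >                 .map(lambda word: (word, 1))
> >                 .reduceByKey(lambda a, b: a + b))
> >
> >     print(counts.take(10))
> >     sc.stop()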
> >
> > == What justifies a -1 vote for this release? ==
> > This vote is happening towards the end of the 1.4 QA period,
> > so -1 votes should only occur for significant regressions from 1.3.1.
> > Bugs already present in 1.3.X, minor regressions, or bugs related
> > to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
