Sorry for the duplication, but I think this is the current VOTE candidate
-- we're not voting on rc8 yet?

+1, but just barely.  We've got quite a number of outstanding bugs
identified, and many of them have fixes in progress.  I'd hate to see those
efforts get lost in a post-1.0.0 flood of new features targeted at 1.1.0 --
in other words, I'd like to see 1.0.1 retain a high priority relative to
1.1.0.

Looking through the unresolved JIRAs, none of the identified bugs look like
show-stoppers or strict regressions, so I'm not currently seeing a reason
not to release. (I will note that one I have in progress, SPARK-1749, is a
bug we introduced with recent work; it's not strictly a regression, though,
because the behavior before was equally bad but different -- previously the
DAGScheduler exceptions weren't handled at all, whereas now they are
slightly mis-handled.)


On Fri, May 16, 2014 at 11:42 AM, Henry Saputra <henry.sapu...@gmail.com> wrote:

> Ah ok, thanks Aaron
>
> Just to make sure we VOTE the right RC.
>
> Thanks,
>
> Henry
>
> On Fri, May 16, 2014 at 11:37 AM, Aaron Davidson <ilike...@gmail.com>
> wrote:
> > It was, but due to the Apache infra issues, some may not have received
> > the email yet...
> >
> > On Fri, May 16, 2014 at 10:48 AM, Henry Saputra <henry.sapu...@gmail.com>
> > wrote:
> >>
> >> Hi Patrick,
> >>
> >> Just want to make sure that the VOTE for rc6 is also cancelled?
> >>
> >>
> >> Thanks,
> >>
> >> Henry
> >>
> >> On Thu, May 15, 2014 at 1:15 AM, Patrick Wendell <pwend...@gmail.com>
> >> wrote:
> >> > I'll start the voting with a +1.
> >> >
> >> > On Thu, May 15, 2014 at 1:14 AM, Patrick Wendell <pwend...@gmail.com>
> >> > wrote:
> >> >> Please vote on releasing the following candidate as Apache Spark
> >> >> version 1.0.0!
> >> >>
> >> >> This patch has minor documentation changes and fixes on top of rc6.
> >> >>
> >> >> The tag to be voted on is v1.0.0-rc7 (commit 9212b3e):
> >> >> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=9212b3e5bb5545ccfce242da8d89108e6fb1c464
> >> >>
> >> >> The release files, including signatures, digests, etc. can be found at:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc7/
> >> >>
> >> >> Release artifacts are signed with the following key:
> >> >> https://people.apache.org/keys/committer/pwendell.asc
> >> >>
> >> >> The staging repository for this release can be found at:
> >> >> https://repository.apache.org/content/repositories/orgapachespark-1015
> >> >>
> >> >> The documentation corresponding to this release can be found at:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc7-docs/
> >> >>
> >> >> Please vote on releasing this package as Apache Spark 1.0.0!
> >> >>
> >> >> The vote is open until Sunday, May 18, at 09:12 UTC and passes if a
> >> >> majority of at least 3 +1 PMC votes are cast.
> >> >>
> >> >> [ ] +1 Release this package as Apache Spark 1.0.0
> >> >> [ ] -1 Do not release this package because ...
> >> >>
> >> >> To learn more about Apache Spark, please see
> >> >> http://spark.apache.org/
> >> >>
> >> >> == API Changes ==
> >> >> We welcome users to compile Spark applications against 1.0. There are
> >> >> a few API changes in this release. Here are links to the associated
> >> >> upgrade guides - user facing changes have been kept as small as
> >> >> possible.
> >> >>
> >> >> changes to ML vector specification:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/mllib-guide.html#from-09-to-10
> >> >>
> >> >> changes to the Java API:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
> >> >>
> >> >> changes to the streaming API:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x
> >> >>
> >> >> changes to the GraphX API:
> >> >> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/graphx-programming-guide.html#upgrade-guide-from-spark-091
> >> >>
> >> >> coGroup and related functions now return Iterable[T] instead of Seq[T]
> >> >> ==> Call toSeq on the result to restore the old behavior
> >> >>
> >> >> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
> >> >> ==> Call toSeq on the result to restore old behavior
> >
> >
>
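The two migration notes at the end of the quoted release announcement (coGroup values changing from Seq[T] to Iterable[T], and SparkContext.jarOfClass changing from Seq[String] to Option[String]) both suggest the same one-line fix: call toSeq on the result. A minimal plain-Scala sketch of that pattern, using stand-in values rather than a live SparkContext:

```scala
// coGroup values are now Iterable[T]; .toSeq restores the old Seq[T] shape.
val groupedValues: Iterable[String] = List("a", "b") // stands in for one cogrouped key's values
val asSeq: Seq[String] = groupedValues.toSeq

// jarOfClass now returns Option[String]; .toSeq restores the old Seq[String]
// shape: Some(jar) becomes Seq(jar), and None becomes an empty Seq.
val maybeJar: Option[String] = Some("/path/to/app.jar") // stands in for a jarOfClass result
val jars: Seq[String] = maybeJar.toSeq

println(asSeq)
println(jars)
```

The values here are placeholders, not actual Spark calls, but the toSeq conversions are exactly what the upgrade guides above prescribe.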
