+1, but just barely. We've got quite a number of outstanding bugs identified, and many of them have fixes in progress. I'd hate to see those efforts get lost in a post-1.0.0 flood of new features targeted at 1.1.0 -- in other words, I'd like to see 1.0.1 retain a high priority relative to 1.1.0.
Looking through the unresolved JIRAs, none of the identified bugs appear to be show-stoppers or strict regressions. (I will note that one I have in progress, SPARK-1749, is a bug we introduced with recent work. It's not strictly a regression: previously the DAGScheduler exceptions weren't handled at all, which was equally bad behavior, whereas now they are slightly mis-handled.) So I'm not currently seeing a reason not to release.

On Tue, May 13, 2014 at 1:36 AM, Patrick Wendell <pwend...@gmail.com> wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 1.0.0!
>
> The tag to be voted on is v1.0.0-rc5 (commit 18f0623):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=18f062303303824139998e8fc8f4158217b0dbc3
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1012/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/
>
> Please vote on releasing this package as Apache Spark 1.0.0!
>
> The vote is open until Friday, May 16, at 09:30 UTC and passes if a
> majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == API Changes ==
> We welcome users to compile Spark applications against 1.0. There are
> a few API changes in this release. Here are links to the associated
> upgrade guides -- user-facing changes have been kept as small as
> possible.
> changes to ML vector specification:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/mllib-guide.html#from-09-to-10
>
> changes to the Java API:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
>
> changes to the streaming API:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x
>
> changes to the GraphX API:
> http://people.apache.org/~pwendell/spark-1.0.0-rc5-docs/graphx-programming-guide.html#upgrade-guide-from-spark-091
>
> coGroup and related functions now return Iterable[T] instead of Seq[T]
> ==> Call toSeq on the result to restore the old behavior
>
> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
> ==> Call toSeq on the result to restore old behavior
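For anyone updating code for the two toSeq changes quoted above, here is a minimal sketch in plain Scala (no Spark dependency; the values are illustrative, not taken from any real job):

```scala
object ToSeqMigration {
  def main(args: Array[String]): Unit = {
    // coGroup-style results are now Iterable[T] rather than Seq[T].
    // Shown here with a plain Iterable; in Spark it would come from
    // calling cogroup on pair RDDs.
    val grouped: Iterable[Int] = List(1, 2, 3)
    // Calling toSeq restores the pre-1.0 Seq behavior, e.g. indexing:
    val asSeq: Seq[Int] = grouped.toSeq
    println(asSeq(0)) // 1

    // SparkContext.jarOfClass now returns Option[String] instead of
    // Seq[String]. toSeq turns Some(jar) into a one-element Seq and
    // None into an empty Seq, matching the old return shape.
    val jar: Option[String] = Some("app.jar") // illustrative value
    val jars: Seq[String] = jar.toSeq
    println(jars.mkString(",")) // app.jar
  }
}
```

The same one-liner (`.toSeq`) covers both cases because both `Iterable` and `Option` expose it in the Scala collections API.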