+1
On May 28, 2014 7:05 PM, "Xiangrui Meng" <men...@gmail.com> wrote:

> +1
>
> Tested apps with standalone client mode and yarn cluster and client modes.
>
> Xiangrui
>
> On Wed, May 28, 2014 at 1:07 PM, Sean McNamara
> <sean.mcnam...@webtrends.com> wrote:
> > Pulled down, compiled, and tested examples on OS X and ubuntu.
> > Deployed app we are building on spark and poured data through it.
> >
> > +1
> >
> > Sean
> >
> >
> > On May 26, 2014, at 8:39 AM, Tathagata Das <tathagata.das1...@gmail.com> wrote:
> >
> >> Please vote on releasing the following candidate as Apache Spark version 1.0.0!
> >>
> >> This has a few important bug fixes on top of rc10:
> >> SPARK-1900 and SPARK-1918: https://github.com/apache/spark/pull/853
> >> SPARK-1870: https://github.com/apache/spark/pull/848
> >> SPARK-1897: https://github.com/apache/spark/pull/849
> >>
> >> The tag to be voted on is v1.0.0-rc11 (commit c69d97cd):
> >> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=c69d97cdb42f809cb71113a1db4194c21372242a
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11/
> >>
> >> Release artifacts are signed with the following key:
> >> https://people.apache.org/keys/committer/tdas.asc
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1019/
> >>
> >> The documentation corresponding to this release can be found at:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/
> >>
> >> Please vote on releasing this package as Apache Spark 1.0.0!
> >>
> >> The vote is open until Thursday, May 29, at 16:00 UTC and passes if
> >> a majority of at least 3 +1 PMC votes are cast.
> >>
> >> [ ] +1 Release this package as Apache Spark 1.0.0
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see
> >> http://spark.apache.org/
> >>
> >> == API Changes ==
> >> We welcome users to compile Spark applications against 1.0. There are
> >> a few API changes in this release. Here are links to the associated
> >> upgrade guides - user-facing changes have been kept as small as
> >> possible.
> >>
> >> Changes to ML vector specification:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/mllib-guide.html#from-09-to-10
> >>
> >> Changes to the Java API:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
> >>
> >> Changes to the streaming API:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x
> >>
> >> Changes to the GraphX API:
> >> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/graphx-programming-guide.html#upgrade-guide-from-spark-091
> >>
> >> Other changes:
> >> coGroup and related functions now return Iterable[T] instead of Seq[T]
> >> ==> Call toSeq on the result to restore the old behavior
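> >>
> >> For example, a minimal migration sketch (the `users`/`orders` RDDs and
> >> the app name are illustrative, not from this thread):
> >>
> >>   import org.apache.spark.{SparkConf, SparkContext}
> >>   import org.apache.spark.SparkContext._
> >>
> >>   val sc = new SparkContext(
> >>     new SparkConf().setAppName("cogroup-migration").setMaster("local"))
> >>   val users  = sc.parallelize(Seq((1, "alice"), (2, "bob")))
> >>   val orders = sc.parallelize(Seq((1, 9.99), (1, 4.50)))
> >>
> >>   // In 1.0, cogroup yields (Iterable[String], Iterable[Double]) values
> >>   // instead of (Seq[String], Seq[Double]).
> >>   val grouped = users.cogroup(orders)
> >>   // Call toSeq on each side to restore the pre-1.0 Seq-based shape.
> >>   val asSeqs = grouped.mapValues { case (us, os) => (us.toSeq, os.toSeq) }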
> >>
> >> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
> >> ==> Call toSeq on the result to restore old behavior
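> >>
> >> For example, a sketch of that call (`MyApp` is a placeholder for a class
> >> shipped in your application jar):
> >>
> >>   import org.apache.spark.SparkContext
> >>
> >>   // 1.0 returns Option[String]; toSeq restores the 0.9-style Seq[String].
> >>   val jars: Seq[String] = SparkContext.jarOfClass(classOf[MyApp]).toSeq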
> >
>
