+1

Tested it on both Windows and Mac OS X, with both Scala and Python. Confirmed 
that the issues in the previous RC were fixed.

Matei

On May 20, 2014, at 5:28 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> +1 (non-binding)
> 
> I have:
> - checked signatures and checksums of the files
> - built the code from the git repo using both sbt and mvn (against hadoop 
> 2.3.0)
> - ran a few simple jobs in local, yarn-client and yarn-cluster mode
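> 
> (A trivial job along these lines is enough for this kind of smoke test; the
> object name and workload below are only illustrative, not the exact jobs
> referred to above.)
> 
>     import org.apache.spark.{SparkConf, SparkContext}
> 
>     // Minimal smoke test. Runs with a local master here; the same jar can be
>     // resubmitted through spark-submit with yarn-client / yarn-cluster masters.
>     object Rc10Smoke {
>       def main(args: Array[String]) {
>         val sc = new SparkContext(
>           new SparkConf().setAppName("rc10-smoke").setMaster("local"))
>         // Sum of the first 1000 integers, doubled: a simple job with no shuffle.
>         val sum = sc.parallelize(1 to 1000).map(_ * 2).reduce(_ + _)
>         println("sum = " + sum)
>         sc.stop()
>       }
>     }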
> 
> Haven't explicitly tested any of the recent fixes, streaming, or SQL.
> 
> 
> On Tue, May 20, 2014 at 1:13 PM, Tathagata Das
> <tathagata.das1...@gmail.com> wrote:
>> Please vote on releasing the following candidate as Apache Spark version 
>> 1.0.0!
>> 
>> This has a few bug fixes on top of rc9:
>> SPARK-1875: https://github.com/apache/spark/pull/824
>> SPARK-1876: https://github.com/apache/spark/pull/819
>> SPARK-1878: https://github.com/apache/spark/pull/822
>> SPARK-1879: https://github.com/apache/spark/pull/823
>> 
>> The tag to be voted on is v1.0.0-rc10 (commit d8070234):
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=d807023479ce10aec28ef3c1ab646ddefc2e663c
>> 
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10/
>> 
>> The release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/tdas.asc
>> 
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1018/
>> 
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10-docs/
>> 
>> The full list of changes in this release can be found at:
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=blob;f=CHANGES.txt;h=d21f0ace6326e099360975002797eb7cba9d5273;hb=d807023479ce10aec28ef3c1ab646ddefc2e663c
>> 
>> Please vote on releasing this package as Apache Spark 1.0.0!
>> 
>> The vote is open until Friday, May 23, at 20:00 UTC and passes if
>> a majority of at least 3 +1 PMC votes are cast.
>> 
>> [ ] +1 Release this package as Apache Spark 1.0.0
>> [ ] -1 Do not release this package because ...
>> 
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>> 
>> ====== API Changes ======
>> We encourage users to compile Spark applications against 1.0. There are
>> a few API changes in this release. Here are links to the associated
>> upgrade guides; user-facing changes have been kept as small as
>> possible.
>> 
>> Changes to ML vector specification:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10-docs/mllib-guide.html#from-09-to-10
>> 
>> Changes to the Java API:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
>> 
>> Changes to the streaming API:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10-docs/streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x
>> 
>> Changes to the GraphX API:
>> http://people.apache.org/~tdas/spark-1.0.0-rc10-docs/graphx-programming-guide.html#upgrade-guide-from-spark-091
>> 
>> Other changes:
>> coGroup and related functions now return Iterable[T] instead of Seq[T]
>> ==> Call toSeq on the result to restore the old behavior
>> 
>> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
>> ==> Call toSeq on the result to restore old behavior
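>> 
>> For example (the RDD contents and local master below are only illustrative;
>> pass your own driver class to jarOfClass in real code):
>> 
>>     import org.apache.spark.{SparkConf, SparkContext}
>>     import org.apache.spark.SparkContext._
>> 
>>     val sc = new SparkContext(
>>       new SparkConf().setAppName("migration-example").setMaster("local"))
>>     val rdd1 = sc.parallelize(Seq(("a", 1), ("b", 2)))
>>     val rdd2 = sc.parallelize(Seq(("a", "x"), ("b", "y")))
>> 
>>     // cogroup values are now Iterable; map them back to Seq where old code
>>     // still expects (Seq[V], Seq[W]) pairs
>>     val restored = rdd1.cogroup(rdd2).mapValues { case (vs, ws) =>
>>       (vs.toSeq, ws.toSeq)
>>     }
>> 
>>     // jarOfClass now returns Option[String]; toSeq restores the old Seq shape
>>     val jars: Seq[String] = SparkContext.jarOfClass(classOf[SparkContext]).toSeq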
> 
> 
> 
> -- 
> Marcelo
