All the signatures are correct and the licensing looks fine. The
source builds without problems.

Now, let me ask about unit tests, since I've taken a more detailed
look -- which I should have done before.


dev/run-tests fails two tests (one Hive, one Kafka Streaming) for me
locally on 1.1.0-rc3. Does anyone else see that? It may be my
environment, although I still see the Hive failure on Debian too:

[info] - SET commands semantics for a HiveContext *** FAILED ***
[info]   Expected Array("spark.sql.key.usedfortestonly=test.val.0",
"spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0"),
but got 
Array("spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0",
"spark.sql.key.usedfortestonly=test.val.0") (HiveQuerySuite.scala:541)


Python lint checks fail for files under python/build/py4j. These
aren't Spark files; they only appear in this location in the release.
The check should simply be updated later to ignore that directory. Not
a blocker.
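
For what it's worth, here's a hypothetical sketch of the kind of
exclusion I mean (this is not how the lint script is actually
implemented, and the function name is made up): collect .py files but
skip anything under python/build, where the bundled py4j sources land
in the release:

import os

def lintable_python_files(root="python"):
    # Skip everything under python/build: the bundled py4j files there
    # aren't Spark sources and shouldn't be linted.
    skip = os.path.join(root, "build")
    for dirpath, _, filenames in os.walk(root):
        if dirpath == skip or dirpath.startswith(skip + os.sep):
            continue
        for name in filenames:
            if name.endswith(".py"):
                yield os.path.join(dirpath, name)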


Evidently, the SBT tests usually pass in master:
https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/SparkPullRequestBuilder/
But the Maven tests have not passed in master for a long time:
https://amplab.cs.berkeley.edu/jenkins/view/Spark/

I can reproduce this with Maven on 1.1.0-rc3. It feels funny to ship
with a reproducible Maven test failure, since Maven is the build of
record for releases. Whatever is being tested is probably OK, since
SBT passes, so it need not block the release. I'll look for a fix as
well.

A simple "sbt test" always fails for me, and that just may be because
the build is now only meaningful with further configuration. SBT tests
are mostly passing if not consistently for all profiles:
https://amplab.cs.berkeley.edu/jenkins/view/Spark/  These also sort of
feel funny, although nothing seems like an outright blocker.

I guess I'll add a non-binding +0 -- none of these is necessarily a
blocker on its own, but together they add up to feeling a bit iffy
about the state of the tests in the context of a release.

On Sat, Aug 30, 2014 at 11:07 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 
> 1.1.0!
>
> The tag to be voted on is v1.1.0-rc3 (commit b2d0493b):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=b2d0493b223c5f98a593bb6d7372706cc02bebad
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-1.1.0-rc3/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1030/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-1.1.0-rc3-docs/
>
> Please vote on releasing this package as Apache Spark 1.1.0!
>
> The vote is open until Tuesday, September 02, at 23:07 UTC and passes if
> a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.1.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == Regressions fixed since RC1 ==
> - Build issue for SQL support: 
> https://issues.apache.org/jira/browse/SPARK-3234
> - EC2 script version bump to 1.1.0.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening very late into the QA period compared with
> previous votes, so -1 votes should only occur for significant
> regressions from 1.0.2. Bugs already present in 1.0.X will not block
> this release.
>
> == What default changes should I be aware of? ==
> 1. The default value of "spark.io.compression.codec" is now "snappy"
> --> Old behavior can be restored by switching to "lzf"
>
> 2. PySpark now performs external spilling during aggregations.
> --> Old behavior can be restored by setting "spark.shuffle.spill" to "false".
>

