+1

On Fri, Apr 28, 2017 at 9:17 AM Kazuaki Ishizaki <ishiz...@jp.ibm.com> wrote:
> +1 (non-binding)
>
> I tested it on Ubuntu 16.04 and OpenJDK 8 on ppc64le. All of the tests for
> core have passed.
>
> $ java -version
> openjdk version "1.8.0_111"
> OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-2ubuntu0.16.04.2-b14)
> OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
> $ build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 package install
> $ build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core
> ...
> Total number of tests run: 1788
> Suites: completed 198, aborted 0
> Tests: succeeded 1788, failed 0, canceled 4, ignored 8, pending 0
> All tests passed.
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 16:30 min
> [INFO] Finished at: 2017-04-29T01:02:29+09:00
> [INFO] Final Memory: 54M/576M
> [INFO] ------------------------------------------------------------------------
>
> Regards,
> Kazuaki Ishizaki
>
>
> From: Michael Armbrust <mich...@databricks.com>
> To: "dev@spark.apache.org" <dev@spark.apache.org>
> Date: 2017/04/27 09:30
> Subject: [VOTE] Apache Spark 2.1.1 (RC4)
> ------------------------------
>
> Please vote on releasing the following candidate as Apache Spark version
> 2.1.1. The vote is open until Sat, April 29th, 2017 at 18:00 PST and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.1.1
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is *v2.1.1-rc4*
> <https://github.com/apache/spark/tree/v2.1.1-rc4>
> (267aca5bd5042303a718d10635bc0d1a1596853f)
>
> The list of JIRA tickets resolved can be found with this filter:
> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.1>
>
> The release files, including signatures, digests, etc. can be found at:
> http://home.apache.org/~pwendell/spark-releases/spark-2.1.1-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1232/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.1.1-rc4-docs/
>
>
> *FAQ*
>
> *How can I help test this release?*
>
> If you are a Spark user, you can help us test this release by taking an
> existing Spark workload, running it on this release candidate, and
> reporting any regressions.
>
> *What should happen to JIRA tickets still targeting 2.1.1?*
>
> Committers should look at those and triage. Extremely important bug fixes,
> documentation, and API tweaks that impact compatibility should be worked on
> immediately. Everything else, please retarget to 2.1.2 or 2.2.0.
>
> *But my bug isn't fixed!??!*
>
> In order to make timely releases, we will typically not hold the release
> unless the bug in question is a regression from 2.1.0.
>
> *What happened to RC1?*
>
> There were issues with the release packaging, and as a result it was skipped.
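
For anyone else who wants to exercise the RC beyond the unit suites, below is a
minimal smoke-test sketch in Scala. It assumes the unpacked RC4 binaries (or
artifacts resolved from the orgapachespark-1232 staging repository) are on the
classpath, e.g. by pasting the body of main into bin/spark-shell from the RC
distribution. The object name and the local[*] master are only illustrative;
they are not part of the release.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.sum

    // Hypothetical smoke test for the 2.1.1 RC4 artifacts; names are illustrative.
    object Rc4SmokeTest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("spark-2.1.1-rc4-smoke-test")
          .master("local[*]")  // assumption: local run; drop when submitting to a cluster
          .getOrCreate()

        // Trivial aggregation: checks that the session starts, jobs run,
        // and the DataFrame API still behaves as it did on 2.1.0.
        val df = spark.range(0, 1000).toDF("id")
        val total = df.agg(sum("id")).head().getLong(0)
        assert(total == 499500L, s"unexpected sum: $total")

        println(s"Sum of 0..999 = $total -- RC4 smoke test passed")
        spark.stop()
      }
    }

Any regressions found this way are worth reporting on this thread or filing
against 2.1.1 in JIRA, per the FAQ above.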