Xiao,

This is the third time this has happened in this release cycle.
Sorry to single you guys out, but can you please do three things:

- do not merge things into 2.3 that you're not absolutely sure about
- make sure that things you backport to 2.3 do not cause problems
- let the RM know about these things as soon as you discover them, not
  when they send the next RC for voting.

Had I known sooner, even though I was in the middle of preparing the
RC, I could easily have aborted it and skipped this whole thread.

This vote is canceled. I'll prepare a new RC right away. I hope this
does not happen again.


On Fri, Jun 1, 2018 at 1:20 PM, Xiao Li <gatorsm...@gmail.com> wrote:
> Sorry, I need to say -1
>
> This morning, just found a regression in 2.3.1 and reverted
> https://github.com/apache/spark/pull/21443
>
> Xiao
>
> 2018-06-01 13:09 GMT-07:00 Marcelo Vanzin <van...@cloudera.com>:
>>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.3.1.
>>
>> Given that I expect at least a few people to be busy with Spark Summit
>> next
>> week, I'm taking the liberty of setting an extended voting period. The
>> vote
>> will be open until Friday, June 8th, at 19:00 UTC (that's 12:00 PDT).
>>
>> The vote passes with a majority of +1 votes, which must include at
>> least 3 +1 votes from the PMC.
>>
>> [ ] +1 Release this package as Apache Spark 2.3.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.3.1-rc3 (commit 1cc5f68b):
>> https://github.com/apache/spark/tree/v2.3.1-rc3
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.1-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1271/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.1-rc3-docs/
>>
>> The list of bug fixes going into 2.3.1 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12342432
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate,
>> then reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. For Java/Scala,
>> you can add the staging repository to your project's resolvers and
>> test with the RC (make sure to clean up the artifact cache
>> before/after so you don't end up building with an out-of-date RC
>> going forward). A sketch for the Java/Scala path follows below.
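>>
>> For the Java/Scala path, a minimal sbt sketch (the module and the
>> dependency version below are assumptions; adjust to what your project
>> actually uses):
>>
>>   // build.sbt: point the build at the staging repository from this
>>   // email and depend on the RC version of the Spark modules you use.
>>   resolvers += "Spark 2.3.1 RC staging" at
>>     "https://repository.apache.org/content/repositories/orgapachespark-1271/"
>>
>>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"
>>
>> After testing, remember to drop the resolver and clear the cached
>> 2.3.1 artifacts from your local ivy/maven cache so you don't keep
>> building against the RC.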
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 2.3.1?
>> ===========================================
>>
>> The current list of open tickets targeted at 2.3.1 can be found at:
>> https://s.apache.org/Q3Uo
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Please retarget everything else to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is a regression that has not been
>> correctly targeted, please ping me or a committer to help target the
>> issue.
>>
>>
>> --
>> Marcelo
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
