All of those look like QA or documentation, which I don't think need to
block testing on an RC (and which, in fact, probably need an RC to test).
Joseph, please correct me if I'm wrong. It is unlikely this first RC is
going to pass, but I wanted to get the ball rolling on testing 2.2.

On Thu, Apr 27, 2017 at 1:45 PM, Sean Owen <so...@cloudera.com> wrote:

> These are still blockers for 2.2:
>
> SPARK-20501 ML, Graph 2.2 QA: API: New Scala APIs, docs
> SPARK-20504 ML 2.2 QA: API: Java compatibility, docs
> SPARK-20503 ML 2.2 QA: API: Python API coverage
> SPARK-20502 ML, Graph 2.2 QA: API: Experimental, DeveloperApi, final,
> sealed audit
> SPARK-20500 ML, Graph 2.2 QA: API: Binary incompatible changes
> SPARK-18813 MLlib 2.2 Roadmap
>
> Joseph, you opened most of these just now. Is this an "RC0" we know won't
> pass? Or wouldn't we normally cut an RC after those things are ready?
>
> On Thu, Apr 27, 2017 at 7:31 PM Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.2.0. The vote is open until Tues, May 2nd, 2017 at 12:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.2.0
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.2.0-rc1
>> <https://github.com/apache/spark/tree/v2.2.0-rc1> (8ccb4a57c82146c1a8f8966c7e64010cf5632cb6)
>>
>> The list of JIRA tickets resolved can be found with this filter:
>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.1>
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://home.apache.org/~pwendell/spark-releases/spark-2.2.0-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1235/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.2.0-rc1-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
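>>
>> As a minimal sketch (assuming an sbt-based project; the resolver name and
>> module choice below are illustrative), one way to do this is to add the
>> staging repository above to your build and compile your workload against
>> the staged 2.2.0 artifacts:
>>
>>   // build.sbt (hypothetical test project)
>>   // Pull the staged 2.2.0 artifacts from the Nexus staging repository above.
>>   resolvers += "spark-2.2.0-rc1-staging" at
>>     "https://repository.apache.org/content/repositories/orgapachespark-1235/"
>>
>>   // spark-sql pulls in spark-core; swap in whichever modules your workload uses.
>>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
>>
>> Alternatively, the binary packages linked above can be run directly with
>> spark-shell or spark-submit.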
>>
>> *What should happen to JIRA tickets still targeting 2.2.0?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Please retarget everything else to 2.3.0 or 2.2.1.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1.
>>
>
