+1 (non-binding)

On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
wrote:

> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>
> _____________________________
> From: Sean Owen <so...@cloudera.com>
> Sent: Thursday, September 14, 2017 3:12 PM
> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>
>
>
> +1
> Very nice. The sigs and hashes look fine; it builds fine for me on Debian
> Stretch with Java 8 and the yarn/hive/hadoop-2.7 profiles, and passes tests.
>
> Yes, as you say, there are no outstanding issues except for this one, which
> doesn't look critical, as it's not a regression.
>
> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>
>
> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.2. The vote is open until Friday, September 22nd at 18:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.1.2-rc1
>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (
>> 6f470323a0363656999dd36cb33f528afe627c12)
>>
>> A list of JIRA tickets resolved in this release can be found with this
>> filter:
>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>
>> The documentation corresponding to this release can be found at:
>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install the
>> current RC, and see if anything important breaks. In Java/Scala, you can
>> add the staging repository to your project's resolvers and test with the RC
>> (make sure to clean up the artifact cache before and after so you don't end
>> up building with an out-of-date RC going forward).
>>
>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Please retarget everything else to 2.1.3.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1. That being said, if
>> there is something that is a regression from 2.1.1 and has not been
>> correctly targeted, please ping a committer to help target the issue (you
>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>
>> )
>>
>> *What are the unresolved issues targeted for 2.1.2?*
>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>
>>
>> At the time of writing, there is one major issue still in progress,
>> SPARK-21985 <https://issues.apache.org/jira/browse/SPARK-21985>; I
>> believe Andrew Ray & Hyukjin Kwon are looking into this one.
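>>
>> For anyone who wants to poke at it, below is a rough sketch of the
>> "double-zipped RDD" pattern the ticket title describes. This is just my
>> reading of the title; the authoritative reproduction is in the JIRA
>> itself.
>>
>>     from pyspark import SparkContext
>>
>>     sc = SparkContext("local[2]", "spark-21985-check")
>>     rdd = sc.parallelize(range(100))
>>     zipped = rdd.zip(rdd)        # first zip: (i, i) pairs
>>     double = zipped.zip(zipped)  # zip of already-zipped RDDs
>>     # On an affected build the pair deserializer can mangle these;
>>     # compare against the expected ((i, i), (i, i)) tuples.
>>     print(double.take(3))
>>     sc.stop()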
>>
>> --
>> Twitter: https://twitter.com/holdenkarau
>>
>
>
>
