While you're at it, one thing that needs to be done is to create a 2.1.3
version in JIRA. I'm not sure if you have enough permissions to do that.

Fixes after an RC should use the new version, and if you create a new
RC, you'll need to go and backdate the patches that went into the new
RC.

On Mon, Sep 18, 2017 at 8:22 PM, Holden Karau <hol...@pigscanfly.ca> wrote:
> As per the conversation happening around the signing of releases, I'm
> cancelling this vote. If folks agree with the temporary solution there, I'll
> try to get a new RC out shortly, but if we end up blocking on migrating the
> Jenkins jobs it could take a bit longer.
>
> On Sun, Sep 17, 2017 at 1:30 AM, yuming wang <wgy...@gmail.com> wrote:
>>
>> Yes, it doesn't work in 2.1.0 and 2.1.1. I created a PR for this:
>> https://github.com/apache/spark/pull/19259.
>>
>>
>> On Sep 17, 2017, at 16:14, Sean Owen <so...@cloudera.com> wrote:
>>
>> So, didn't work in 2.1.0 or 2.1.1? If it's not a regression and not
>> critical, it shouldn't block a release. It seems like this can only affect
>> Docker and/or Oracle JDBC? Well, if we need to roll another release anyway,
>> seems OK.
>>
>> On Sun, Sep 17, 2017 at 6:06 AM Xiao Li <gatorsm...@gmail.com> wrote:
>>>
>>> This is a bug introduced in 2.1. It works fine in 2.0.
>>>
>>> 2017-09-16 16:15 GMT-07:00 Holden Karau <hol...@pigscanfly.ca>:
>>>>
>>>> Ok :) Was this working in 2.1.1?
>>>>
>>>> On Sat, Sep 16, 2017 at 3:59 PM Xiao Li <gatorsm...@gmail.com> wrote:
>>>>>
>>>>> Still -1
>>>>>
>>>>> Unable to pass the tests in my local environment. Opened a JIRA:
>>>>> https://issues.apache.org/jira/browse/SPARK-22041
>>>>>
>>>>> - SPARK-16625: General data types to be mapped to Oracle *** FAILED ***
>>>>>
>>>>>   types.apply(9).equals(org.apache.spark.sql.types.DateType) was false
>>>>> (OracleIntegrationSuite.scala:158)
>>>>>
>>>>> Xiao
>>>>>
>>>>>
>>>>> 2017-09-15 17:35 GMT-07:00 Ryan Blue <rb...@netflix.com.invalid>:
>>>>>>
>>>>>> -1 (with my Apache member hat on, non-binding)
>>>>>>
>>>>>> I'll continue discussion in the other thread, but I don't think we
>>>>>> should share signing keys.
>>>>>>
>>>>>> On Fri, Sep 15, 2017 at 5:14 PM, Holden Karau <hol...@pigscanfly.ca>
>>>>>> wrote:
>>>>>>>
>>>>>>> Indeed it's limited to people with login permissions on the Jenkins
>>>>>>> host (and perhaps further limited, I'm not certain). Shane probably
>>>>>>> knows more about the ACLs, so I'll ask him in the other thread for
>>>>>>> specifics.
>>>>>>>
>>>>>>> This is maybe branching a bit from the question of the current RC
>>>>>>> though, so I'd suggest we continue this discussion on the thread Sean 
>>>>>>> Owen
>>>>>>> made.
>>>>>>>
>>>>>>> On Fri, Sep 15, 2017 at 4:04 PM Ryan Blue <rb...@netflix.com> wrote:
>>>>>>>>
>>>>>>>> I'm not familiar with the release procedure; can you send a link to
>>>>>>>> this Jenkins job? Can anyone run this job, or is it limited to
>>>>>>>> committers?
>>>>>>>>
>>>>>>>> rb
>>>>>>>>
>>>>>>>> On Fri, Sep 15, 2017 at 12:28 PM, Holden Karau
>>>>>>>> <hol...@pigscanfly.ca> wrote:
>>>>>>>>>
>>>>>>>>> That's a good question. I built the release candidate; however, the
>>>>>>>>> Jenkins scripts don't take a parameter for configuring who signs the
>>>>>>>>> artifacts, and instead always sign them with Patrick's key. You can
>>>>>>>>> see this from previous releases, which were managed by other folks
>>>>>>>>> but still signed by Patrick.
>>>>>>>>>
>>>>>>>>> On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com>
>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> The signature is valid, but why was the release signed with
>>>>>>>>>> Patrick Wendell's private key? Did Patrick build the release 
>>>>>>>>>> candidate?
>>>>>>>>>>
>>>>>>>>>> rb
>>>>>>>>>>
>>>>>>>>>> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g....@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung
>>>>>>>>>>> <felixcheun...@hotmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>>>>>>>>>>
>>>>>>>>>>>> _____________________________
>>>>>>>>>>>> From: Sean Owen <so...@cloudera.com>
>>>>>>>>>>>> Sent: Thursday, September 14, 2017 3:12 PM
>>>>>>>>>>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>>>>>>>>>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> +1
>>>>>>>>>>>> Very nice. The sigs and hashes look fine, it builds fine for me
>>>>>>>>>>>> on Debian Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and 
>>>>>>>>>>>> passes
>>>>>>>>>>>> tests.
>>>>>>>>>>>>
>>>>>>>>>>>> Yes, as you say, no outstanding issues except for this one,
>>>>>>>>>>>> which doesn't look critical, as it's not a regression:
>>>>>>>>>>>>
>>>>>>>>>>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped
>>>>>>>>>>>> RDDs
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau
>>>>>>>>>>>> <hol...@pigscanfly.ca> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>> Please vote on releasing the following candidate as Apache
>>>>>>>>>>>>> Spark version 2.1.2. The vote is open until Friday September 22nd 
>>>>>>>>>>>>> at 18:00
>>>>>>>>>>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>>>>>>>>>>
>>>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>>>
>>>>>>>>>>>>> The tag to be voted on is v2.1.2-rc1
>>>>>>>>>>>>> (6f470323a0363656999dd36cb33f528afe627c12)
>>>>>>>>>>>>>
>>>>>>>>>>>>> List of JIRA tickets resolved in this release can be found with
>>>>>>>>>>>>> this filter.
>>>>>>>>>>>>>
>>>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>>>> found at:
>>>>>>>>>>>>>
>>>>>>>>>>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>>>>>>>>>>
>>>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>>>>>>>>>
>>>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>>>>
>>>>>>>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>>>>>>>>>>
>>>>>>>>>>>>> The documentation corresponding to this release can be found
>>>>>>>>>>>>> at:
>>>>>>>>>>>>>
>>>>>>>>>>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> FAQ
>>>>>>>>>>>>>
>>>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>>>>
>>>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>>>> taking an existing Spark workload and running it on this release
>>>>>>>>>>>>> candidate, then reporting any regressions.
>>>>>>>>>>>>>
>>>>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>>>>> install the current RC and see if anything important breaks. In
>>>>>>>>>>>>> Java/Scala you can add the staging repository to your project's
>>>>>>>>>>>>> resolvers and test with the RC (make sure to clean up the
>>>>>>>>>>>>> artifact cache before/after so you don't end up building with an
>>>>>>>>>>>>> out-of-date RC going forward).
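
[Editor's note: the PySpark testing workflow above can be sketched as a
short shell session. The venv path and tarball location below are
examples only, not the actual RC artifact paths from this thread.]

```shell
# Create an isolated environment so the RC doesn't touch your system
# Python packages (path is an example).
python3 -m venv /tmp/spark-rc-test
. /tmp/spark-rc-test/bin/activate

# Install the RC's pyspark tarball, downloaded from the staging area,
# into the venv. The path below is hypothetical:
# pip install /path/to/pyspark-2.1.2.tar.gz

# Then run your existing workload against it, e.g.:
# python my_spark_job.py

echo "venv ready: $VIRTUAL_ENV"
```

Deactivating (or just deleting /tmp/spark-rc-test) afterwards keeps the
RC from leaking into later builds, in the same spirit as cleaning the
Java/Scala artifact cache mentioned above.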
>>>>>>>>>>>>>
>>>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.1.2?
>>>>>>>>>>>>>
>>>>>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>>>>>> bug fixes, documentation, and API tweaks that impact 
>>>>>>>>>>>>> compatibility should be
>>>>>>>>>>>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>>>>>>>>>>>
>>>>>>>>>>>>> But my bug isn't fixed!??!
>>>>>>>>>>>>>
>>>>>>>>>>>>> In order to make timely releases, we will typically not hold
>>>>>>>>>>>>> the release unless the bug in question is a regression from
>>>>>>>>>>>>> 2.1.1. That being said, if there is something which is a
>>>>>>>>>>>>> regression from 2.1.1 that has not been correctly targeted,
>>>>>>>>>>>>> please ping a committer to help target the issue (you can see
>>>>>>>>>>>>> the open issues listed as impacting Spark 2.1.1 & 2.1.2).
>>>>>>>>>>>>>
>>>>>>>>>>>>> What are the unresolved issues targeted for 2.1.2?
>>>>>>>>>>>>>
>>>>>>>>>>>>> At the time of this writing, there is one in-progress major
>>>>>>>>>>>>> issue, SPARK-21985; I believe Andrew Ray & HyukjinKwon are
>>>>>>>>>>>>> looking into it.
>>>>>>>>>>>>>
>>>>>>>>>>>>> --
>>>>>>>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Ryan Blue
>>>>>>>>>> Software Engineer
>>>>>>>>>> Netflix
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Ryan Blue
>>>>>>>> Software Engineer
>>>>>>>> Netflix
>>>>>>>
>>>>>>> --
>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Ryan Blue
>>>>>> Software Engineer
>>>>>> Netflix
>>>>>
>>>>>
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>>
>>>
>>
>
>
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
