Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-07 Thread Holden Karau
This vote passes.  Thanks to everyone for testing and helping work out the
challenges with a different release flow & a new RM :)

Since I'm only a committer, not a PMC member, there are some additional
steps involved in making the release available, so thanks in advance for
your patience.

+1:
Sean Owen (binding)
Herman van Hövell tot Westerflier (binding)
Dongjoon Hyun
Kazuaki Ishizaki
Wenchen Fan (binding)
Ryan Blue
Liwei Lin
Felix Cheung
Denny Lee
Reynold Xin (binding)
Hyukjin Kwon
vaquar khan
Nick Pentreath (binding)
Ricardo Almeida
DB Tsai (binding)

0:
None

-1:
None

P.S.

Special thanks to Felix for catching and helping debug the R packaging
issue :)

On Sat, Oct 7, 2017 at 2:02 PM, Ricardo Almeida <ricardo.alme...@actnowib.com> wrote:

> +1 (non-binding)
>
> Built and tested on
>
>- macOS 10.12.5 Java 8 (build 1.8.0_131)
>- Ubuntu 17.04, Java 8 (OpenJDK 1.8.0_111)
>
>
> On 3 October 2017 at 08:24, Holden Karau  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.1.2-rc4
>> <https://github.com/apache/spark/tree/v2.1.2-rc4> (2abaea9e40fce81cd4626498e0f5c28a70917499)
>>
>> List of JIRA tickets resolved in this release can be found with this
>> filter.
>> <https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>
>> Release artifacts are signed with a key from:
>> https://people.apache.org/~holden/holdens_keys.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>
>> The documentation corresponding to this release can be found at:
>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install the
>> current RC and see if anything important breaks; in Java/Scala you can
>> add the staging repository to your project's resolvers and test with the
>> RC (make sure to clean up the artifact cache before/after so you don't
>> end up building with an out-of-date RC going forward).
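A minimal sketch of the PySpark check described above. The staging URL comes from this thread, but the exact tarball name and the workload script are assumptions; only the virtual-env mechanics are exercised here.

```shell
# Hedged sketch: test the RC in an isolated virtual env.
# The install line is an assumption about the layout of
# https://home.apache.org/~holden/spark-2.1.2-rc4-bin/ — check the
# directory listing for the real artifact name.
python3 -m venv rc4-test-env
. rc4-test-env/bin/activate
# pip install "https://home.apache.org/~holden/spark-2.1.2-rc4-bin/pyspark-2.1.2.tar.gz"
# python my_existing_workload.py   # rerun a real workload, watch for regressions
python -c "import sys; print(sys.prefix)"   # confirm the env is active
deactivate
```

For Java/Scala, the analogous step is adding https://repository.apache.org/content/repositories/orgapachespark-1252 to your project's resolvers (e.g. an sbt `resolvers +=` entry), building against 2.1.2, and clearing the artifact cache afterwards.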
>>
>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.3.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1. That being said,
>> if there is something which is a regression from 2.1.1 that has not been
>> correctly targeted, please ping a committer to help target the issue (you
>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>
>> )
>>
>> *What are the unresolved issues targeted for 2.1.2?*
>>
>> At this time there are no open unresolved issues.
>>
>> *Is there anything different about this release?*
>>
>> This is the first release in a while not built on the AMPLAB Jenkins. This
>> is good because it means future releases can more easily be built and
>> signed securely (and I've been updating the documentation in
>> https://github.com/apache/spark-website/pull/66 as I progress), however
>> the chances of a mistake are higher with any change like this. If there is
>> something you normally take for granted as correct when checking a release,
>> please double check this time :)
>>
>> *Should I be committing code to branch-2.1?*
>>
>> Thanks for asking! Please treat this stage in the RC process as "code
>> freeze", so bug fixes only. If you're uncertain whether something should be
>> backported, please reach out. If you do commit to branch-2.1, please tag your
>> JIRA issue's fix version as 2.1.3, and if we cut another RC I'll move the
>> issues fixed in 2.1.3 into 2.1.2 as appropriate.
>>
>> *What happened to RC3?*
>>
>> Some R+zinc interactions kept it from getting out the door.

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-07 Thread Ricardo Almeida
+1 (non-binding)

Built and tested on

   - macOS 10.12.5 Java 8 (build 1.8.0_131)
   - Ubuntu 17.04, Java 8 (OpenJDK 1.8.0_111)




Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-06 Thread DB Tsai
+1

Sincerely,

DB Tsai
--
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-06 Thread Felix Cheung
Thanks Nick, Hyukjin. Yes, this seems to be a longer-standing issue on RHEL
with respect to forking.



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-06 Thread Nick Pentreath
Ah yes - I recall that it was fixed; I forgot it was for 2.3.0.

My +1 vote stands.

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-06 Thread Hyukjin Kwon
Hi Nick,

I believe that R test failure is due to SPARK-21093; at least the error
message looks the same, and it is fixed as of 2.3.0. The fix was not
backported because the reviewers and I were worried, as it touched a very
core part of SparkR (it was even reverted once after a very close look by
some reviewers).

I asked Michael to note this as a known issue in
https://spark.apache.org/releases/spark-release-2-2-0.html#known-issues
for this reason.
It should be fine, though we should probably note it if possible. This
should not be a regression anyway since, if I understood correctly, it has
been there from the very beginning.

Thanks.





Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-06 Thread Nick Pentreath
Checked sigs & hashes.
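For anyone repeating this check, a sketch follows. The gpg lines and artifact names are assumptions about the staging directory's contents, so the runnable part demonstrates the digest mechanics on a locally created stand-in file instead:

```shell
# Signature check (sketch; key file named in the vote email):
#   gpg --import holdens_keys.asc
#   gpg --verify spark-2.1.2-bin-hadoop2.7.tgz.asc spark-2.1.2-bin-hadoop2.7.tgz
# Digest-check mechanics, demonstrated offline on a stand-in file:
printf 'stand-in artifact\n' > artifact.tgz
sha512sum artifact.tgz > artifact.tgz.sha512
sha512sum -c artifact.tgz.sha512   # prints "artifact.tgz: OK" on a match
```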

Tested on RHEL
build/mvn -Phadoop-2.7 -Phive -Pyarn test passed
Python tests passed

I ran R tests and am getting some failures:
https://gist.github.com/MLnick/ddf4d531d5125208771beee0cc9c697e (I seem to
recall similar issues on a previous release but I thought it was fixed).

I re-ran R tests on an Ubuntu box to double check and they passed there.

So I'd still +1 the release

Perhaps someone can take a look at the R failures on RHEL just in case
though.



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-05 Thread vaquar khan
+1 (non-binding). Tested on Ubuntu; all test cases passed.

Regards,
Vaquar khan

On Thu, Oct 5, 2017 at 10:46 PM, Hyukjin Kwon  wrote:

> +1 too.
>
>
> On 6 Oct 2017 10:49 am, "Reynold Xin"  wrote:
>
> +1
>
>
> On Mon, Oct 2, 2017 at 11:24 PM, Holden Karau 
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.1.2-rc4
>>  (2abaea9e40fce81
>> cd4626498e0f5c28a70917499)
>>
>> List of JIRA tickets resolved in this release can be found with this
>> filter.
>> 
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>
>> Release artifacts are signed with a key from:
>> https://people.apache.org/~holden/holdens_keys.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>
>> The documentation corresponding to this release can be found at:
>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install the
>> current RC and see if anything important breaks, in the Java/Scala you
>> can add the staging repository to your projects resolvers and test with the
>> RC (make sure to clean up the artifact cache before/after so you don't
>> end up building with a out of date RC going forward).
>>
>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.3.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1. That being said,
>> if there is something which is a regression from 2.1.1 that has not been
>> correctly targeted, please ping a committer to help target the issue (you
>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2)
>>
>> *What are the unresolved issues targeted for 2.1.2?*
>>
>> At this time there are no open unresolved issues.
>>
>> *Is there anything different about this release?*
>>
>> This is the first release in a while not built on the AMPLab Jenkins. This
>> is good because it means future releases can more easily be built and
>> signed securely (and I've been updating the documentation in
>> https://github.com/apache/spark-website/pull/66 as I progress); however,
>> the chances of a mistake are higher with any change like this. If there is
>> something you normally take for granted as correct when checking a release,
>> please double check this time :)
>>
>> *Should I be committing code to branch-2.1?*
>>
>> Thanks for asking! Please treat this stage in the RC process as "code
>> freeze", so bug fixes only. If you're uncertain whether something should be
>> backported, please reach out. If you do commit to branch-2.1, please tag your
>> JIRA issue's fix version as 2.1.3, and if we cut another RC I'll move the 2.1.3
>> fixes into 2.1.2 as appropriate.
>>
>> *What happened to RC3?*
>>
>> Some R+zinc interactions kept it from getting out the door.
>> --
>> Twitter: https://twitter.com/holdenkarau
>>
>
>
>


-- 
Regards,
Vaquar Khan
+1 -224-436-0783
Greater Chicago


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-05 Thread Hyukjin Kwon
+1 too.

On 6 Oct 2017 10:49 am, "Reynold Xin"  wrote:

+1




Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-05 Thread Reynold Xin
+1




Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-05 Thread Denny Lee
+1 (non-binding)



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-05 Thread Holden Karau
Awesome, thanks for digging into the packaging on the R side in more
detail. I'll look into how to update the keys file as well.


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-04 Thread Felix Cheung
+1

Tested SparkR package manually on multiple platforms and checked different 
Hadoop release jar.

And previously tested the last RC on different R releases (see the last RC vote 
thread)

I found some differences in bin release jars created by the different options 
when running the make-release script, created this JIRA to track
https://issues.apache.org/jira/browse/SPARK-22202

I've checked to confirm these exist in 2.1.1 release so this isn't a 
regression, and hence my +1.

btw, I think we need to update this file for the new keys used in signing this 
release https://www.apache.org/dist/spark/KEYS



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-04 Thread Liwei Lin
+1 (non-binding)


Cheers,
Liwei



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-04 Thread Nick Pentreath
Ah right! Was using a new cloud instance and didn't realize I was logged in
as root! thanks



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Marcelo Vanzin
Maybe you're running as root (or the admin account on your OS)?

On Tue, Oct 3, 2017 at 12:12 PM, Nick Pentreath
 wrote:
> Hmm I'm consistently getting this error in core tests:
>
> - SPARK-3697: ignore directories that cannot be read. *** FAILED ***
>   2 was not equal to 1 (FsHistoryProviderSuite.scala:146)
>
>
> Anyone else? Any insight? Perhaps it's my set up.
>
>>>
>>>
>>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:

 Please vote on releasing the following candidate as Apache Spark version
 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and passes 
 if
 a majority of at least 3 +1 PMC votes are cast.

 [ ] +1 Release this package as Apache Spark 2.1.2
 [ ] -1 Do not release this package because ...


 To learn more about Apache Spark, please see https://spark.apache.org/

 The tag to be voted on is v2.1.2-rc4
 (2abaea9e40fce81cd4626498e0f5c28a70917499)

 List of JIRA tickets resolved in this release can be found with this
 filter.

 The release files, including signatures, digests, etc. can be found at:
 https://home.apache.org/~holden/spark-2.1.2-rc4-bin/

 Release artifacts are signed with a key from:
 https://people.apache.org/~holden/holdens_keys.asc

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1252

 The documentation corresponding to this release can be found at:
 https://people.apache.org/~holden/spark-2.1.2-rc4-docs/


 FAQ

 How can I help test this release?

 If you are a Spark user, you can help us test this release by taking an
 existing Spark workload and running on this release candidate, then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and install
 the current RC and see if anything important breaks, in the Java/Scala you
 can add the staging repository to your projects resolvers and test with the
 RC (make sure to clean up the artifact cache before/after so you don't end
 up building with a out of date RC going forward).

 What should happen to JIRA tickets still targeting 2.1.2?

 Committers should look at those and triage. Extremely important bug
 fixes, documentation, and API tweaks that impact compatibility should be
 worked on immediately. Everything else please retarget to 2.1.3.

 But my bug isn't fixed!??!

 In order to make timely releases, we will typically not hold the release
 unless the bug in question is a regression from 2.1.1. That being said if
 there is something which is a regression form 2.1.1 that has not been
 correctly targeted please ping a committer to help target the issue (you 
 can
 see the open issues listed as impacting Spark 2.1.1 & 2.1.2)

 What are the unresolved issues targeted for 2.1.2?

 At this time there are no open unresolved issues.

 Is there anything different about this release?

 This is the first release in a while not built on the AMPLab Jenkins.
 This is good because it means future releases can more easily be built and
 signed securely (and I've been updating the documentation in
 https://github.com/apache/spark-website/pull/66 as I progress); however, the
 chances of a mistake are higher with any change like this. If there is
 something you normally take for granted as correct when checking a release,
 please double-check this time :)

 Should I be committing code to branch-2.1?

 Thanks for asking! Please treat this stage in the RC process as a "code
 freeze", so bug fixes only. If you're uncertain whether something should be
 backported, please reach out. If you do commit to branch-2.1, please tag
 your JIRA issue's fix version as 2.1.3, and if we cut another RC I'll move
 the 2.1.3 fixes into 2.1.2 as appropriate.

 What happened to RC3?

 Some R+zinc interactions kept it from getting out the door.
 --
 Twitter: https://twitter.com/holdenkarau
>>
>>
>



-- 
Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Nick Pentreath
Hmm I'm consistently getting this error in core tests:

- SPARK-3697: ignore directories that cannot be read. *** FAILED ***
  2 was not equal to 1 (FsHistoryProviderSuite.scala:146)


Anyone else? Any insight? Perhaps it's my setup.


>>
>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Ryan Blue
+1

Verified checksums and signatures for the archives in home.apache.org, spot
checked the same for artifacts in Nexus.
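The digest half of that check can be sketched roughly as below. A stand-in file is generated locally so the pattern is reproducible offline; for the real check, download the tarball and its `.sha512` file from the rc4-bin directory first (the artifact names here are illustrative):

```shell
# Recompute SHA-512 for an artifact and compare it against the published
# digest file shipped alongside it. We create both locally as stand-ins
# for the downloaded tarball and its .sha512 file.
printf 'stand-in release bytes' > spark-2.1.2-rc4.tgz
sha512sum spark-2.1.2-rc4.tgz > spark-2.1.2-rc4.tgz.sha512

# A match prints "spark-2.1.2-rc4.tgz: OK"; any mismatch is a reason to -1.
sha512sum -c spark-2.1.2-rc4.tgz.sha512

# Signature check (commented out: it needs the real .asc files and the
# key file linked from the vote email):
# gpg --import holdens_keys.asc
# gpg --verify spark-2.1.2-rc4.tgz.asc spark-2.1.2-rc4.tgz
```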

On Tue, Oct 3, 2017 at 8:06 AM, Wenchen Fan <cloud0...@gmail.com> wrote:

> +1

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Wenchen Fan
+1

On Tue, Oct 3, 2017 at 11:00 PM, Kazuaki Ishizaki <ishiz...@jp.ibm.com>
wrote:

> +1 (non-binding)

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Kazuaki Ishizaki
+1 (non-binding)

I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests for 
core/sql-core/sql-catalyst/mllib/mllib-local have passed.

$ java -version
openjdk version "1.8.0_131"
OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)

% build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T 24 clean package install
% build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
...
Run completed in 12 minutes, 19 seconds.
Total number of tests run: 1035
Suites: completed 166, aborted 0
Tests: succeeded 1035, failed 0, canceled 0, ignored 5, pending 0
All tests passed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Core ................................. SUCCESS [17:13 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [  5.759 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [09:48 min]
[INFO] Spark Project SQL .................................. SUCCESS [12:01 min]
[INFO] Spark Project ML Library ........................... SUCCESS [15:16 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 54:28 min
[INFO] Finished at: 2017-10-03T23:53:33+09:00
[INFO] Final Memory: 112M/322M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hive" could not be activated because it does not exist.

Kazuaki Ishizaki





Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Dongjoon Hyun
+1 (non-binding)

Dongjoon.

On Tue, Oct 3, 2017 at 5:13 AM, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:

> +1


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Herman van Hövell tot Westerflier
+1

On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen  wrote:

> +1 same as last RC. Tests pass, sigs and hashes are OK.


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Sean Owen
+1 same as last RC. Tests pass, sigs and hashes are OK.

On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:
