Re: [vote] Apache Spark 3.0 RC3

2020-06-17 Thread Reynold Xin
Hopefully today! I think we can get the release officially out tomorrow. For 
those of you who don't know, it's already on Maven. Please take a final look 
at the release notes, as I will turn them into a release page today.

On Wed, Jun 17, 2020 at 11:31 AM, Tom Graves < tgraves...@yahoo.com > wrote:

> 
> Reynold,
> 
> 
> What's the plan for pushing the official release binaries and source tarball?
> It would be nice to have the release artifacts now that the release is available
> on Maven.
> 
> 
> thanks,
> Tom
> 
> 
> On Monday, June 15, 2020, 01:52:12 PM CDT, Reynold Xin < r...@databricks.com > wrote:
> 
> 
> 
> 
> Thanks for the reminder, Dongjoon.
> 
> 
> 
> I created the official release tag over the past weekend and have been working on
> the release notes (a lot of interesting changes!). I've created a Google Doc
> so it's easier for everybody to comment on things I've missed:
> https://docs.google.com/document/d/1NrTqxf2f39AXDF8VTIch6kwD8VKPaIlLW1QvuqEcwR4/edit
> 
> 
> 
> Plan to publish to maven et al today or tomorrow and give a day or two for
> dev@ to comment on the release notes before finalizing.
> 
> 
> 
> PS: There are two critical problems I've seen with the release (the Spark UI
> is virtually unusable in some cases, and there are streaming issues). I will
> highlight them in the release notes and link to the JIRA tickets. But I
> think we should release 3.0.1 ASAP as a follow-up.
> 
> 
> 
> 
> 
> 
> On Sun, Jun 14, 2020 at 11:46 AM, Dongjoon Hyun < dongjoon.h...@gmail.com > wrote:
> 
>> Hi, Reynold.
>> 
>> 
>> Is there any progress on 3.0.0 release since the vote was finalized 5 days
>> ago?
>> 
>> 
>> Apparently, tag `v3.0.0` is not created yet, the binary and docs are still
>> sitting on the voting location, Maven Central doesn't have it, and
>> PySpark/SparkR uploading is not started yet.
>> 
>> 
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>> 
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>> 
>> 
>> 
>> Just as Apache Spark 2.0.1 had 316 fixes after 2.0.0, we already have 35
>> patches on top of `v3.0.0-rc3` and are expecting more.
>> 
>> 
>> Although we can release Apache Spark 3.0.1 very soon, before Spark+AI Summit,
>> Apache Spark 3.0.0 should be available in the Apache Spark distribution
>> channel because it passed the vote.
>> 
>> 
>> 
>> The Apache Spark 3.0.0 release itself helps the community use the 3.0-line
>> codebase and keeps the codebase healthy.
>> 
>> 
>> Please let us know if you need any help from the community for 3.0.0
>> release.
>> 
>> 
>> Thanks,
>> Dongjoon.
>> 
>> 
>> 
>> On Tue, Jun 9, 2020 at 9:41 PM Matei Zaharia < matei.zaha...@gmail.com > wrote:
>> 
>> 
>>> Congrats! Excited to see the release posted soon.
>>> 
>>> 
 On Jun 9, 2020, at 6:39 PM, Reynold Xin < r...@databricks.com > wrote:
 
 
>>> 
>>> 
>>> 
 
 I waited another day to account for the weekend. This vote passes with the
 following +1 votes and no -1 votes!
 
 
 
 I'll start the release prep later this week.
 
 
 
 +1:
 
 Reynold Xin (binding)
 
 Prashant Sharma (binding)
 
 Gengliang Wang
 
 Sean Owen (binding)
 
 Mridul Muralidharan (binding)
 
 Takeshi Yamamuro
 
 Maxim Gekk
 
 Matei Zaharia (binding)
 
 Jungtaek Lim
 
 Denny Lee
 
 Russell Spitzer
 
 Dongjoon Hyun (binding)
 
 DB Tsai (binding)
 
 Michael Armbrust (binding)
 
 Tom Graves (binding)
 
 Bryan Cutler
 
 Huaxin Gao
 
 Jiaxin Shan
 
 Xingbo Jiang
 
 Xiao Li (binding)
 
 Hyukjin Kwon (binding)
 
 Kent Yao
 
 Wenchen Fan (binding)
 
 Shixiong Zhu (binding)
 
 Burak Yavuz
 
 Tathagata Das (binding)
 
 Ryan Blue
 
 
 
 -1: None
 
 
 
 
 
 
 On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin < r...@databricks.com > wrote:
 
> Please vote on releasing the following candidate as Apache Spark version
> 3.0.0.
> 
> 
> 
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
> cast, with a minimum of 3 +1 votes.
> 
> 
> 
> [ ] +1 Release this package as Apache Spark 3.0.0
> 
> [ ] -1 Do not release this package because ...
> 
> 
> 
> To learn more about Apache Spark, please see http://spark.apache.org/
> 
> 
> 
> The tag to be voted on is v3.0.0-rc3 (commit
> 

Re: [vote] Apache Spark 3.0 RC3

2020-06-17 Thread Tom Graves
Reynold,

What's the plan for pushing the official release binaries and source tarball? It
would be nice to have the release artifacts now that the release is available on Maven.

thanks,
Tom

On Monday, June 15, 2020, 01:52:12 PM CDT, Reynold Xin wrote:
 
 Thanks for the reminder, Dongjoon.

I created the official release tag over the past weekend and have been working on the 
release notes (a lot of interesting changes!). I've created a Google Doc so 
it's easier for everybody to comment on things I've missed: 
https://docs.google.com/document/d/1NrTqxf2f39AXDF8VTIch6kwD8VKPaIlLW1QvuqEcwR4/edit

Plan to publish to maven et al today or tomorrow and give a day or two for dev@ 
to comment on the release notes before finalizing.

PS: There are two critical problems I've seen with the release (the Spark UI is 
virtually unusable in some cases, and there are streaming issues). I will highlight them 
in the release notes and link to the JIRA tickets. But I think we should release 
3.0.1 ASAP as a follow-up.



On Sun, Jun 14, 2020 at 11:46 AM, Dongjoon Hyun  wrote:

Hi, Reynold.
Is there any progress on 3.0.0 release since the vote was finalized 5 days ago?
Apparently, tag `v3.0.0` is not created yet, the binary and docs are still 
sitting on the voting location, Maven Central doesn't have it, and 
PySpark/SparkR uploading is not started yet.
    https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
    https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

Just as Apache Spark 2.0.1 had 316 fixes after 2.0.0, we already have 35 patches 
on top of `v3.0.0-rc3` and are expecting more.
Although we can release Apache Spark 3.0.1 very soon, before Spark+AI Summit, 
Apache Spark 3.0.0 should be available in the Apache Spark distribution channel 
because it passed the vote.

The Apache Spark 3.0.0 release itself helps the community use the 3.0-line codebase 
and keeps the codebase healthy.
Please let us know if you need any help from the community for 3.0.0 release.
Thanks,
Dongjoon.

On Tue, Jun 9, 2020 at 9:41 PM Matei Zaharia  wrote:

Congrats! Excited to see the release posted soon.

On Jun 9, 2020, at 6:39 PM, Reynold Xin  wrote:



I waited another day to account for the weekend. This vote passes with the 
following +1 votes and no -1 votes!

I'll start the release prep later this week.

+1:
Reynold Xin (binding)
Prashant Sharma (binding)
Gengliang Wang
Sean Owen (binding)
Mridul Muralidharan (binding)
Takeshi Yamamuro
Maxim Gekk
Matei Zaharia (binding)
Jungtaek Lim
Denny Lee
Russell Spitzer
Dongjoon Hyun (binding)
DB Tsai (binding)
Michael Armbrust (binding)
Tom Graves (binding)
Bryan Cutler
Huaxin Gao
Jiaxin Shan
Xingbo Jiang
Xiao Li (binding)
Hyukjin Kwon (binding)
Kent Yao
Wenchen Fan (binding)
Shixiong Zhu (binding)
Burak Yavuz
Tathagata Das (binding)
Ryan Blue

-1: None



On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin  wrote:

Please vote on releasing the following candidate as Apache Spark version 3.0.0.

The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are 
cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.0.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.0.0-rc3 (commit 
3fdfce3120f307147244e5eaf46d61419a723d50):
https://github.com/apache/spark/tree/v3.0.0-rc3

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS
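
As a rough sketch of how the staged binaries can be checked against these keys
(the tarball name below is an assumption for illustration; use the file names
actually listed in the rc3-bin directory):

# import the release signing keys, then verify the signature and digest of one artifact
curl -O https://dist.apache.org/repos/dist/dev/spark/KEYS
gpg --import KEYS
curl -O https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/spark-3.0.0-bin-hadoop2.7.tgz
curl -O https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/spark-3.0.0-bin-hadoop2.7.tgz.asc
gpg --verify spark-3.0.0-bin-hadoop2.7.tgz.asc spark-3.0.0-bin-hadoop2.7.tgz
shasum -a 512 spark-3.0.0-bin-hadoop2.7.tgz   # compare against the published .sha512 file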

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1350/
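
As a quick check that the staged artifacts actually resolve from that repository,
something like the following can be used (the coordinates here are an assumption,
pointing at the Scala 2.12 build):

# fetch one artifact directly from the staging repository with Maven
mvn dependency:get \
  -Dartifact=org.apache.spark:spark-sql_2.12:3.0.0 \
  -DremoteRepositories=https://repository.apache.org/content/repositories/orgapachespark-1350/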

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

The list of bug fixes going into 3.0.0 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12339177

This release is using the release script of the tag v3.0.0-rc3.

FAQ

=
How can I help test this release?
=

If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark you can set up a virtual env and install
the current RC and see if anything important breaks. In Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
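
As a minimal sketch of the PySpark check described above (the tarball name under
the rc3-bin directory is an assumption; adjust it to the actual artifact):

# create a clean virtual env and install the staged PySpark tarball
python3 -m venv spark-3.0.0-rc3-test
source spark-3.0.0-rc3-test/bin/activate
pip install https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/pyspark-3.0.0.tar.gz
# quick smoke test before running a real workload
python -c "import pyspark; print(pyspark.__version__)"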

===
What should happen to JIRA tickets still targeting 3.0.0?
===

The current list of open tickets targeted at 3.0.0 can be found at:
https://issues.apache.org/jira/

Re: [vote] Apache Spark 3.0 RC3

2020-06-15 Thread Dongjoon Hyun
Thank you, Reynold! :)


On Mon, Jun 15, 2020 at 11:51 AM Reynold Xin  wrote:

> Thanks for the reminder, Dongjoon.
>
> I created the official release tag over the past weekend and have been working on
> the release notes (a lot of interesting changes!). I've created a Google Doc
> so it's easier for everybody to comment on things I've
> missed:
> https://docs.google.com/document/d/1NrTqxf2f39AXDF8VTIch6kwD8VKPaIlLW1QvuqEcwR4/edit
>
> Plan to publish to maven et al today or tomorrow and give a day or two
> for dev@ to comment on the release notes before finalizing.
>
> PS: There are two critical problems I've seen with the release (the Spark UI
> is virtually unusable in some cases, and there are streaming issues). I will
> highlight them in the release notes and link to the JIRA tickets. But I
> think we should release 3.0.1 ASAP as a follow-up.
>
>
>
> On Sun, Jun 14, 2020 at 11:46 AM, Dongjoon Hyun 
> wrote:
>
>> Hi, Reynold.
>>
>> Is there any progress on 3.0.0 release since the vote was finalized 5
>> days ago?
>>
>> Apparently, tag `v3.0.0` is not created yet, the binary and docs are
>> still sitting on the voting location, Maven Central doesn't have it, and
>> PySpark/SparkR uploading is not started yet.
>>
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> Just as Apache Spark 2.0.1 had 316 fixes after 2.0.0, we already have 35
>> patches on top of `v3.0.0-rc3` and are expecting more.
>>
>> Although we can release Apache Spark 3.0.1 very soon, before Spark+AI Summit,
>> Apache Spark 3.0.0 should be available in the Apache Spark distribution channel
>> because it passed the vote.
>>
>> The Apache Spark 3.0.0 release itself helps the community use the 3.0-line
>> codebase and keeps the codebase healthy.
>>
>> Please let us know if you need any help from the community for 3.0.0
>> release.
>>
>> Thanks,
>> Dongjoon.
>>
>>
>> On Tue, Jun 9, 2020 at 9:41 PM Matei Zaharia 
>> wrote:
>>
>>> Congrats! Excited to see the release posted soon.
>>>
>>> On Jun 9, 2020, at 6:39 PM, Reynold Xin  wrote:
>>>
>>> 
>>> I waited another day to account for the weekend. This vote passes with
>>> the following +1 votes and no -1 votes!
>>>
>>> I'll start the release prep later this week.
>>>
>>> +1:
>>> Reynold Xin (binding)
>>> Prashant Sharma (binding)
>>> Gengliang Wang
>>> Sean Owen (binding)
>>> Mridul Muralidharan (binding)
>>> Takeshi Yamamuro
>>> Maxim Gekk
>>> Matei Zaharia (binding)
>>> Jungtaek Lim
>>> Denny Lee
>>> Russell Spitzer
>>> Dongjoon Hyun (binding)
>>> DB Tsai (binding)
>>> Michael Armbrust (binding)
>>> Tom Graves (binding)
>>> Bryan Cutler
>>> Huaxin Gao
>>> Jiaxin Shan
>>> Xingbo Jiang
>>> Xiao Li (binding)
>>> Hyukjin Kwon (binding)
>>> Kent Yao
>>> Wenchen Fan (binding)
>>> Shixiong Zhu (binding)
>>> Burak Yavuz
>>> Tathagata Das (binding)
>>> Ryan Blue
>>>
>>> -1: None
>>>
>>>
>>>
>>> On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin 
>>> wrote:
>>>
 Please vote on releasing the following candidate as Apache Spark
 version 3.0.0.

 The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
 are cast, with a minimum of 3 +1 votes.

 [ ] +1 Release this package as Apache Spark 3.0.0
 [ ] -1 Do not release this package because ...

 To learn more about Apache Spark, please see http://spark.apache.org/

 The tag to be voted on is v3.0.0-rc3 (commit
 3fdfce3120f307147244e5eaf46d61419a723d50):
 https://github.com/apache/spark/tree/v3.0.0-rc3

 The release files, including signatures, digests, etc. can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

 Signatures used for Spark RCs can be found in this file:
 https://dist.apache.org/repos/dist/dev/spark/KEYS

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1350/

 The documentation corresponding to this release can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

 The list of bug fixes going into 3.0.0 can be found at the following
 URL:
 https://issues.apache.org/jira/projects/SPARK/versions/12339177

 This release is using the release script of the tag v3.0.0-rc3.

 FAQ

 =
 How can I help test this release?
 =

 If you are a Spark user, you can help us test this release by taking
 an existing Spark workload and running on this release candidate, then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and install
 the current RC and see if anything important breaks. In Java/Scala, you
 can add the staging repository to your project's resolvers and test
 with the RC (make sure to clean up the artifact cache before/after so
 you don't end up building with an out-of-date RC going 

Re: [vote] Apache Spark 3.0 RC3

2020-06-15 Thread Reynold Xin
Thanks for the reminder, Dongjoon.

I created the official release tag over the past weekend and have been working on the 
release notes (a lot of interesting changes!). I've created a Google Doc so 
it's easier for everybody to comment on things I've missed: 
https://docs.google.com/document/d/1NrTqxf2f39AXDF8VTIch6kwD8VKPaIlLW1QvuqEcwR4/edit

Plan to publish to maven et al today or tomorrow and give a day or two for dev@ 
to comment on the release notes before finalizing.

PS: There are two critical problems I've seen with the release (the Spark UI is 
virtually unusable in some cases, and there are streaming issues). I will highlight them 
in the release notes and link to the JIRA tickets. But I think we should release 
3.0.1 ASAP as a follow-up.

On Sun, Jun 14, 2020 at 11:46 AM, Dongjoon Hyun < dongjoon.h...@gmail.com > 
wrote:

> 
> Hi, Reynold.
> 
> 
> Is there any progress on 3.0.0 release since the vote was finalized 5 days
> ago?
> 
> 
> Apparently, tag `v3.0.0` is not created yet, the binary and docs are still
> sitting on the voting location, Maven Central doesn't have it, and
> PySpark/SparkR uploading is not started yet.
> 
> 
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
> 
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
> 
> 
> 
> Just as Apache Spark 2.0.1 had 316 fixes after 2.0.0, we already have 35
> patches on top of `v3.0.0-rc3` and are expecting more.
> 
> 
> Although we can release Apache Spark 3.0.1 very soon, before Spark+AI Summit,
> Apache Spark 3.0.0 should be available in the Apache Spark distribution
> channel because it passed the vote.
> 
> 
> 
> The Apache Spark 3.0.0 release itself helps the community use the 3.0-line
> codebase and keeps the codebase healthy.
> 
> 
> Please let us know if you need any help from the community for 3.0.0
> release.
> 
> 
> Thanks,
> Dongjoon.
> 
> 
> 
> On Tue, Jun 9, 2020 at 9:41 PM Matei Zaharia < matei.zaha...@gmail.com > wrote:
> 
> 
>> Congrats! Excited to see the release posted soon.
>> 
>> 
>>> On Jun 9, 2020, at 6:39 PM, Reynold Xin < r...@databricks.com > wrote:
>>> 
>>> 
>> 
>> 
>> 
>>> 
>>> I waited another day to account for the weekend. This vote passes with the
>>> following +1 votes and no -1 votes!
>>> 
>>> 
>>> 
>>> I'll start the release prep later this week.
>>> 
>>> 
>>> 
>>> +1:
>>> 
>>> Reynold Xin (binding)
>>> 
>>> Prashant Sharma (binding)
>>> 
>>> Gengliang Wang
>>> 
>>> Sean Owen (binding)
>>> 
>>> Mridul Muralidharan (binding)
>>> 
>>> Takeshi Yamamuro
>>> 
>>> Maxim Gekk
>>> 
>>> Matei Zaharia (binding)
>>> 
>>> Jungtaek Lim
>>> 
>>> Denny Lee
>>> 
>>> Russell Spitzer
>>> 
>>> Dongjoon Hyun (binding)
>>> 
>>> DB Tsai (binding)
>>> 
>>> Michael Armbrust (binding)
>>> 
>>> Tom Graves (binding)
>>> 
>>> Bryan Cutler
>>> 
>>> Huaxin Gao
>>> 
>>> Jiaxin Shan
>>> 
>>> Xingbo Jiang
>>> 
>>> Xiao Li (binding)
>>> 
>>> Hyukjin Kwon (binding)
>>> 
>>> Kent Yao
>>> 
>>> Wenchen Fan (binding)
>>> 
>>> Shixiong Zhu (binding)
>>> 
>>> Burak Yavuz
>>> 
>>> Tathagata Das (binding)
>>> 
>>> Ryan Blue
>>> 
>>> 
>>> 
>>> -1: None
>>> 
>>> 
>>> 
>>> 
>>> 
>>> 
>>> On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin < r...@databricks.com > wrote:
>>> 
 Please vote on releasing the following candidate as Apache Spark version
 3.0.0.
 
 
 
 The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
 cast, with a minimum of 3 +1 votes.
 
 
 
 [ ] +1 Release this package as Apache Spark 3.0.0
 
 [ ] -1 Do not release this package because ...
 
 
 
 To learn more about Apache Spark, please see http://spark.apache.org/
 
 
 
 The tag to be voted on is v3.0.0-rc3 (commit
 3fdfce3120f307147244e5eaf46d61419a723d50):
 
 https://github.com/apache/spark/tree/v3.0.0-rc3
 
 
 
 The release files, including signatures, digests, etc. can be found at:
 
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
 
 
 
 Signatures used for Spark RCs can be found in this file:
 
 https://dist.apache.org/repos/dist/dev/spark/KEYS
 
 
 
 The staging repository for this release can be found at:
 
 https://repository.apache.org/content/repositories/orgapachespark-1350/
 
 
 
 The documentation corresponding to this release can be found at:
 
 https:/ / dist. apache. 

Re: [vote] Apache Spark 3.0 RC3

2020-06-14 Thread Dongjoon Hyun
Hi, Reynold.

Is there any progress on 3.0.0 release since the vote was finalized 5 days
ago?

Apparently, tag `v3.0.0` is not created yet, the binary and docs are still
sitting on the voting location, Maven Central doesn't have it, and
PySpark/SparkR uploading is not started yet.

https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

Just as Apache Spark 2.0.1 had 316 fixes after 2.0.0, we already have 35
patches on top of `v3.0.0-rc3` and are expecting more.

Although we can release Apache Spark 3.0.1 very soon, before Spark+AI Summit,
Apache Spark 3.0.0 should be available in the Apache Spark distribution channel
because it passed the vote.

The Apache Spark 3.0.0 release itself helps the community use the 3.0-line codebase
and keeps the codebase healthy.

Please let us know if you need any help from the community for 3.0.0
release.

Thanks,
Dongjoon.


On Tue, Jun 9, 2020 at 9:41 PM Matei Zaharia 
wrote:

> Congrats! Excited to see the release posted soon.
>
> On Jun 9, 2020, at 6:39 PM, Reynold Xin  wrote:
>
> 
> I waited another day to account for the weekend. This vote passes with the
> following +1 votes and no -1 votes!
>
> I'll start the release prep later this week.
>
> +1:
> Reynold Xin (binding)
> Prashant Sharma (binding)
> Gengliang Wang
> Sean Owen (binding)
> Mridul Muralidharan (binding)
> Takeshi Yamamuro
> Maxim Gekk
> Matei Zaharia (binding)
> Jungtaek Lim
> Denny Lee
> Russell Spitzer
> Dongjoon Hyun (binding)
> DB Tsai (binding)
> Michael Armbrust (binding)
> Tom Graves (binding)
> Bryan Cutler
> Huaxin Gao
> Jiaxin Shan
> Xingbo Jiang
> Xiao Li (binding)
> Hyukjin Kwon (binding)
> Kent Yao
> Wenchen Fan (binding)
> Shixiong Zhu (binding)
> Burak Yavuz
> Tathagata Das (binding)
> Ryan Blue
>
> -1: None
>
>
>
> On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>> are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks. In Java/Scala, you
>> can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>>
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.0.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Matei Zaharia
Congrats! Excited to see the release posted soon.

> On Jun 9, 2020, at 6:39 PM, Reynold Xin  wrote:
> 
> 
> I waited another day to account for the weekend. This vote passes with the 
> following +1 votes and no -1 votes!
> 
> I'll start the release prep later this week.
> 
> +1:
> Reynold Xin (binding)
> Prashant Sharma (binding)
> Gengliang Wang
> Sean Owen (binding)
> Mridul Muralidharan (binding)
> Takeshi Yamamuro
> Maxim Gekk
> Matei Zaharia (binding)
> Jungtaek Lim
> Denny Lee
> Russell Spitzer
> Dongjoon Hyun (binding)
> DB Tsai (binding)
> Michael Armbrust (binding)
> Tom Graves (binding)
> Bryan Cutler
> Huaxin Gao
> Jiaxin Shan
> Xingbo Jiang
> Xiao Li (binding)
> Hyukjin Kwon (binding)
> Kent Yao
> Wenchen Fan (binding)
> Shixiong Zhu (binding)
> Burak Yavuz
> Tathagata Das (binding)
> Ryan Blue
> 
> -1: None
> 
> 
> 
>> On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin  wrote:
>> Please vote on releasing the following candidate as Apache Spark version 
>> 3.0.0.
>> 
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are 
>> cast, with a minimum of 3 +1 votes.
>> 
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>> 
>> To learn more about Apache Spark, please see http://spark.apache.org/
>> 
>> The tag to be voted on is v3.0.0-rc3 (commit 
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>> 
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>> 
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> 
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>> 
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>> 
>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>> 
>> This release is using the release script of the tag v3.0.0-rc3.
>> 
>> FAQ
>> 
>> =
>> How can I help test this release?
>> =
>> 
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>> 
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks. In Java/Scala, you
>> can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>> 
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>> 
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> Version/s" = 3.0.0
>> 
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>> 
>> ==
>> But my bug isn't fixed?
>> ==
>> 
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
> 


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Reynold Xin
I waited another day to account for the weekend. This vote passes with the 
following +1 votes and no -1 votes!

I'll start the release prep later this week.

+1:

Reynold Xin (binding)

Prashant Sharma (binding)

Gengliang Wang

Sean Owen (binding)

Mridul Muralidharan (binding)

Takeshi Yamamuro

Maxim Gekk

Matei Zaharia (binding)

Jungtaek Lim

Denny Lee

Russell Spitzer

Dongjoon Hyun (binding)

DB Tsai (binding)

Michael Armbrust (binding)

Tom Graves (binding)

Bryan Cutler

Huaxin Gao

Jiaxin Shan

Xingbo Jiang

Xiao Li (binding)

Hyukjin Kwon (binding)

Kent Yao

Wenchen Fan (binding)

Shixiong Zhu (binding)

Burak Yavuz

Tathagata Das (binding)

Ryan Blue

-1: None

On Sat, Jun 06, 2020 at 1:08 PM, Reynold Xin < r...@databricks.com > wrote:

> 
> Please vote on releasing the following candidate as Apache Spark version
> 3.0.0.
> 
> 
> 
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
> cast, with a minimum of 3 +1 votes.
> 
> 
> 
> [ ] +1 Release this package as Apache Spark 3.0.0
> 
> [ ] -1 Do not release this package because ...
> 
> 
> 
> To learn more about Apache Spark, please see http://spark.apache.org/
> 
> 
> 
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> 
> https://github.com/apache/spark/tree/v3.0.0-rc3
> 
> 
> 
> The release files, including signatures, digests, etc. can be found at:
> 
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
> 
> 
> 
> Signatures used for Spark RCs can be found in this file:
> 
> https://dist.apache.org/repos/dist/dev/spark/KEYS
> 
> 
> 
> The staging repository for this release can be found at:
> 
> https://repository.apache.org/content/repositories/orgapachespark-1350/
> 
> 
> 
> The documentation corresponding to this release can be found at:
> 
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
> 
> 
> 
> The list of bug fixes going into 3.0.0 can be found at the following URL:
> 
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
> 
> 
> 
> This release is using the release script of the tag v3.0.0-rc3.
> 
> 
> 
> FAQ
> 
> 
> 
> =
> 
> How can I help test this release?
> 
> =
> 
> 
> 
> If you are a Spark user, you can help us test this release by taking
> 
> an existing Spark workload and running on this release candidate, then
> 
> reporting any regressions.
> 
> 
> 
> If you're working in PySpark you can set up a virtual env and install
> 
> the current RC and see if anything important breaks. In Java/Scala, you
> 
> can add the staging repository to your project's resolvers and test
> 
> with the RC (make sure to clean up the artifact cache before/after so
> 
> you don't end up building with an out-of-date RC going forward).
> 
> 
> 
> ===
> 
> What should happen to JIRA tickets still targeting 3.0.0?
> 
> ===
> 
> 
> 
> The current list of open tickets targeted at 3.0.0 can be found at:
> 
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
> 
> 
> 
> Committers should look at those and triage. Extremely important bug
> 
> fixes, documentation, and API tweaks that impact compatibility should
> 
> be worked on immediately. Everything else please retarget to an
> 
> appropriate release.
> 
> 
> 
> ==
> 
> But my bug isn't fixed?
> 
> ==
> 
> 
> 
> In order to make timely releases, we will typically not hold the
> 
> release unless the bug in question is a regression from the previous
> 
> release. That being said, if there is something which is a regression
> 
> that has not been correctly targeted please ping me or a committer to
> 
> help target the issue.
>



Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Ryan Blue
+1 (non-binding)

On Tue, Jun 9, 2020 at 4:14 PM Tathagata Das 
wrote:

> +1 (binding)
>
> On Tue, Jun 9, 2020 at 5:27 PM Burak Yavuz  wrote:
>
>> +1
>>
>> Best,
>> Burak
>>
>> On Tue, Jun 9, 2020 at 1:48 PM Shixiong(Ryan) Zhu <
>> shixi...@databricks.com> wrote:
>>
>>> +1 (binding)
>>>
>>> Best Regards,
>>> Ryan
>>>
>>>
>>> On Tue, Jun 9, 2020 at 4:24 AM Wenchen Fan  wrote:
>>>
 +1 (binding)

 On Tue, Jun 9, 2020 at 6:15 PM Dr. Kent Yao  wrote:

> +1 (non-binding)
>
>
>
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 
Ryan Blue
Software Engineer
Netflix


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Tathagata Das
+1 (binding)

On Tue, Jun 9, 2020 at 5:27 PM Burak Yavuz  wrote:

> +1
>
> Best,
> Burak
>
> On Tue, Jun 9, 2020 at 1:48 PM Shixiong(Ryan) Zhu 
> wrote:
>
>> +1 (binding)
>>
>> Best Regards,
>> Ryan
>>
>>
>> On Tue, Jun 9, 2020 at 4:24 AM Wenchen Fan  wrote:
>>
>>> +1 (binding)
>>>
>>> On Tue, Jun 9, 2020 at 6:15 PM Dr. Kent Yao  wrote:
>>>
 +1 (non-binding)



 --
 Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

 -
 To unsubscribe e-mail: dev-unsubscr...@spark.apache.org




Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Burak Yavuz
+1

Best,
Burak

On Tue, Jun 9, 2020 at 1:48 PM Shixiong(Ryan) Zhu 
wrote:

> +1 (binding)
>
> Best Regards,
> Ryan
>
>
> On Tue, Jun 9, 2020 at 4:24 AM Wenchen Fan  wrote:
>
>> +1 (binding)
>>
>> On Tue, Jun 9, 2020 at 6:15 PM Dr. Kent Yao  wrote:
>>
>>> +1 (non-binding)
>>>
>>>
>>>
>>> --
>>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>>
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Shixiong(Ryan) Zhu
+1 (binding)

Best Regards,
Ryan


On Tue, Jun 9, 2020 at 4:24 AM Wenchen Fan  wrote:

> +1 (binding)
>
> On Tue, Jun 9, 2020 at 6:15 PM Dr. Kent Yao  wrote:
>
>> +1 (non-binding)
>>
>>
>>
>> --
>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Wenchen Fan
+1 (binding)

On Tue, Jun 9, 2020 at 6:15 PM Dr. Kent Yao  wrote:

> +1 (non-binding)
>
>
>
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Dr. Kent Yao
+1 (non-binding)



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Hyukjin Kwon
+1

On Tue, Jun 9, 2020 at 3:16 PM, Xiao Li wrote:

> +1 (binding)
>
> Xiao
>
> On Mon, Jun 8, 2020 at 10:13 PM Xingbo Jiang 
> wrote:
>
>> +1(non-binding)
>>
>> On Mon, Jun 8, 2020 at 9:50 PM, Jiaxin Shan wrote:
>>
>>> +1
>>> I build binary using the following command, test spark workloads on
>>> Kubernetes (AWS EKS) and it's working well.
>>>
>>> ./dev/make-distribution.sh --name spark-v3.0.0-rc3-20200608 --tgz
>>> -Phadoop-3.2 -Pkubernetes -Phive -Phive-thriftserver -Phadoop-cloud
>>> -Pscala-2.12
>>>
>>> On Mon, Jun 8, 2020 at 7:13 PM Bryan Cutler  wrote:
>>>
 +1 (non-binding)

 On Mon, Jun 8, 2020, 1:49 PM Tom Graves 
 wrote:

> +1
>
> Tom
>
> On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin <
> r...@databricks.com> wrote:
>
>
> Please vote on releasing the following candidate as Apache Spark
> version 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
> are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the following
> URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks. In Java/Scala, you
> can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
>
>
>>>
>>> --
>>> Best Regards!
>>> Jiaxin Shan
>>> Tel:  412-230-7670
>>> Address: 470 2nd Ave S, Kirkland, WA
>>> 
>>>
>>>
>
> --
> 
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-09 Thread Xiao Li
+1 (binding)

Xiao

On Mon, Jun 8, 2020 at 10:13 PM Xingbo Jiang  wrote:

> +1(non-binding)
>
> On Mon, Jun 8, 2020 at 9:50 PM, Jiaxin Shan wrote:
>
>> +1
>> I built a binary using the following command, tested Spark workloads on
>> Kubernetes (AWS EKS), and it's working well.
>>
>> ./dev/make-distribution.sh --name spark-v3.0.0-rc3-20200608 --tgz
>> -Phadoop-3.2 -Pkubernetes -Phive -Phive-thriftserver -Phadoop-cloud
>> -Pscala-2.12
>>
>> On Mon, Jun 8, 2020 at 7:13 PM Bryan Cutler  wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Mon, Jun 8, 2020, 1:49 PM Tom Graves 
>>> wrote:
>>>
 +1

 Tom

 On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin <
 r...@databricks.com> wrote:


 Please vote on releasing the following candidate as Apache Spark
 version 3.0.0.

 The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
 are cast, with a minimum of 3 +1 votes.

 [ ] +1 Release this package as Apache Spark 3.0.0
 [ ] -1 Do not release this package because ...

 To learn more about Apache Spark, please see http://spark.apache.org/

 The tag to be voted on is v3.0.0-rc3 (commit
 3fdfce3120f307147244e5eaf46d61419a723d50):
 https://github.com/apache/spark/tree/v3.0.0-rc3

 The release files, including signatures, digests, etc. can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

 Signatures used for Spark RCs can be found in this file:
 https://dist.apache.org/repos/dist/dev/spark/KEYS

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1350/

 The documentation corresponding to this release can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

 The list of bug fixes going into 3.0.0 can be found at the following
 URL:
 https://issues.apache.org/jira/projects/SPARK/versions/12339177

 This release is using the release script of the tag v3.0.0-rc3.

 FAQ

 =
 How can I help test this release?
 =

 If you are a Spark user, you can help us test this release by taking
 an existing Spark workload and running on this release candidate, then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and install
 the current RC and see if anything important breaks. In Java/Scala, you
 can add the staging repository to your project's resolvers and test
 with the RC (make sure to clean up the artifact cache before/after so
 you don't end up building with an out-of-date RC going forward).

 ===
 What should happen to JIRA tickets still targeting 3.0.0?
 ===

 The current list of open tickets targeted at 3.0.0 can be found at:
 https://issues.apache.org/jira/projects/SPARK and search for "Target
 Version/s" = 3.0.0

 Committers should look at those and triage. Extremely important bug
 fixes, documentation, and API tweaks that impact compatibility should
 be worked on immediately. Everything else please retarget to an
 appropriate release.

 ==
 But my bug isn't fixed?
 ==

 In order to make timely releases, we will typically not hold the
 release unless the bug in question is a regression from the previous
 release. That being said, if there is something which is a regression
 that has not been correctly targeted please ping me or a committer to
 help target the issue.



>>
>> --
>> Best Regards!
>> Jiaxin Shan
>> Tel:  412-230-7670
>> Address: 470 2nd Ave S, Kirkland, WA
>> 
>>
>>

-- 



Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Xingbo Jiang
+1(non-binding)

On Mon, Jun 8, 2020 at 9:50 PM, Jiaxin Shan wrote:

> +1
> I built a binary using the following command, tested Spark workloads on
> Kubernetes (AWS EKS), and it's working well.
>
> ./dev/make-distribution.sh --name spark-v3.0.0-rc3-20200608 --tgz
> -Phadoop-3.2 -Pkubernetes -Phive -Phive-thriftserver -Phadoop-cloud
> -Pscala-2.12
>
> On Mon, Jun 8, 2020 at 7:13 PM Bryan Cutler  wrote:
>
>> +1 (non-binding)
>>
>> On Mon, Jun 8, 2020, 1:49 PM Tom Graves 
>> wrote:
>>
>>> +1
>>>
>>> Tom
>>>
>>> On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin <
>>> r...@databricks.com> wrote:
>>>
>>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 3.0.0.
>>>
>>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>>> are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-rc3 (commit
>>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> This release is using the release script of the tag v3.0.0-rc3.
>>>
>>> FAQ
>>>
>>> =
>>> How can I help test this release?
>>> =
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks. In Java/Scala, you
>>> can add the staging repository to your project's resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with an out-of-date RC going forward).
>>>
>>> ===
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.0.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==
>>> But my bug isn't fixed?
>>> ==
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted please ping me or a committer to
>>> help target the issue.
>>>
>>>
>>>
>
> --
> Best Regards!
> Jiaxin Shan
> Tel:  412-230-7670
> Address: 470 2nd Ave S, Kirkland, WA
> 
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Jiaxin Shan
+1
I built a binary using the following command, tested Spark workloads on
Kubernetes (AWS EKS), and it's working well.

./dev/make-distribution.sh --name spark-v3.0.0-rc3-20200608 --tgz
-Phadoop-3.2 -Pkubernetes -Phive -Phive-thriftserver -Phadoop-cloud
-Pscala-2.12

On Mon, Jun 8, 2020 at 7:13 PM Bryan Cutler  wrote:

> +1 (non-binding)
>
> On Mon, Jun 8, 2020, 1:49 PM Tom Graves 
> wrote:
>
>> +1
>>
>> Tom
>>
>> On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin <
>> r...@databricks.com> wrote:
>>
>>
>> Please vote on releasing the following candidate as Apache Spark version
>> 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>> are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks. In Java/Scala, you
>> can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>>
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.0.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>>
>>

-- 
Best Regards!
Jiaxin Shan
Tel:  412-230-7670
Address: 470 2nd Ave S, Kirkland, WA


Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Huaxin Gao
+1 (non-binding)
 
 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread huaxingao
+1 (non-binding)



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Bryan Cutler
+1 (non-binding)

On Mon, Jun 8, 2020, 1:49 PM Tom Graves 
wrote:

> +1
>
> Tom
>
> On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin <
> r...@databricks.com> wrote:
>
>
> Please vote on releasing the following candidate as Apache Spark version
> 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
> cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks. In Java/Scala, you
> can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Tom Graves
+1

Tom

On Saturday, June 6, 2020, 03:09:09 PM CDT, Reynold Xin wrote:
 
 Please vote on releasing the following candidate as Apache Spark version 3.0.0.

The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are 
cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.0.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.0.0-rc3 (commit 
3fdfce3120f307147244e5eaf46d61419a723d50):
https://github.com/apache/spark/tree/v3.0.0-rc3

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1350/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

The list of bug fixes going into 3.0.0 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12339177

This release is using the release script of the tag v3.0.0-rc3.

FAQ

=
How can I help test this release?
=

If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark you can set up a virtual env and install
the current RC and see if anything important breaks. In Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).

===
What should happen to JIRA tickets still targeting 3.0.0?
===

The current list of open tickets targeted at 3.0.0 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" 
= 3.0.0

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==
But my bug isn't fixed?
==

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted please ping me or a committer to
help target the issue.


  

Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Michael Armbrust
+1 (binding)

On Mon, Jun 8, 2020 at 1:22 PM DB Tsai  wrote:

> +1 (binding)
>
> Sincerely,
>
> DB Tsai
> --
> Web: https://www.dbtsai.com
> PGP Key ID: 42E5B25A8F7A82C1
>
> On Mon, Jun 8, 2020 at 1:03 PM Dongjoon Hyun 
> wrote:
> >
> > +1
> >
> > Thanks,
> > Dongjoon.
> >
> > On Mon, Jun 8, 2020 at 6:37 AM Russell Spitzer <
> russell.spit...@gmail.com> wrote:
> >>
> >> +1 (non-binding) ran the new SCC DSV2 suite and all other tests, no
> issues
> >>
> >> On Sun, Jun 7, 2020 at 11:12 PM Yin Huai  wrote:
> >>>
> >>> Hello everyone,
> >>>
> >>> I am wondering if it makes more sense to not count Saturday and
> Sunday. I doubt that any serious testing work was done during this past
> weekend. Can we only count business days in the voting process?
> >>>
> >>> Thanks,
> >>>
> >>> Yin
> >>>
> >>> On Sun, Jun 7, 2020 at 3:24 PM Denny Lee 
> wrote:
> 
>  +1 (non-binding)
> 
>  On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim <
> kabhwan.opensou...@gmail.com> wrote:
> >
> > I'm seeing the effort to include the correctness issue SPARK-28067
> > [1] in 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression, so
> > technically it doesn't block the release; still, it'd be good to weigh its
> > worth (it requires some SS users to discard their state, so it might be less
> > frightening to require that in a major version upgrade), and it looks to be
> > optional to include SPARK-28067 in 3.0.0.
> >
> > Besides, I see all blockers look to be resolved, thanks all for the
> amazing efforts!
> >
> > +1 (non-binding) if the decision of SPARK-28067 is "later".
> >
> > 1. https://issues.apache.org/jira/browse/SPARK-28067
> > 2. https://issues.apache.org/jira/browse/SPARK-31894
> >
> > On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia <
> matei.zaha...@gmail.com> wrote:
> >>
> >> +1
> >>
> >> Matei
> >>
> >> On Jun 7, 2020, at 6:53 AM, Maxim Gekk 
> wrote:
> >>
> >> +1 (non-binding)
> >>
> >> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro <
> linguin@gmail.com> wrote:
> >>>
> >>> +1 (non-binding)
> >>>
> >>> I don't see any ongoing PR to fix critical bugs in my area.
> >>> Bests,
> >>> Takeshi
> >>>
> >>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan <
> mri...@gmail.com> wrote:
> 
>  +1
> 
>  Regards,
>  Mridul
> 
>  On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin 
> wrote:
> >
> > Apologies for the mistake. The vote is open till 11:59pm Pacific
> time on Mon June 9th.
> >
> > On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache
> Spark version 3.0.0.
> >>
> >> The vote is open until [DUE DAY] and passes if a majority +1
> PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.0.0
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see
> http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> >> https://github.com/apache/spark/tree/v3.0.0-rc3
> >>
> >> The release files, including signatures, digests, etc. can be
> found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >>
> https://repository.apache.org/content/repositories/orgapachespark-1350/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
> >>
> >> The list of bug fixes going into 3.0.0 can be found at the
> following URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12339177
> >>
> >> This release is using the release script of the tag v3.0.0-rc3.
> >>
> >> FAQ
> >>
> >> =
> >> How can I help test this release?
> >> =
> >>
> >> If you are a Spark user, you can help us test this release by
> taking
> >> an existing Spark workload and running on this release
> candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and
> install
> >> the current RC and see if anything important breaks, in the
> Java/Scala
> >> you can add the staging repository to your projects resolvers
> and test
> >> with the RC (make sure to clean up 

Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Dilip Biswal
+1 (non-binding)

Regards,
-- Dilip

On Mon, Jun 8, 2020 at 1:03 PM Dongjoon Hyun 
wrote:

> +1
>
> Thanks,
> Dongjoon.
>
> On Mon, Jun 8, 2020 at 6:37 AM Russell Spitzer 
> wrote:
>
>> +1 (non-binding) ran the new SCC DSV2 suite and all other tests, no issues
>>
>> On Sun, Jun 7, 2020 at 11:12 PM Yin Huai  wrote:
>>
>>> Hello everyone,
>>>
>>> I am wondering if it makes more sense to not count Saturday and Sunday.
>>> I doubt that any serious testing work was done during this past weekend.
>>> Can we only count business days in the voting process?
>>>
>>> Thanks,
>>>
>>> Yin
>>>
>>> On Sun, Jun 7, 2020 at 3:24 PM Denny Lee  wrote:
>>>
 +1 (non-binding)

 On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim <
 kabhwan.opensou...@gmail.com> wrote:

> I'm seeing the effort of including the correctness issue SPARK-28067
> [1] to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so
> technically doesn't block the release, so while it'd be good to weigh its
> worth (it requires some SS users to discard the state so might bring less
> frightened requiring it in major version upgrade), it looks to be optional
> to include SPARK-28067 to 3.0.0.
>
> Besides, I see all blockers look to be resolved, thanks all for the
> amazing efforts!
>
> +1 (non-binding) if the decision of SPARK-28067 is "later".
>
> 1. https://issues.apache.org/jira/browse/SPARK-28067
> 2. https://issues.apache.org/jira/browse/SPARK-31894
>
> On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
> wrote:
>
>> +1
>>
>> Matei
>>
>> On Jun 7, 2020, at 6:53 AM, Maxim Gekk 
>> wrote:
>>
>> +1 (non-binding)
>>
>> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro <
>> linguin@gmail.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> I don't see any ongoing PR to fix critical bugs in my area.
>>> Bests,
>>> Takeshi
>>>
>>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
>>> wrote:
>>>
 +1

 Regards,
 Mridul

 On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin 
 wrote:

> Apologies for the mistake. The vote is open till 11:59pm Pacific
> time on Mon June 9th.
>
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark
>> version 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC
>> votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be
>> found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>>
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the
>> following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by
>> taking
>> an existing Spark workload and running on this release candidate,
>> then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and
>> install
>> the current RC and see if anything important breaks, in the
>> Java/Scala
>> you can add the staging repository to your projects resolvers and
>> test
>> with the RC (make sure to clean up the artifact cache
>> before/after so
>> you don't end up building with a out of date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===

Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread DB Tsai
+1 (binding)

Sincerely,

DB Tsai
--
Web: https://www.dbtsai.com
PGP Key ID: 42E5B25A8F7A82C1

On Mon, Jun 8, 2020 at 1:03 PM Dongjoon Hyun  wrote:
>
> +1
>
> Thanks,
> Dongjoon.
>
> On Mon, Jun 8, 2020 at 6:37 AM Russell Spitzer  
> wrote:
>>
>> +1 (non-binding) ran the new SCC DSV2 suite and all other tests, no issues
>>
>> On Sun, Jun 7, 2020 at 11:12 PM Yin Huai  wrote:
>>>
>>> Hello everyone,
>>>
>>> I am wondering if it makes more sense to not count Saturday and Sunday. I 
>>> doubt that any serious testing work was done during this past weekend. Can 
>>> we only count business days in the voting process?
>>>
>>> Thanks,
>>>
>>> Yin
>>>
>>> On Sun, Jun 7, 2020 at 3:24 PM Denny Lee  wrote:

 +1 (non-binding)

 On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim  
 wrote:
>
> I'm seeing the effort of including the correctness issue SPARK-28067 [1] 
> to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so 
> technically doesn't block the release, so while it'd be good to weigh its 
> worth (it requires some SS users to discard the state so might bring less 
> frightened requiring it in major version upgrade), it looks to be 
> optional to include SPARK-28067 to 3.0.0.
>
> Besides, I see all blockers look to be resolved, thanks all for the 
> amazing efforts!
>
> +1 (non-binding) if the decision of SPARK-28067 is "later".
>
> 1. https://issues.apache.org/jira/browse/SPARK-28067
> 2. https://issues.apache.org/jira/browse/SPARK-31894
>
> On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia  
> wrote:
>>
>> +1
>>
>> Matei
>>
>> On Jun 7, 2020, at 6:53 AM, Maxim Gekk  wrote:
>>
>> +1 (non-binding)
>>
>> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro  
>> wrote:
>>>
>>> +1 (non-binding)
>>>
>>> I don't see any ongoing PR to fix critical bugs in my area.
>>> Bests,
>>> Takeshi
>>>
>>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan  
>>> wrote:

 +1

 Regards,
 Mridul

 On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:
>
> Apologies for the mistake. The vote is open till 11:59pm Pacific time 
> on Mon June 9th.
>
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  
> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark 
>> version 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC 
>> votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit 
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found 
>> at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following 
>> URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, 
>> then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks, in the 
>> Java/Scala
>> you can add the staging repository to your projects resolvers and 
>> test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with a out of date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> 

Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Dongjoon Hyun
+1

Thanks,
Dongjoon.

On Mon, Jun 8, 2020 at 6:37 AM Russell Spitzer 
wrote:

> +1 (non-binding) ran the new SCC DSV2 suite and all other tests, no issues
>
> On Sun, Jun 7, 2020 at 11:12 PM Yin Huai  wrote:
>
>> Hello everyone,
>>
>> I am wondering if it makes more sense to not count Saturday and Sunday. I
>> doubt that any serious testing work was done during this past weekend. Can
>> we only count business days in the voting process?
>>
>> Thanks,
>>
>> Yin
>>
>> On Sun, Jun 7, 2020 at 3:24 PM Denny Lee  wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim <
>>> kabhwan.opensou...@gmail.com> wrote:
>>>
 I'm seeing the effort of including the correctness issue SPARK-28067
 [1] to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so
 technically doesn't block the release, so while it'd be good to weigh its
 worth (it requires some SS users to discard the state so might bring less
 frightened requiring it in major version upgrade), it looks to be optional
 to include SPARK-28067 to 3.0.0.

 Besides, I see all blockers look to be resolved, thanks all for the
 amazing efforts!

 +1 (non-binding) if the decision of SPARK-28067 is "later".

 1. https://issues.apache.org/jira/browse/SPARK-28067
 2. https://issues.apache.org/jira/browse/SPARK-31894

 On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
 wrote:

> +1
>
> Matei
>
> On Jun 7, 2020, at 6:53 AM, Maxim Gekk 
> wrote:
>
> +1 (non-binding)
>
> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
> wrote:
>
>> +1 (non-binding)
>>
>> I don't see any ongoing PR to fix critical bugs in my area.
>> Bests,
>> Takeshi
>>
>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
>> wrote:
>>
>>> +1
>>>
>>> Regards,
>>> Mridul
>>>
>>> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin 
>>> wrote:
>>>
 Apologies for the mistake. The vote is open till 11:59pm Pacific
 time on Mon June 9th.

 On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
 wrote:

> Please vote on releasing the following candidate as Apache Spark
> version 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC
> votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be
> found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
>
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the
> following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by
> taking
> an existing Spark workload and running on this release candidate,
> then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and
> install
> the current RC and see if anything important breaks, in the
> Java/Scala
> you can add the staging repository to your projects resolvers and
> test
> with the RC (make sure to clean up the artifact cache before/after
> so
> you don't end up building with a out of date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for
> "Target Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely 

Re: [vote] Apache Spark 3.0 RC3

2020-06-08 Thread Russell Spitzer
+1 (non-binding) ran the new SCC DSV2 suite and all other tests, no issues

On Sun, Jun 7, 2020 at 11:12 PM Yin Huai  wrote:

> Hello everyone,
>
> I am wondering if it makes more sense to not count Saturday and Sunday. I
> doubt that any serious testing work was done during this past weekend. Can
> we only count business days in the voting process?
>
> Thanks,
>
> Yin
>
> On Sun, Jun 7, 2020 at 3:24 PM Denny Lee  wrote:
>
>> +1 (non-binding)
>>
>> On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim 
>> wrote:
>>
>>> I'm seeing the effort of including the correctness issue SPARK-28067 [1]
>>> to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so
>>> technically doesn't block the release, so while it'd be good to weigh its
>>> worth (it requires some SS users to discard the state so might bring less
>>> frightened requiring it in major version upgrade), it looks to be optional
>>> to include SPARK-28067 to 3.0.0.
>>>
>>> Besides, I see all blockers look to be resolved, thanks all for the
>>> amazing efforts!
>>>
>>> +1 (non-binding) if the decision of SPARK-28067 is "later".
>>>
>>> 1. https://issues.apache.org/jira/browse/SPARK-28067
>>> 2. https://issues.apache.org/jira/browse/SPARK-31894
>>>
>>> On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
>>> wrote:
>>>
 +1

 Matei

 On Jun 7, 2020, at 6:53 AM, Maxim Gekk 
 wrote:

 +1 (non-binding)

 On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
 wrote:

> +1 (non-binding)
>
> I don't see any ongoing PR to fix critical bugs in my area.
> Bests,
> Takeshi
>
> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
> wrote:
>
>> +1
>>
>> Regards,
>> Mridul
>>
>> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin 
>> wrote:
>>
>>> Apologies for the mistake. The vote is open till 11:59pm Pacific
>>> time on Mon June 9th.
>>>
>>> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
>>> wrote:
>>>
 Please vote on releasing the following candidate as Apache Spark
 version 3.0.0.

 The vote is open until [DUE DAY] and passes if a majority +1 PMC
 votes are cast, with a minimum of 3 +1 votes.

 [ ] +1 Release this package as Apache Spark 3.0.0
 [ ] -1 Do not release this package because ...

 To learn more about Apache Spark, please see
 http://spark.apache.org/

 The tag to be voted on is v3.0.0-rc3 (commit
 3fdfce3120f307147244e5eaf46d61419a723d50):
 https://github.com/apache/spark/tree/v3.0.0-rc3

 The release files, including signatures, digests, etc. can be found
 at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

 Signatures used for Spark RCs can be found in this file:
 https://dist.apache.org/repos/dist/dev/spark/KEYS

 The staging repository for this release can be found at:

 https://repository.apache.org/content/repositories/orgapachespark-1350/

 The documentation corresponding to this release can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

 The list of bug fixes going into 3.0.0 can be found at the
 following URL:
 https://issues.apache.org/jira/projects/SPARK/versions/12339177

 This release is using the release script of the tag v3.0.0-rc3.

 FAQ

 =
 How can I help test this release?
 =

 If you are a Spark user, you can help us test this release by taking
 an existing Spark workload and running on this release candidate,
 then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and
 install
 the current RC and see if anything important breaks, in the
 Java/Scala
 you can add the staging repository to your projects resolvers and
 test
 with the RC (make sure to clean up the artifact cache before/after
 so
 you don't end up building with a out of date RC going forward).

 ===
 What should happen to JIRA tickets still targeting 3.0.0?
 ===

 The current list of open tickets targeted at 3.0.0 can be found at:
 https://issues.apache.org/jira/projects/SPARK and search for
 "Target Version/s" = 3.0.0

 Committers should look at those and triage. Extremely important bug
 fixes, documentation, and API tweaks that impact compatibility
 should
 be worked on immediately. Everything else please retarget to an
 appropriate release.

 ==
 But my bug 

Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Yin Huai
Hello everyone,

I am wondering if it makes more sense to not count Saturday and Sunday. I
doubt that any serious testing work was done during this past weekend. Can
we only count business days in the voting process?

Thanks,

Yin

On Sun, Jun 7, 2020 at 3:24 PM Denny Lee  wrote:

> +1 (non-binding)
>
> On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim 
> wrote:
>
>> I'm seeing the effort of including the correctness issue SPARK-28067 [1]
>> to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so
>> technically doesn't block the release, so while it'd be good to weigh its
>> worth (it requires some SS users to discard the state so might bring less
>> frightened requiring it in major version upgrade), it looks to be optional
>> to include SPARK-28067 to 3.0.0.
>>
>> Besides, I see all blockers look to be resolved, thanks all for the
>> amazing efforts!
>>
>> +1 (non-binding) if the decision of SPARK-28067 is "later".
>>
>> 1. https://issues.apache.org/jira/browse/SPARK-28067
>> 2. https://issues.apache.org/jira/browse/SPARK-31894
>>
>> On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
>> wrote:
>>
>>> +1
>>>
>>> Matei
>>>
>>> On Jun 7, 2020, at 6:53 AM, Maxim Gekk 
>>> wrote:
>>>
>>> +1 (non-binding)
>>>
>>> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
>>> wrote:
>>>
 +1 (non-binding)

 I don't see any ongoing PR to fix critical bugs in my area.
 Bests,
 Takeshi

 On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
 wrote:

> +1
>
> Regards,
> Mridul
>
> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin 
> wrote:
>
>> Apologies for the mistake. The vote is open till 11:59pm Pacific time
>> on Mon June 9th.
>>
>> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 3.0.0.
>>>
>>> The vote is open until [DUE DAY] and passes if a majority +1 PMC
>>> votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see
>>> http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-rc3 (commit
>>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>>
>>> The release files, including signatures, digests, etc. can be found
>>> at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>>
>>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following
>>> URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> This release is using the release script of the tag v3.0.0-rc3.
>>>
>>> FAQ
>>>
>>> =
>>> How can I help test this release?
>>> =
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate,
>>> then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks, in the
>>> Java/Scala
>>> you can add the staging repository to your projects resolvers and
>>> test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with a out of date RC going forward).
>>>
>>> ===
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for
>>> "Target Version/s" = 3.0.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==
>>> But my bug isn't fixed?
>>> ==
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not 

Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Denny Lee
+1 (non-binding)

On Sun, Jun 7, 2020 at 3:21 PM Jungtaek Lim 
wrote:

> I'm seeing the effort of including the correctness issue SPARK-28067 [1]
> to 3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression so
> technically doesn't block the release, so while it'd be good to weigh its
> worth (it requires some SS users to discard the state so might bring less
> frightened requiring it in major version upgrade), it looks to be optional
> to include SPARK-28067 to 3.0.0.
>
> Besides, I see all blockers look to be resolved, thanks all for the
> amazing efforts!
>
> +1 (non-binding) if the decision of SPARK-28067 is "later".
>
> 1. https://issues.apache.org/jira/browse/SPARK-28067
> 2. https://issues.apache.org/jira/browse/SPARK-31894
>
> On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
> wrote:
>
>> +1
>>
>> Matei
>>
>> On Jun 7, 2020, at 6:53 AM, Maxim Gekk  wrote:
>>
>> +1 (non-binding)
>>
>> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
>> wrote:
>>
>>> +1 (non-binding)
>>>
>>> I don't see any ongoing PR to fix critical bugs in my area.
>>> Bests,
>>> Takeshi
>>>
>>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
>>> wrote:
>>>
 +1

 Regards,
 Mridul

 On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:

> Apologies for the mistake. The vote is open till 11:59pm Pacific time
> on Mon June 9th.
>
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin 
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark
>> version 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC
>> votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found
>> at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>>
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following
>> URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks, in the Java/Scala
>> you can add the staging repository to your projects resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with a out of date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>>
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.0.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>>
>>
>>>
>>> --
>>> ---
>>> Takeshi Yamamuro
>>>
>>
>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Jungtaek Lim
I'm seeing the effort to include the correctness issue SPARK-28067 [1] in
3.0.0 via SPARK-31894 [2]. That doesn't seem to be a regression, so
technically it doesn't block the release. While it'd be good to weigh its
worth (the fix requires some SS users to discard their state, so requiring
that in a major version upgrade might be less frightening), it looks to be
optional to include SPARK-28067 in 3.0.0.

Besides, I see that all blockers have been resolved; thanks all for the
amazing efforts!

+1 (non-binding) if the decision of SPARK-28067 is "later".

1. https://issues.apache.org/jira/browse/SPARK-28067
2. https://issues.apache.org/jira/browse/SPARK-31894

On Mon, Jun 8, 2020 at 5:23 AM Matei Zaharia 
wrote:

> +1
>
> Matei
>
> On Jun 7, 2020, at 6:53 AM, Maxim Gekk  wrote:
>
> +1 (non-binding)
>
> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
> wrote:
>
>> +1 (non-binding)
>>
>> I don't see any ongoing PR to fix critical bugs in my area.
>> Bests,
>> Takeshi
>>
>> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
>> wrote:
>>
>>> +1
>>>
>>> Regards,
>>> Mridul
>>>
>>> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:
>>>
 Apologies for the mistake. The vote is open till 11:59pm Pacific time
 on Mon June 9th.

 On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark
> version 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
> are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the following
> URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with a out of date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
>
>
>>
>> --
>> ---
>> Takeshi Yamamuro
>>
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Matei Zaharia
+1

Matei

> On Jun 7, 2020, at 6:53 AM, Maxim Gekk  wrote:
> 
> +1 (non-binding)
> 
> On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro  > wrote:
> +1 (non-binding)
> 
> I don't see any ongoing PR to fix critical bugs in my area.
> Bests,
> Takeshi
> 
> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan  > wrote:
> +1
> 
> Regards,
> Mridul
> 
> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  > wrote:
> Apologies for the mistake. The vote is open till 11:59pm Pacific time on Mon 
> June 9th. 
> 
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  > wrote:
> Please vote on releasing the following candidate as Apache Spark version 
> 3.0.0.
> 
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are 
> cast, with a minimum of 3 +1 votes.
> 
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
> 
> To learn more about Apache Spark, please see http://spark.apache.org/ 
> 
> 
> The tag to be voted on is v3.0.0-rc3 (commit 
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3 
> 
> 
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/ 
> 
> 
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS 
> 
> 
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/ 
> 
> 
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/ 
> 
> 
> The list of bug fixes going into 3.0.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177 
> 
> 
> This release is using the release script of the tag v3.0.0-rc3.
> 
> FAQ
> 
> =
> How can I help test this release?
> =
> 
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
> 
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with a out of date RC going forward).
> 
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
> 
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK 
>  and search for "Target 
> Version/s" = 3.0.0
> 
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
> 
> ==
> But my bug isn't fixed?
> ==
> 
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
> 
> 
> 
> 
> -- 
> ---
> Takeshi Yamamuro



Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Maxim Gekk
+1 (non-binding)

On Sun, Jun 7, 2020 at 2:34 PM Takeshi Yamamuro 
wrote:

> +1 (non-binding)
>
> I don't see any ongoing PR to fix critical bugs in my area.
> Bests,
> Takeshi
>
> On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan 
> wrote:
>
>> +1
>>
>> Regards,
>> Mridul
>>
>> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:
>>
>>> Apologies for the mistake. The vote is open till 11:59pm Pacific time on
>>> Mon June 9th.
>>>
>>> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:
>>>
 Please vote on releasing the following candidate as Apache Spark
 version 3.0.0.

 The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
 are cast, with a minimum of 3 +1 votes.

 [ ] +1 Release this package as Apache Spark 3.0.0
 [ ] -1 Do not release this package because ...

 To learn more about Apache Spark, please see http://spark.apache.org/

 The tag to be voted on is v3.0.0-rc3 (commit
 3fdfce3120f307147244e5eaf46d61419a723d50):
 https://github.com/apache/spark/tree/v3.0.0-rc3

 The release files, including signatures, digests, etc. can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

 Signatures used for Spark RCs can be found in this file:
 https://dist.apache.org/repos/dist/dev/spark/KEYS

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1350/

 The documentation corresponding to this release can be found at:
 https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

 The list of bug fixes going into 3.0.0 can be found at the following
 URL:
 https://issues.apache.org/jira/projects/SPARK/versions/12339177

 This release is using the release script of the tag v3.0.0-rc3.

 FAQ

 =
 How can I help test this release?
 =

 If you are a Spark user, you can help us test this release by taking
 an existing Spark workload and running on this release candidate, then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and install
 the current RC and see if anything important breaks, in the Java/Scala
 you can add the staging repository to your projects resolvers and test
 with the RC (make sure to clean up the artifact cache before/after so
 you don't end up building with a out of date RC going forward).

 ===
 What should happen to JIRA tickets still targeting 3.0.0?
 ===

 The current list of open tickets targeted at 3.0.0 can be found at:
 https://issues.apache.org/jira/projects/SPARK and search for "Target
 Version/s" = 3.0.0

 Committers should look at those and triage. Extremely important bug
 fixes, documentation, and API tweaks that impact compatibility should
 be worked on immediately. Everything else please retarget to an
 appropriate release.

 ==
 But my bug isn't fixed?
 ==

 In order to make timely releases, we will typically not hold the
 release unless the bug in question is a regression from the previous
 release. That being said, if there is something which is a regression
 that has not been correctly targeted please ping me or a committer to
 help target the issue.



>
> --
> ---
> Takeshi Yamamuro
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Takeshi Yamamuro
+1 (non-binding)

I don't see any ongoing PR to fix critical bugs in my area.
Bests,
Takeshi

On Sun, Jun 7, 2020 at 7:24 PM Mridul Muralidharan  wrote:

> +1
>
> Regards,
> Mridul
>
> On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:
>
>> Apologies for the mistake. The vote is open till 11:59pm Pacific time on
>> Mon June 9th.
>>
>> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 3.0.0.
>>>
>>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>>> are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-rc3 (commit
>>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> This release is using the release script of the tag v3.0.0-rc3.
>>>
>>> FAQ
>>>
>>> =
>>> How can I help test this release?
>>> =
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks, in the Java/Scala
>>> you can add the staging repository to your projects resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with a out of date RC going forward).
>>>
>>> ===
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.0.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==
>>> But my bug isn't fixed?
>>> ==
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted please ping me or a committer to
>>> help target the issue.
>>>
>>>
>>>

-- 
---
Takeshi Yamamuro


Re: [vote] Apache Spark 3.0 RC3

2020-06-07 Thread Mridul Muralidharan
+1

Regards,
Mridul

On Sat, Jun 6, 2020 at 1:20 PM Reynold Xin  wrote:

> Apologies for the mistake. The vote is open till 11:59pm Pacific time on
> Mon June 9th.
>
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>> are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks, in the Java/Scala
>> you can add the staging repository to your projects resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with a out of date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>>
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.0.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>>
>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-06 Thread Sean Owen
+1 from me as well. Same feedback as the last RC.

On Sat, Jun 6, 2020 at 3:09 PM Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
> cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with a out of date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
>
>


Re: [vote] Apache Spark 3.0 RC3

2020-06-06 Thread Gengliang Wang
+1 (non-binding)

On Sat, Jun 6, 2020 at 5:20 PM Prashant Sharma  wrote:

> +1
>
> On Sun, Jun 7, 2020 at 1:50 AM Reynold Xin  wrote:
>
>> Apologies for the mistake. The vote is open till 11:59pm Pacific time on
>> Mon June 9th.
>>
>> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 3.0.0.
>>>
>>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>>> are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-rc3 (commit
>>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> This release is using the release script of the tag v3.0.0-rc3.
>>>
>>> FAQ
>>>
>>> =
>>> How can I help test this release?
>>> =
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks, in the Java/Scala
>>> you can add the staging repository to your projects resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with a out of date RC going forward).
>>>
>>> ===
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.0.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==
>>> But my bug isn't fixed?
>>> ==
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted please ping me or a committer to
>>> help target the issue.
>>>
>>>
>>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-06 Thread Prashant Sharma
+1

On Sun, Jun 7, 2020 at 1:50 AM Reynold Xin  wrote:

> Apologies for the mistake. The vote is open till 11:59pm Pacific time on
> Mon June 9th.
>
> On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 3.0.0.
>>
>> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes
>> are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.0.0-rc3 (commit
>> 3fdfce3120f307147244e5eaf46d61419a723d50):
>> https://github.com/apache/spark/tree/v3.0.0-rc3
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1350/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>>
>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>
>> This release is using the release script of the tag v3.0.0-rc3.
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks, in the Java/Scala
>> you can add the staging repository to your projects resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with a out of date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.0.0?
>> ===
>>
>> The current list of open tickets targeted at 3.0.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.0.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>>
>>


Re: [vote] Apache Spark 3.0 RC3

2020-06-06 Thread Reynold Xin
Apologies for the mistake. The vote is open till 11:59pm Pacific time on
Mon June 9th.

On Sat, Jun 6, 2020 at 1:08 PM Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 3.0.0.
>
> The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are
> cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.0.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.0.0-rc3 (commit
> 3fdfce3120f307147244e5eaf46d61419a723d50):
> https://github.com/apache/spark/tree/v3.0.0-rc3
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1350/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/
>
> The list of bug fixes going into 3.0.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>
> This release is using the release script of the tag v3.0.0-rc3.
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with a out of date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.0.0?
> ===
>
> The current list of open tickets targeted at 3.0.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.0.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
>
>




[vote] Apache Spark 3.0 RC3

2020-06-06 Thread Reynold Xin
Please vote on releasing the following candidate as Apache Spark version 3.0.0.

The vote is open until [DUE DAY] and passes if a majority +1 PMC votes are 
cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.0.0

[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.0.0-rc3 (commit 
3fdfce3120f307147244e5eaf46d61419a723d50):

https://github.com/apache/spark/tree/v3.0.0-rc3

The release files, including signatures, digests, etc. can be found at:

https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-bin/

Signatures used for Spark RCs can be found in this file:

https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:

https://repository.apache.org/content/repositories/orgapachespark-1350/

The documentation corresponding to this release can be found at:

https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc3-docs/

The list of bug fixes going into 3.0.0 can be found at the following URL:

https://issues.apache.org/jira/projects/SPARK/versions/12339177

This release is using the release script of the tag v3.0.0-rc3.

FAQ

=

How can I help test this release?

=

If you are a Spark user, you can help us test this release by taking

an existing Spark workload and running it on this release candidate, then

reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install

the current RC, and see if anything important breaks. In Java/Scala,

you can add the staging repository to your project's resolvers and test

with the RC (make sure to clean up the artifact cache before/after so

you don't end up building with an out-of-date RC going forward).
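
As a concrete, if minimal, sketch of that kind of check, something like the
following Scala app (run with the RC's spark-submit, or with the staging
artifacts on your classpath) exercises the basics. It is only a starting
point, not a substitute for your real workloads, and the object name and
query here are just examples:

    import org.apache.spark.sql.SparkSession

    // RcSmokeTest: a tiny sanity check against the release candidate.
    object RcSmokeTest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("spark-3.0.0-rc3-smoke-test")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Confirm which Spark version actually answered.
        println(s"Spark version: ${spark.version}")

        // Trivial DataFrame + SQL round trip.
        val df = Seq((1, "a"), (2, "b"), (3, "c")).toDF("id", "label")
        df.createOrReplaceTempView("t")
        val n = spark.sql("SELECT count(*) FROM t WHERE id > 1").head().getLong(0)
        assert(n == 2, s"unexpected count: $n")

        spark.stop()
      }
    }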

===

What should happen to JIRA tickets still targeting 3.0.0?

===

The current list of open tickets targeted at 3.0.0 can be found at:

https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" 
= 3.0.0

Committers should look at those and triage. Extremely important bug

fixes, documentation, and API tweaks that impact compatibility should

be worked on immediately. Please retarget everything else to an

appropriate release.

==

But my bug isn't fixed?

==

In order to make timely releases, we will typically not hold the

release unless the bug in question is a regression from the previous

release. That being said, if there is something which is a regression

that has not been correctly targeted, please ping me or a committer to

help target the issue.
