[Spark-Core] Port opened by the SparkDriver is vulnerable to flooding attacks

2018-02-19 Thread sandeep_katta
SparkSubmit opens a port to communicate with the Application Master and 
executors. 

This port does not close idle connections, so it is vulnerable to a DoS 
attack: I ran telnet against the IP and port, and the connection was never 
closed. 

To fix this I tried handling the idle event in *userEventTriggered* of the 
*TransportChannelHandler.java* class, but to my surprise the Application 
Master connection is also idle when no job is submitted, so closing idle 
connections would terminate the Application Master connection as well. 

A long-running job would also be affected by this approach.

Has anyone come across this type of problem, and is there any other way 
to fix this issue? 
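The fix under discussion -- dropping connections that stay idle too long -- can be illustrated outside Spark. Below is a minimal asyncio echo server (a sketch only: it is not Spark's Netty-based transport, and the one-second timeout and all names are made up) that closes any connection that sends nothing within the idle window, roughly what an IdleStateHandler-driven userEventTriggered handler would do:

```python
import asyncio

IDLE_TIMEOUT = 1.0  # seconds; a real transport would make this configurable


async def handle(reader, writer):
    """Echo bytes back, but drop the peer if it stays idle too long."""
    try:
        while True:
            # If nothing arrives within IDLE_TIMEOUT, wait_for raises
            # TimeoutError and we fall through to closing the connection.
            data = await asyncio.wait_for(reader.read(1024), timeout=IDLE_TIMEOUT)
            if not data:          # peer closed its end
                break
            writer.write(data)    # stand-in for real RPC handling
            await writer.drain()
    except asyncio.TimeoutError:
        pass                      # idle too long: terminate the connection
    finally:
        writer.close()
        await writer.wait_closed()


async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    # Simulate the telnet probe from the report: connect, send nothing,
    # and wait. read() returns b"" once the server drops the connection.
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    leftover = await reader.read()
    writer.close()
    server.close()
    await server.wait_closed()
    return leftover


if __name__ == "__main__":
    assert asyncio.run(main()) == b""
```

The complication reported above remains: the Application Master connection is legitimately idle when no job is running, so a blanket timeout like this would kill it too. A real fix would need application-level heartbeats or an exemption for registered control connections.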



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Takuya UESHIN
+1


On Tue, Feb 20, 2018 at 2:14 PM, Xingbo Jiang  wrote:

> +1
>
>
> Wenchen Fan wrote on Tue, Feb 20, 2018 at 1:09 PM:
>
>> +1
>>
>> On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin 
>> wrote:
>>
>>> +1
>>>
>>> On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal ,
>>> wrote:
>>>
this file shouldn't be included? https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml

>>>
>>> I've now deleted this file
>>>
>>> *From:* Sameer Agarwal 
 *Sent:* Saturday, February 17, 2018 1:43:39 PM
 *To:* Sameer Agarwal
 *Cc:* dev
 *Subject:* Re: [VOTE] Spark 2.3.0 (RC4)

 I'll start with a +1 once again.

 All blockers reported against RC3 have been resolved and the builds are
 healthy.

 On 17 February 2018 at 13:41, Sameer Agarwal 
 wrote:

> Please vote on releasing the following candidate as Apache Spark
> version 2.3.0. The vote is open until Thursday, February 22, 2018 at
> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>
>
> [ ] +1 Release this package as Apache Spark 2.3.0
>
> [ ] -1 Do not release this package because ...
>
>
> To learn more about Apache Spark, please see https://spark.apache.org/
>
> The tag to be voted on is v2.3.0-rc4:
> https://github.com/apache/spark/tree/v2.3.0-rc4
> (44095cb65500739695b0324c177c19dfa1471472)
>
> List of JIRA tickets resolved in this release can be found here:
> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1265/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/index.html
>
>
> FAQ
>
> ===
> What are the unresolved issues targeted for 2.3.0?
> ===
>
> Please see https://s.apache.org/oXKi. At the time of writing, there
> are no known release blockers.
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload, running it on this release candidate, and
> reporting any regressions.
>
> If you're working in PySpark, you can set up a virtual env, install
> the current RC, and see if anything important breaks; in Java/Scala, you
> can add the staging repository to your project's resolvers and test with
> the RC (make sure to clean up the artifact cache before/after so you don't
> end up building with an out-of-date RC going forward).
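The staging-repository step above can be sketched as a build-file fragment. This is a hypothetical example using the staging URL from this email; the repository id, artifact coordinates, and Scala suffix are assumptions to adjust for your project:

```xml
<!-- pom.xml fragment (sketch): resolve the RC artifacts from the
     staging repository while testing; remove it once 2.3.0 is released. -->
<repositories>
  <repository>
    <id>spark-2.3.0-rc4-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1265/</url>
  </repository>
</repositories>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
</dependencies>
```

Clearing the local artifact cache (e.g. the Spark entries under ~/.m2/repository or the ivy cache) before and after, as the email suggests, avoids building against a stale RC.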
>
> ===
> What should happen to JIRA tickets still targeting 2.3.0?
> ===
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should be
> worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0
> as appropriate.
>
> ===
> Why is my bug not fixed?
> ===
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from 2.2.0. That being
> said, if there is something which is a regression from 2.2.0 and has not
> been correctly targeted please ping me or a committer to help target the
> issue (you can see the open issues listed as impacting Spark 2.3.0 at
> https://s.apache.org/WmoI).
>



 --
 Sameer Agarwal
 Computer Science | UC Berkeley
 http://cs.berkeley.edu/~sameerag

>>>
>>>
>>>
>>> --
>>> Sameer Agarwal
>>> Computer Science | UC Berkeley
>>> http://cs.berkeley.edu/~sameerag
>>>
>>>
>>


-- 
Takuya UESHIN
Tokyo, Japan

http://twitter.com/ueshin


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Xingbo Jiang
+1


Wenchen Fan wrote on Tue, Feb 20, 2018 at 1:09 PM:

> +1
>
> [remainder of quoted thread trimmed; the full vote email appears earlier in this digest]


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Wenchen Fan
+1

On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin  wrote:

> +1
>
> [remainder of quoted thread trimmed; the full vote email appears earlier in this digest]


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Reynold Xin
+1

On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal , wrote:
> [quoted thread trimmed; the full vote email appears earlier in this digest]


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Sameer Agarwal
>
> this file shouldn't be included? https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>

I've now deleted this file

> [quoted vote email trimmed; it appears in full earlier in this digest]



-- 
Sameer Agarwal
Computer Science | UC Berkeley
http://cs.berkeley.edu/~sameerag


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Dongjoon Hyun
In addition to Hyukjin's `github.io` result, `jekyll` also forwards the
search result links correctly.

SKIP_SCALADOC=1 SKIP_PYTHONDOC=1 SKIP_RDOC=1 jekyll serve --watch

Then connect to `http://127.0.0.1:4000`.

The behavior will be the same on the Apache Spark website.

Bests,
Dongjoon.



On Mon, Feb 19, 2018 at 8:37 PM, vaquar khan  wrote:

> +1
>
> Regards,
> Vaquar khan
>
> On Mon, Feb 19, 2018 at 10:29 PM, Xiao Li  wrote:
>
>> +1.
>>
>> So far, no function/performance regression in Spark SQL, Core and
>> PySpark.
>>
>> Thanks!
>>
>> Xiao
>>
>> 2018-02-19 19:47 GMT-08:00 Hyukjin Kwon :
>>
>>> Ah, I see. For 1), I overlooked Felix's input here. I couldn't foresee
>>> this when I added this documentation because it worked in my simple demo:
>>>
>>> https://spark-test.github.io/sparksqldoc/search.html?q=approx
>>> https://spark-test.github.io/sparksqldoc/#approx_percentile
>>>
>>> Will try to investigate this shortly too.
>>>
>>>
>>>
>>> 2018-02-20 11:45 GMT+09:00 Shivaram Venkataraman <
>>> shiva...@eecs.berkeley.edu>:
>>>
 For (1) I think it has something to do with
 https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/
 not automatically going to
 https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html
 -- so the link we generate for approx_percentile,
 https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/#approx_percentile,
 doesn't work as Felix said, but
 https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html#approx_percentile
 works.

 I'm not sure how this will behave on the main site. FWIW
 http://spark.apache.org/docs/latest/api/python/ does redirect to
 http://spark.apache.org/docs/latest/api/python/index.html

 Thanks
 Shivaram
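The redirect behavior described here is easy to reproduce: Python's stdlib static-file handler maps a directory URL to its index.html, which is what a local static server (or github.io) does and what the raw dist.apache.org repo browser does not. A small self-contained sketch, where the doc tree is a made-up stand-in for the generated SQL function docs:

```python
import functools
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Build a throwaway doc tree: api/sql/index.html (hypothetical stand-in
# for the generated SQL function documentation).
root = pathlib.Path(tempfile.mkdtemp())
(root / "api" / "sql").mkdir(parents=True)
(root / "api" / "sql" / "index.html").write_text("<h1>SQL functions</h1>")

# SimpleHTTPRequestHandler resolves a directory URL like /api/sql/ to its
# index.html -- the behavior the dist.apache.org repo browser lacks.
handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                            directory=str(root))
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/api/sql/"
body = urllib.request.urlopen(url).read().decode()
server.shutdown()
print("SQL functions" in body)  # prints: True
```

A fragment link like `/api/sql/#approx_percentile` works against such a server for the same reason: the directory part resolves to index.html, and the browser then applies the fragment.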

 On Mon, Feb 19, 2018 at 6:31 PM, Felix Cheung <
 felixcheun...@hotmail.com> wrote:

> Ah, sorry, I realize my wording was unclear (not enough zzz or coffee).
>
> So to clarify:
> 1) When searching for a word in the SQL function doc, the search result
> page itself comes back correctly; however, none of the links in the results
> open the actual doc page. Taking the search I included as an example, if
> you click on approx_percentile, for instance, it opens the web directory
> listing instead.
>
> 2) The dist location we are voting on contains a .iml file, which is
> normally not included in a release or release RC, and it is unsigned and
> without a hash (therefore it seems like it should not be in the release).
>
> Thanks!
> Thanks!
>
> _
> From: Shivaram Venkataraman 
> Sent: Tuesday, February 20, 2018 2:24 AM
> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
> To: Felix Cheung 
> Cc: Sean Owen , dev 
>
>
>
> FWIW The search result link works for me
>
> Shivaram
>
> On Mon, Feb 19, 2018 at 6:21 PM, Felix Cheung <
> felixcheun...@hotmail.com> wrote:
>
>> These are two separate things:
>>
>> Do the search result links work for you?
>>
>> The second is that the dist location we are voting on has a .iml file.
>>
>> _
>> From: Sean Owen 
>> Sent: Tuesday, February 20, 2018 2:19 AM
>> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
>> To: Felix Cheung 
>> Cc: dev 
>>
>>
>>
>> Maybe I misunderstand, but I don't see any .iml file in the 4 results
>> on that page? It looks reasonable.
>>
>> On Mon, Feb 19, 2018 at 8:02 PM Felix Cheung <
>> felixcheun...@hotmail.com> wrote:
>>
>>> Any idea with sql func docs search result returning broken links as
>>> below?
>>>
>>> *From:* Felix Cheung 
>>> *Sent:* Sunday, February 18, 2018 10:05:22 AM
>>> *To:* Sameer Agarwal; Sameer Agarwal
>>>
>>> *Cc:* dev
>>> *Subject:* Re: [VOTE] Spark 2.3.0 (RC4)
>>> Quick questions:
>>>
>>> Is the search link for SQL functions quite right?
>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app
>>>
>>> This file shouldn't be included?
>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>>>
>>>
>>
>>
>
>
>

>>>
>>
>
>
> --
> Regards,
> Vaquar Khan
> +1 -224-436-0783
> Greater Chicago
>


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread vaquar khan
+1

Regards,
Vaquar khan

On Mon, Feb 19, 2018 at 10:29 PM, Xiao Li  wrote:

> +1.
>
> So far, no function/performance regression in Spark SQL, Core and PySpark.
>
> Thanks!
>
> Xiao
>
> [quoted thread trimmed; the same exchange appears in full earlier in this digest]


-- 
Regards,
Vaquar Khan
+1 -224-436-0783
Greater Chicago


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Xiao Li
+1.

So far, no function/performance regression in Spark SQL, Core and PySpark.

Thanks!

Xiao

2018-02-19 19:47 GMT-08:00 Hyukjin Kwon :

> [quoted thread trimmed; the same exchange appears in full earlier in this digest]


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Hyukjin Kwon
Ah, I see. For 1), I overlooked Felix's input here. I couldn't foresee this
when I added this documentation because it worked in my simple demo:

https://spark-test.github.io/sparksqldoc/search.html?q=approx
https://spark-test.github.io/sparksqldoc/#approx_percentile

Will try to investigate this shortly too.





Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Shivaram Venkataraman
For (1) I think it has something to do with
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/
not automatically going to
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html
-- So if you see the link to approx_percentile the link we generate is
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/#approx_percentile
-- This doesn't work as Felix said but
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html#approx_percentile
works

I'm not sure how this will behave on the main site. FWIW
http://spark.apache.org/docs/latest/api/python/ does redirect to
http://spark.apache.org/docs/latest/api/python/index.html

Thanks
Shivaram
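[Editor's note: a client-side workaround for the links the search page generates is to make the `index.html` explicit in each anchor, so the fragment no longer depends on the server's directory-index redirect. A minimal hypothetical sketch — this is not part of the actual Spark doc tooling:]

```python
from urllib.parse import urlsplit, urlunsplit

def explicit_index(url, index="index.html"):
    """Rewrite a directory-style doc link so its fragment targets index.html
    directly, instead of relying on a server-side directory-index redirect."""
    parts = urlsplit(url)
    path = parts.path + index if parts.path.endswith("/") else parts.path
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))

print(explicit_index("https://example.org/api/sql/#approx_percentile"))
# -> https://example.org/api/sql/index.html#approx_percentile
```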



Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Felix Cheung
Ah sorry I realize my wordings were unclear (not enough zzz or coffee)

So to clarify,
1) when searching for a word in the SQL function doc, it does return the 
search result page correctly; however, none of the links in the results open the 
actual doc page. To take the search I included as an example: if you click 
on approx_percentile, for instance, it opens the web directory listing instead.

2) The second is that the dist location we are voting on has a .iml file, which is 
normally not included in a release or release RC, and it is unsigned and without a 
hash (therefore it seems like it should not be in the release)
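[Editor's note: the check in point 2 can be sketched as a small hypothetical script — file names and directory layout below are made up for illustration, and real RC verification uses gpg and the published hash files — that flags dist files lacking a detached `.asc` signature:]

```python
import os

def unverifiable(dist_dir):
    """Flag files in a dist directory that have no detached .asc signature
    (hypothetical checker, not part of the actual Spark release tooling)."""
    names = set(os.listdir(dist_dir))
    return sorted(
        n for n in names
        if not n.endswith((".asc", ".sha512", ".md5"))  # skip sig/hash files themselves
        and n + ".asc" not in names
    )

# Dummy dist tree: a signed tarball plus the stray .iml file noticed above.
os.makedirs("dist-demo", exist_ok=True)
for name in ("spark-2.3.0-bin.tgz", "spark-2.3.0-bin.tgz.asc",
             "spark-parent_2.11.iml"):
    open(os.path.join("dist-demo", name), "w").close()

print(unverifiable("dist-demo"))  # -> ['spark-parent_2.11.iml']
```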

Thanks!







Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Shivaram Venkataraman
FWIW The search result link works for me

Shivaram



Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Felix Cheung
These are two separate things:

Do the search result links work for you?

The second is that the dist location we are voting on has a .iml file.




Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Sean Owen
Maybe I misunderstand, but I don't see any .iml file in the 4 results on
that page? It looks reasonable.



Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Felix Cheung
Any idea why the sql func docs search results return broken links, as below?


From: Felix Cheung 
Sent: Sunday, February 18, 2018 10:05:22 AM
To: Sameer Agarwal; Sameer Agarwal
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC4)

Quick questions:

is the search link for sql functions quite right? 
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app

this file shouldn't be included? 
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml



From: Sameer Agarwal 
Sent: Saturday, February 17, 2018 1:43:39 PM
To: Sameer Agarwal
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC4)

I'll start with a +1 once again.

All blockers reported against RC3 have been resolved and the builds are healthy.

On 17 February 2018 at 13:41, Sameer Agarwal wrote:
Please vote on releasing the following candidate as Apache Spark version 2.3.0. 
The vote is open until Thursday February 22, 2018 at 8:00:00 am UTC and passes 
if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc4: 
https://github.com/apache/spark/tree/v2.3.0-rc4 
(44095cb65500739695b0324c177c19dfa1471472)

List of JIRA tickets resolved in this release can be found here: 
https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1265/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/index.html


FAQ

===
What are the unresolved issues targeted for 2.3.0?
===

Please see https://s.apache.org/oXKi. At the time of writing, there are 
currently no known release blockers.

=
How can I help test this release?
=

If you are a Spark user, you can help us test this release by taking an 
existing Spark workload and running on this release candidate, then reporting 
any regressions.

If you're working in PySpark you can set up a virtual env, install the 
current RC, and see if anything important breaks; in Java/Scala you can add 
the staging repository to your project's resolvers and test with the RC (make 
sure to clean up the artifact cache before/after so you don't end up building 
with an out-of-date RC going forward).
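[Editor's note: the virtual-env route above can be sketched offline as follows; the env name is arbitrary, and the actual `pip install` of the RC's pyspark tarball (which lives under the v2.3.0-rc4-bin dist directory) is left as a comment so the sketch needs no network:]

```python
import subprocess
import venv

# Create an isolated environment for RC testing (pip bootstrap skipped so
# this sketch runs offline; POSIX bin/ layout assumed).
venv.create("rc-test-env", with_pip=False)
py = "rc-test-env/bin/python"

# Next step in a real test, using the pyspark sdist from the RC dist location:
# subprocess.check_call([py, "-m", "pip", "install", "<pyspark sdist from v2.3.0-rc4-bin>"])

# Sanity check: the env's interpreter reports its own prefix.
out = subprocess.check_output([py, "-c", "import sys; print(sys.prefix)"],
                              text=True)
print(out.strip())  # absolute path ending in rc-test-env
```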

===
What should happen to JIRA tickets still targeting 2.3.0?
===

Committers should look at those and triage. Extremely important bug fixes, 
documentation, and API tweaks that impact compatibility should be worked on 
immediately. Everything else please retarget to 2.3.1 or 2.4.0 as appropriate.

===
Why is my bug not fixed?
===

In order to make timely releases, we will typically not hold the release unless 
the bug in question is a regression from 2.2.0. That being said, if there is 
something which is a regression from 2.2.0 and has not been correctly targeted 
please ping me or a committer to help target the issue (you can see the open 
issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).



--
Sameer Agarwal
Computer Science | UC Berkeley
http://cs.berkeley.edu/~sameerag


Re: [VOTE] Spark 2.3.0 (RC4)

2018-02-19 Thread Dongjoon Hyun
+1.

I tested RC4 on CentOS 7.4 / OpenJDK 1.8.0_161 with `-Pyarn -Phadoop-2.7
-Pkinesis-asl -Phive -Phive-thriftserver -Psparkr`.

Bests,
Dongjoon.



On Sun, Feb 18, 2018 at 3:22 PM, Denny Lee  wrote:

> +1 (non-binding)
>
> Built and tested on macOS and Ubuntu.
>
>
> On Sun, Feb 18, 2018 at 3:19 PM Ricardo Almeida <
> ricardo.alme...@actnowib.com> wrote:
>
>> +1 (non-binding)
>>
>> Built and tested on macOS 10.12.6 Java 8 (build 1.8.0_111). No
>> regressions detected so far.
>>
>>
>> On 18 February 2018 at 16:12, Sean Owen  wrote:
>>
>>> +1 from me as last time, same outcome.
>>>
>>> I saw one test fail, but passed on a second run, so just seems flaky.
>>>
>>> - subscribing topic by name from latest offsets (failOnDataLoss: true)
>>> *** FAILED ***
>>>   Error while stopping stream:
>>>   query.exception() is not empty after clean stop: org.apache.spark.sql.streaming.StreamingQueryException: Writing job failed.
>>>   === Streaming Query ===
>>>   Identifier: [id = cdd647ec-d7f0-437b-9950-ce9d79d691d1, runId =
>>> 3a7cf7ec-670a-48b6-8185-8b6cd7e27f96]
>>>   Current Committed Offsets: {KafkaSource[Subscribe[topic-4]]:
>>> {"topic-4":{"2":1,"4":1,"1":0,"3":0,"0":2}}}
>>>   Current Available Offsets: {}
>>>
>>>   Current State: TERMINATED
>>>   Thread State: RUNNABLE
>>>