Re: [apache/spark] [SPARK-29674][CORE] Update dropwizard metrics to 4.1.x for JDK 9+ (#26332)

2019-10-30 Thread Dongjoon Hyun
The Ganglia module has only 2 files. Besides dropping it, we may choose one of the following two ways to keep supporting it partially, like `kafka-0.8`, which Apache Spark supports only with Scala 2.11: 1. We can stick to `dropwizard 3.x` for JDK 8 (by default) and use `dropwizard 4.x` for `hadoop-3.2`
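
A minimal sketch of what that profile-style version split could look like at the build level, assuming an sbt-style build and illustrative version numbers (Spark's actual build uses Maven profiles, and the `spark.profile` flag here is hypothetical):

```scala
// Hypothetical sbt sketch: pick the dropwizard metrics version from a
// build-time flag, keeping 3.x as the JDK 8 default and 4.x for hadoop-3.2.
val dropwizardVersion: String =
  if (sys.props.get("spark.profile").contains("hadoop-3.2")) "4.1.1"
  else "3.2.6"

libraryDependencies += "io.dropwizard.metrics" % "metrics-core" % dropwizardVersion
```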

Fwd: [apache/spark] [SPARK-29674][CORE] Update dropwizard metrics to 4.1.x for JDK 9+ (#26332)

2019-10-30 Thread Sean Owen
I wanted to raise this to dev@. So, updating dropwizard metrics from 3.2.x to 4.x might be important for JDK 11 support. Our tests pass as-is without this update. But we don't test some elements of this metrics support, like Ganglia integration. And I have heard reports that downstream custom
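
For context, a minimal sketch of the dropwizard API surface that Spark's metrics system and downstream custom sinks build on; as far as I know the `com.codahale.metrics` package name is unchanged between 3.x and 4.x, so the bump is mostly about JDK compatibility rather than the API (names below are illustrative):

```scala
import java.util.concurrent.TimeUnit
import com.codahale.metrics.{ConsoleReporter, MetricRegistry}

object MetricsSketch {
  def main(args: Array[String]): Unit = {
    // The registry is what a Spark sink (Ganglia, console, ...) reports from.
    val registry = new MetricRegistry()
    registry.counter("records.processed").inc(42)

    // A reporter plays the same role as a metrics sink: it periodically
    // (or, here, once) publishes everything registered above.
    val reporter = ConsoleReporter.forRegistry(registry)
      .convertRatesTo(TimeUnit.SECONDS)
      .convertDurationsTo(TimeUnit.MILLISECONDS)
      .build()
    reporter.report()
  }
}
```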

Re: [VOTE] SPARK 3.0.0-preview (RC1)

2019-10-30 Thread Xingbo Jiang
I was trying to avoid changing the version names and reverting the changes on master again. But you are right that it might lead to confusion about which release script was used for RC2; I'll follow your advice and create a new RC2 tag. Thanks! Xingbo On Wed, Oct 30, 2019 at 5:06 PM Dongjoon Hyun wrote: >

Re: [VOTE] SPARK 3.0.0-preview (RC1)

2019-10-30 Thread Dongjoon Hyun
Hi, Xingbo. Currently, the RC2 tag is pointing at the RC1 tag: https://github.com/apache/spark/tree/v3.0.0-preview-rc2 Could you cut it from the HEAD of the master branch? Otherwise, nobody knows which release script you used for RC2. Bests, Dongjoon. On Wed, Oct 30, 2019 at 4:15 PM Xingbo Jiang wrote: > Hi

Re: [VOTE] SPARK 3.0.0-preview (RC1)

2019-10-30 Thread Xingbo Jiang
Hi all, this RC fails because it fails to generate a PySpark release. I'll start RC2 soon. Thanks! Xingbo On Wed, Oct 30, 2019 at 4:10 PM Xingbo Jiang wrote: > Thanks Sean. Since we need to generate the PySpark release with a different > name, I would prefer to fail RC1 and start another release

Re: [VOTE] SPARK 3.0.0-preview (RC1)

2019-10-30 Thread Sean Owen
I agree that we need a Pyspark release for this preview release. If it's a matter of producing it from the same tag, we can evaluate it within this same release candidate. Otherwise, just roll another release candidate. I was able to build it and pass all tests with JDK 8 and JDK 11 (hadoop-3.2

Re: [VOTE] SPARK 3.0.0-preview (RC1)

2019-10-30 Thread Xingbo Jiang
Thanks Sean. Since we need to generate the PySpark release with a different name, I would prefer to fail RC1 and start another release candidate. Sean Owen wrote on Wed, Oct 30, 2019 at 4:00 PM: > I agree that we need a Pyspark release for this preview release. If > it's a matter of producing it from the same tag,

Re: Packages to release in 3.0.0-preview

2019-10-30 Thread Sean Owen
I don't agree with this take. The bottleneck is pretty much not Spark -- it is all of its dependencies, and unfortunately there are a lot of them. For example, Chill (among other things) doesn't support 2.13 yet. I don't think 2.13 is that 'mainstream' yet. We are not close to Scala 2.13 support, so it

Re: Packages to release in 3.0.0-preview

2019-10-30 Thread Xingbo Jiang
Scala 2.13 support is tracked by https://issues.apache.org/jira/browse/SPARK-25075. At the current time there are still major issues remaining, so we don't include Scala 2.13 support in the 3.0.0-preview release. If the task is finished before the code freeze of Spark 3.0.0, then it's still
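
For reference, once the source-level issues are resolved, cross-publishing for a new Scala version is mostly a build-level switch; a minimal sbt-style sketch with illustrative version numbers (Spark's real build is Maven-based):

```scala
// Hypothetical sbt cross-build setup: one codebase, two Scala binary versions.
scalaVersion := "2.12.10"
crossScalaVersions := Seq("2.12.10", "2.13.1")
// `sbt +test` compiles and runs the tests once per listed version, which is
// where missing 2.13 dependencies and source incompatibilities surface.
```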

Re: Packages to release in 3.0.0-preview

2019-10-30 Thread antonkulaga
Why not try the current Scala (2.13)? Spark has always been one (sometimes two) Scala versions behind the whole Scala ecosystem, and that has always been a big pain point for everybody. I understand that in the past you could not switch because of compatibility issues, but 3.x is a major
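
To make those compatibility issues concrete, here is one common kind of 2.12 -> 2.13 source break (a generic sketch, not code from Spark):

```scala
// Compiles on Scala 2.13 only: the Java interop converters moved packages.
import scala.jdk.CollectionConverters._

object Converters {
  def toJava(xs: List[String]): java.util.List[String] = xs.asJava
  // On 2.12 the import is scala.collection.JavaConverters._ instead, so a
  // cross-built project needs version-specific source dirs or a small shim.
}
```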

Re: [DISCUSS] Deprecate Python < 3.6 in Spark 3.0

2019-10-30 Thread Shane Knapp
sure. that shouldn't be too hard, but we've historically given very little support to it. On Wed, Oct 30, 2019 at 2:31 PM Maciej Szymkiewicz wrote: > Could we upgrade to PyPy3.6 v7.2.0? > On 10/30/19 9:45 PM, Shane Knapp wrote: > > one quick thing: we currently test against python2.7, 3.6

Re: [DISCUSS] Deprecate Python < 3.6 in Spark 3.0

2019-10-30 Thread Maciej Szymkiewicz
Could we upgrade to PyPy3.6 v7.2.0? On 10/30/19 9:45 PM, Shane Knapp wrote: > one quick thing: we currently test against python2.7, 3.6 *and* > pypy2.5.1 (python2.7). > > what are our plans for pypy? > > > On Wed, Oct 30, 2019 at 12:26 PM Dongjoon Hyun wrote: >

Re: [DISCUSS] Deprecate Python < 3.6 in Spark 3.0

2019-10-30 Thread Shane Knapp
also, here's my PR for dropping 2.7 tests: https://github.com/apache/spark/pull/26330 On Wed, Oct 30, 2019 at 1:45 PM Shane Knapp wrote: > one quick thing: we currently test against python2.7, 3.6 *and* pypy2.5.1 > (python2.7). > > what are our plans for pypy? > > > On Wed, Oct 30, 2019 at

Re: [DISCUSS] Deprecate Python < 3.6 in Spark 3.0

2019-10-30 Thread Shane Knapp
one quick thing: we currently test against python2.7, 3.6 *and* pypy2.5.1 (python2.7). what are our plans for pypy? On Wed, Oct 30, 2019 at 12:26 PM Dongjoon Hyun wrote: > Thank you all. I made a PR for that. > > https://github.com/apache/spark/pull/26326 > > On Tue, Oct 29, 2019 at 5:45 AM

Re: [DISCUSS] Deprecate Python < 3.6 in Spark 3.0

2019-10-30 Thread Dongjoon Hyun
Thank you all. I made a PR for that. https://github.com/apache/spark/pull/26326 On Tue, Oct 29, 2019 at 5:45 AM Takeshi Yamamuro wrote: > +1, too. > > On Tue, Oct 29, 2019 at 4:16 PM Holden Karau wrote: > >> +1 to deprecating but not yet removing support for 3.6 >> >> On Tue, Oct 29, 2019 at