Re: [VOTE] Release Apache Spark 2.4.2

2019-05-01 Thread Felix Cheung
Just my 2c

If there is a known security issue, we should fix it rather than wait for a 
black hat, or worse, to discover whether it actually affects Spark.

I don’t think any of us want to see Spark in the news for this reason.

From: Sean Owen 
Sent: Tuesday, April 30, 2019 1:52:53 PM
To: Reynold Xin
Cc: Jungtaek Lim; Dongjoon Hyun; Wenchen Fan; Michael Heuer; Terry Kim; dev; 
Xiao Li
Subject: Re: [VOTE] Release Apache Spark 2.4.2

FWIW I'm OK with this even though I proposed the backport PR for discussion. It 
really is a tough call, balancing the potential but as-yet-unclear security 
benefit against the minor but real Jackson deserialization behavior change.

Because we have a pressing need for a 2.4.3 release (really almost a 2.4.2.1), I 
think it's reasonable to defer a final call on this for 2.4.x and revert for 
now. Leaving it in 2.4.3 would make it quite permanent.

A little more color on the discussion:
- I don't think https://github.com/apache/spark/pull/22071 mitigates the 
theoretical problem here; I would guess the attack vector is deserializing a 
malicious JSON file. This is unproven either way.
- The behavior change we know of is basically what you see in the revert PR: 
entries like "'foo': null" are no longer written by Jackson by default in 2.7+. 
You can make them be written, but that needs a code tweak in any app that 
inherits Spark's Jackson.
- This is not related to the Scala version.
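To make the compatibility hazard in the second bullet concrete, here is an illustrative sketch (plain Python, not Spark code; the field names are made up): a consumer that indexes into a key which was previously written as an explicit null starts failing once the key is omitted entirely.

```python
import json

# Output style before the upgrade: null-valued entries are written explicitly.
old_style = json.loads('{"name": "col1", "comment": null}')
# Output style after the upgrade: null-valued entries may be omitted.
new_style = json.loads('{"name": "col1"}')

# A reader that indexes the key directly works on the old output...
assert old_style["comment"] is None

# ...but raises KeyError on the new output and must switch to .get().
try:
    new_style["comment"]
    raised = False
except KeyError:
    raised = True
assert raised
assert new_style.get("comment") is None
```

The same key-present-vs-key-absent distinction is what any app inheriting Spark's Jackson would have to account for.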

This is for a discussion about re-including in 2.4.4:
- Does anyone know that the Jackson issues really _could_ affect Spark
- Does anyone have concrete examples of why the behavior change is a bigger 
deal, or not as big a deal, as anticipated?

On Tue, Apr 30, 2019 at 1:34 AM Reynold Xin <r...@databricks.com> wrote:

Echoing both of you ... it's a bit risky to bump dependency versions in a patch 
release, especially for a super common library. (I wish we shaded Jackson).
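On the shading aside: relocating Jackson under a Spark-private package at build time would decouple Spark's Jackson from the application's own. A hypothetical maven-shade-plugin sketch (the `org.sparkproject.jackson` target package is illustrative, not an actual Spark build setting):

```xml
<!-- Illustrative only: relocate Jackson so user applications can pin their own version. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.fasterxml.jackson</pattern>
        <!-- hypothetical target package -->
        <shadedPattern>org.sparkproject.jackson</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```

With a relocation like this in place, a dependency bump would no longer be visible to downstream applications at all.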

Maybe the CVE is a sufficient reason to bump the dependency, ignoring the 
potential behavior changes that might happen, but I'd like to see a bit more 
discussions there and have 2.4.3 focusing on fixing the Scala version issue 
first.



On Mon, Apr 29, 2019 at 11:17 PM, Jungtaek Lim <kabh...@gmail.com> wrote:
Ah! Sorry Xiao, I should have checked the fix version of the issue (it's 2.4.3/3.0.0).

Then it looks much better to revert and avoid a dependency conflict in a bugfix 
release. Jackson is known for making non-backward-compatible changes in 
non-major versions, so I agree it's something to be careful about, or to 
shade/relocate and forget about.

On Tue, Apr 30, 2019 at 3:04 PM Xiao Li <lix...@databricks.com> wrote:
Jungtaek,

Thanks for your inputs! Sorry for the confusion. Let me make it clear.

  *   All the previous 2.4.x releases (including 2.4.2) use Jackson 2.6.7.1.
  *   In the master branch, Jackson has already been upgraded to 2.9.8.
  *   Here, I am just trying to revert the Jackson upgrade in the upcoming 2.4.3 release.

Cheers,

Xiao

On Mon, Apr 29, 2019 at 10:53 PM Jungtaek Lim <kabh...@gmail.com> wrote:
Just to be clear, is upgrading Jackson to 2.9.8 coupled with the Scala version? 
And could you summarize an actual broken case due to the upgrade, if you have 
observed any? Providing a concrete case would help us weigh the impact.

Btw, my 2 cents: personally I would rather avoid upgrading dependencies in a 
bugfix release unless it resolves major bugs, so reverting it from only 
branch-2.4 sounds good to me. (I still think the Jackson upgrade is necessary 
in the master branch; without it we face lots of CVEs and will waste a huge 
amount of time identifying their impact. And other libraries will start 
coupling to Jackson 2.9.x, which conflicts with Spark's Jackson dependency.)

If there is a consensus on reverting it, we may also need to announce that 
using Spark 2.4.2 is discouraged; otherwise end users will suffer from the 
Jackson version going back and forth.

Thanks,
Jungtaek Lim (HeartSaVioR)

On Tue, Apr 30, 2019 at 2:30 PM Xiao Li <lix...@databricks.com> wrote:
Before cutting 2.4.3, I just submitted a PR 
https://github.com/apache/spark/pull/24493 for reverting the commit 
https://github.com/apache/spark/commit/6f394a20bf49f67b4d6329a1c25171c8024a2fae.

In general, we need to be very cautious about Jackson upgrades in patch 
releases, especially when the upgrade could break the existing behavior of 
external packages or data sources and generate different results. External 
packages and data sources would need to change their source code to keep the 
original behavior. The upgrade requires more discussion before releasing it, I 
think.

In the previous PR https://github.com/apache/spark/pull/22071, we turned off 
`spark.master.rest.enabled` by default and added the following claim in our 
security doc:
The Rest Submission Server and the MesosClusterDispatcher do not support 
authentication.
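For reference, the default introduced by that PR corresponds to the following setting (a config fragment; `spark.master.rest.enabled` is the property name from the PR above):

```
# conf/spark-defaults.conf
# Disabled by default since https://github.com/apache/spark/pull/22071;
# enable only on a trusted network, as the REST submission server is
# unauthenticated.
spark.master.rest.enabled  false
```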

Re: [VOTE] Release Apache Spark 2.4.2

2019-04-29 Thread Dongjoon Hyun
Hi, All and Xiao (as a next release manager).

In any case, can the release manager officially include information about the
release script that was used as part of the VOTE email?

That information will be very helpful for reproducing the Spark build (in
downstream environments).

Currently, it's not clear which release script was used, because the master
branch also changes from time to time during multiple RCs.

We can only guess at a git hash based on the RC start time.

Bests,
Dongjoon.

On Mon, Apr 29, 2019 at 7:17 PM Wenchen Fan  wrote:

> >  it could just be fixed in master rather than back-port and re-roll the
> RC
>
> I don't think the release script is part of the released product. That
> said, we can just fix the release script in branch 2.4 without creating a
> new RC. We can even create a new repo for the release script, like
> spark-website, to make it clearer.
>
> On Tue, Apr 30, 2019 at 7:22 AM Sean Owen  wrote:
>
>> I think this is a reasonable idea; I know @vanzin had suggested it was
>> simpler to use the latest in case a bug was found in the release script and
>> then it could just be fixed in master rather than back-port and re-roll the
>> RC. That said I think we did / had to already drop the ability to build <=
>> 2.3 from the master release script already.
>>
>> On Sun, Apr 28, 2019 at 9:25 PM Wenchen Fan  wrote:
>>
>>> >  ... by using the release script of Spark 2.4 branch
>>>
>>> Shall we keep it as a policy? Previously we used the release script from
>>> the master branch to do the release work for all Spark versions, now I feel
>>> it's simpler and less error-prone to let the release script only handle one
>>> branch. We don't keep many branches as active at the same time, so the
>>> maintenance overhead for the release script should be OK.
>>>




Re: [VOTE] Release Apache Spark 2.4.2

2019-04-29 Thread Wenchen Fan
>  it could just be fixed in master rather than back-port and re-roll the RC

I don't think the release script is part of the released product. That
said, we can just fix the release script in branch 2.4 without creating a
new RC. We can even create a new repo for the release script, like
spark-website, to make it clearer.

On Tue, Apr 30, 2019 at 7:22 AM Sean Owen  wrote:

> I think this is a reasonable idea; I know @vanzin had suggested it was
> simpler to use the latest in case a bug was found in the release script and
> then it could just be fixed in master rather than back-port and re-roll the
> RC. That said I think we did / had to already drop the ability to build <=
> 2.3 from the master release script already.
>
> On Sun, Apr 28, 2019 at 9:25 PM Wenchen Fan  wrote:
>
>> >  ... by using the release script of Spark 2.4 branch
>>
>> Shall we keep it as a policy? Previously we used the release script from
>> the master branch to do the release work for all Spark versions, now I feel
>> it's simpler and less error-prone to let the release script only handle one
>> branch. We don't keep many branches as active at the same time, so the
>> maintenance overhead for the release script should be OK.
>>
>>>
>>>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-29 Thread Sean Owen
I think this is a reasonable idea. I know @vanzin had suggested it was
simpler to use the latest, in case a bug was found in the release script:
it could then just be fixed in master rather than back-ported and the RC
re-rolled. That said, I think we already had to drop the ability to build
<= 2.3 from the master release script.

On Sun, Apr 28, 2019 at 9:25 PM Wenchen Fan  wrote:

> >  ... by using the release script of Spark 2.4 branch
>
> Shall we keep it as a policy? Previously we used the release script from
> the master branch to do the release work for all Spark versions, now I feel
> it's simpler and less error-prone to let the release script only handle one
> branch. We don't keep many branches as active at the same time, so the
> maintenance overhead for the release script should be OK.
>
>>
>>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Michael Heuer
As Scala 2.11 is the default for 2.4.x, we currently include _2.11 artifacts in 
our release.  Our Python library depends on the Scala artifacts.

With Homebrew, we have -submit and -shell script wrappers for spark-submit and 
spark-shell, and those will break at runtime if we're using _2.11 artifacts on 
Spark built for Scala 2.12.


> On Apr 26, 2019, at 11:12 AM, Sean Owen  wrote:
> 
> Yeah I don't think the pyspark change was intentional; I'm trying to help 
> assess what the impact is though. 
> 
> It may be a dumb question, but, what problem does the change cause? is it 
> beyond what I mentioned below? you have a project with interdependent Python 
> and Scala components?
> 
> On Fri, Apr 26, 2019 at 11:02 AM Michael Heuer  wrote:
> We certainly can't be the only project downstream of Spark that includes 
> Scala versioned artifacts in our release.  Our python library on PyPI depends 
> on pyspark, our Bioconda recipe depends on the pyspark Conda recipe, and our 
> Homebrew formula depends on the apache-spark Homebrew formula.
> 
> Using Scala 2.12 in the binary distribution for Spark 2.4.2 was unintentional 
> and never voted on.  There was a successful vote to default to Scala 2.12 in 
> Spark version 3.0.
> 
>michael
> 
> 
>> On Apr 26, 2019, at 9:52 AM, Sean Owen  wrote:
>> 
>> To be clear, what's the nature of the problem there... just Pyspark apps 
>> that are using a Scala-based library? Trying to make sure we understand what 
>> is and isn't a problem here.
> 



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Sean Owen
Yeah, I don't think the pyspark change was intentional; I'm trying to help
assess what the impact is, though.

It may be a dumb question, but what problem does the change cause? Is it
beyond what I mentioned below? Do you have a project with interdependent
Python and Scala components?

On Fri, Apr 26, 2019 at 11:02 AM Michael Heuer  wrote:

> We certainly can't be the only project downstream of Spark that includes
> Scala versioned artifacts in our release.  Our python library on PyPI
> depends on pyspark, our Bioconda recipe depends on the pyspark Conda
> recipe, and our Homebrew formula depends on the apache-spark Homebrew
> formula.
>
> Using Scala 2.12 in the binary distribution for Spark 2.4.2 was
> unintentional and never voted on.  There was a successful vote to default
> to Scala 2.12 in Spark version 3.0.
>
>michael
>
>
> On Apr 26, 2019, at 9:52 AM, Sean Owen  wrote:
>
> To be clear, what's the nature of the problem there... just Pyspark apps
> that are using a Scala-based library? Trying to make sure we understand
> what is and isn't a problem here.
>
>
>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Michael Heuer
We certainly can't be the only project downstream of Spark that includes Scala 
versioned artifacts in our release.  Our python library on PyPI depends on 
pyspark, our Bioconda recipe depends on the pyspark Conda recipe, and our 
Homebrew formula depends on the apache-spark Homebrew formula.

Using Scala 2.12 in the binary distribution for Spark 2.4.2 was unintentional 
and never voted on.  There was a successful vote to default to Scala 2.12 in 
Spark version 3.0.

   michael


> On Apr 26, 2019, at 9:52 AM, Sean Owen  wrote:
> 
> To be clear, what's the nature of the problem there... just Pyspark apps that 
> are using a Scala-based library? Trying to make sure we understand what is 
> and isn't a problem here.
> 
> On Fri, Apr 26, 2019 at 9:44 AM Michael Heuer  wrote:
> This will also cause problems in Conda builds that depend on pyspark
> 
> https://anaconda.org/conda-forge/pyspark 
> 
> 
> and Homebrew builds that depend on apache-spark, as that also uses the binary 
> distribution.
> 
> https://formulae.brew.sh/formula/apache-spark#default 
> 
> 
> +1 (non-binding) to cutting a 2.4.3 release immediately.
> 
>michael
> 
> 
>> On Apr 26, 2019, at 2:05 AM, Reynold Xin  wrote:
>> 
>> I do feel it'd be better to not switch default Scala versions in a minor 
>> release. I don't know how much downstream this impacts. Dotnet is a good 
>> data point. Anybody else hit this issue?
>> 
>> 
>> 
>> 
>> On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim  wrote:
>> Very much interested in hearing what you folks decide. We currently have a 
>> couple asking us questions at https://github.com/dotnet/spark/issues.
>> 
>> Thanks, 
>> Terry
>> 
>> --
>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Sean Owen
To be clear, what's the nature of the problem there... just Pyspark apps
that are using a Scala-based library? Trying to make sure we understand
what is and isn't a problem here.

On Fri, Apr 26, 2019 at 9:44 AM Michael Heuer  wrote:

> This will also cause problems in Conda builds that depend on pyspark
>
> https://anaconda.org/conda-forge/pyspark
>
> and Homebrew builds that depend on apache-spark, as that also uses the
> binary distribution.
>
> https://formulae.brew.sh/formula/apache-spark#default
>
> +1 (non-binding) to cutting a 2.4.3 release immediately.
>
>michael
>
>
> On Apr 26, 2019, at 2:05 AM, Reynold Xin  wrote:
>
> I do feel it'd be better to not switch default Scala versions in a minor
> release. I don't know how much downstream this impacts. Dotnet is a good
> data point. Anybody else hit this issue?
>
>
>
>
> On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim  wrote:
>
>> Very much interested in hearing what you folks decide. We currently have
>> a couple asking us questions at https://github.com/dotnet/spark/issues.
>>
>> Thanks,
>> Terry
>>
>> --
>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>
>
>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Michael Heuer
This will also cause problems in Conda builds that depend on pyspark

https://anaconda.org/conda-forge/pyspark 


and Homebrew builds that depend on apache-spark, as that also uses the binary 
distribution.

https://formulae.brew.sh/formula/apache-spark#default 


+1 (non-binding) to cutting a 2.4.3 release immediately.

   michael


> On Apr 26, 2019, at 2:05 AM, Reynold Xin  wrote:
> 
> I do feel it'd be better to not switch default Scala versions in a minor 
> release. I don't know how much downstream this impacts. Dotnet is a good data 
> point. Anybody else hit this issue?
> 
> 
> 
> 
> On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim  wrote:
> Very much interested in hearing what you folks decide. We currently have a 
> couple asking us questions at https://github.com/dotnet/spark/issues.
> 
> Thanks, 
> Terry
> 
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Sean Owen
Re: .NET, what's the particular issue it's causing?
2.4.2 still builds for 2.11. I'd imagine you'd be pulling dependencies
from Maven Central (?) or, if needed, can build for 2.11 from source.
I'm more concerned about pyspark, because it bundles 2.12 jars.

On Fri, Apr 26, 2019 at 1:36 AM Terry Kim  wrote:
>
> Very much interested in hearing what you folks decide. We currently have a
> couple asking us questions at https://github.com/dotnet/spark/issues.
>
> Thanks,
> Terry
>
>
>
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Reynold Xin
I do feel it'd be better to not switch default Scala versions in a minor 
release. I don't know how much downstream this impacts. Dotnet is a good data 
point. Anybody else hit this issue?

On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim <yumin...@gmail.com> wrote:

> 
> 
> 
> Very much interested in hearing what you folks decide. We currently have a
> couple asking us questions at https://github.com/dotnet/spark/issues.
> 
> 
> 
> Thanks,
> Terry
> 
> 
> 
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
> 
> 
> 
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 
> 
>

Re: [VOTE] Release Apache Spark 2.4.2

2019-04-26 Thread Terry Kim
Very much interested in hearing what you folks decide. We currently have a
couple asking us questions at https://github.com/dotnet/spark/issues.

Thanks,
Terry



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-22 Thread Shixiong(Ryan) Zhu
+1 I have tested it and it looks good!

Best Regards,
Ryan


On Sun, Apr 21, 2019 at 8:49 PM Wenchen Fan  wrote:

> Yea these should be mentioned in the 2.4.1 release notes.
>
> It seems we only have one ticket that is labeled as "release-notes" for
> 2.4.2: https://issues.apache.org/jira/browse/SPARK-27419 . I'll mention
> it when I write release notes.
>
> On Mon, Apr 22, 2019 at 5:46 AM Sean Owen  wrote:
>
>> One minor comment: for 2.4.1 we had a couple JIRAs marked 'release-notes':
>>
>> https://issues.apache.org/jira/browse/SPARK-27198?jql=project%20%3D%20SPARK%20and%20fixVersion%20%20in%20(2.4.1%2C%202.4.2)%20and%20labels%20%3D%20%27release-notes%27
>>
>> They should be mentioned in
>> https://spark.apache.org/releases/spark-release-2-4-1.html possibly
>> like "Changes of behavior" in
>> https://spark.apache.org/releases/spark-release-2-4-0.html
>>
>> I can retroactively update that page; is this part of the notes for
>> the release process though? I missed this one for sure as it's easy to
>> overlook with all the pages being updated per release.
>>
>> On Thu, Apr 18, 2019 at 9:51 PM Wenchen Fan  wrote:
>> >
>> > Please vote on releasing the following candidate as Apache Spark
>> version 2.4.2.
>> >
>> > The vote is open until April 23 PST and passes if a majority +1 PMC
>> votes are cast, with
>> > a minimum of 3 +1 votes.
>> >
>> > [ ] +1 Release this package as Apache Spark 2.4.2
>> > [ ] -1 Do not release this package because ...
>> >
>> > To learn more about Apache Spark, please see http://spark.apache.org/
>> >
>> > The tag to be voted on is v2.4.2-rc1 (commit
>> a44880ba74caab7a987128cb09c4bee41617770a):
>> > https://github.com/apache/spark/tree/v2.4.2-rc1
>> >
>> > The release files, including signatures, digests, etc. can be found at:
>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/
>> >
>> > Signatures used for Spark RCs can be found in this file:
>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >
>> > The staging repository for this release can be found at:
>> > https://repository.apache.org/content/repositories/orgapachespark-1322/
>> >
>> > The documentation corresponding to this release can be found at:
>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-docs/
>> >
>> > The list of bug fixes going into 2.4.1 can be found at the following
>> URL:
>> > https://issues.apache.org/jira/projects/SPARK/versions/12344996
>> >
>> > FAQ
>> >
>> > =
>> > How can I help test this release?
>> > =
>> >
>> > If you are a Spark user, you can help us test this release by taking
>> > an existing Spark workload and running on this release candidate, then
>> > reporting any regressions.
>> >
>> > If you're working in PySpark you can set up a virtual env and install
>> > the current RC and see if anything important breaks, in the Java/Scala
>> > you can add the staging repository to your projects resolvers and test
>> > with the RC (make sure to clean up the artifact cache before/after so
>> > you don't end up building with an out-of-date RC going forward).
>> >
>> > ===
>> > What should happen to JIRA tickets still targeting 2.4.2?
>> > ===
>> >
>> > The current list of open tickets targeted at 2.4.2 can be found at:
>> > https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 2.4.2
>> >
>> > Committers should look at those and triage. Extremely important bug
>> > fixes, documentation, and API tweaks that impact compatibility should
>> > be worked on immediately. Everything else please retarget to an
>> > appropriate release.
>> >
>> > ==
>> > But my bug isn't fixed?
>> > ==
>> >
>> > In order to make timely releases, we will typically not hold the
>> > release unless the bug in question is a regression from the previous
>> > release. That being said, if there is something which is a regression
>> > that has not been correctly targeted please ping me or a committer to
>> > help target the issue.
>>
>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-21 Thread Wenchen Fan
Yea these should be mentioned in the 2.4.1 release notes.

It seems we only have one ticket that is labeled as "release-notes" for
2.4.2: https://issues.apache.org/jira/browse/SPARK-27419 . I'll mention it
when I write release notes.

On Mon, Apr 22, 2019 at 5:46 AM Sean Owen  wrote:

> One minor comment: for 2.4.1 we had a couple JIRAs marked 'release-notes':
>
> https://issues.apache.org/jira/browse/SPARK-27198?jql=project%20%3D%20SPARK%20and%20fixVersion%20%20in%20(2.4.1%2C%202.4.2)%20and%20labels%20%3D%20%27release-notes%27
>
> They should be mentioned in
> https://spark.apache.org/releases/spark-release-2-4-1.html possibly
> like "Changes of behavior" in
> https://spark.apache.org/releases/spark-release-2-4-0.html
>
> I can retroactively update that page; is this part of the notes for
> the release process though? I missed this one for sure as it's easy to
> overlook with all the pages being updated per release.
>
> On Thu, Apr 18, 2019 at 9:51 PM Wenchen Fan  wrote:
> >
> > Please vote on releasing the following candidate as Apache Spark version
> 2.4.2.
> >
> > The vote is open until April 23 PST and passes if a majority +1 PMC
> votes are cast, with
> > a minimum of 3 +1 votes.
> >
> > [ ] +1 Release this package as Apache Spark 2.4.2
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > The tag to be voted on is v2.4.2-rc1 (commit
> a44880ba74caab7a987128cb09c4bee41617770a):
> > https://github.com/apache/spark/tree/v2.4.2-rc1
> >
> > The release files, including signatures, digests, etc. can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/
> >
> > Signatures used for Spark RCs can be found in this file:
> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1322/
> >
> > The documentation corresponding to this release can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-docs/
> >
> > The list of bug fixes going into 2.4.1 can be found at the following URL:
> > https://issues.apache.org/jira/projects/SPARK/versions/12344996
> >
> > FAQ
> >
> > =
> > How can I help test this release?
> > =
> >
> > If you are a Spark user, you can help us test this release by taking
> > an existing Spark workload and running on this release candidate, then
> > reporting any regressions.
> >
> > If you're working in PySpark you can set up a virtual env and install
> > the current RC and see if anything important breaks, in the Java/Scala
> > you can add the staging repository to your projects resolvers and test
> > with the RC (make sure to clean up the artifact cache before/after so
> > you don't end up building with an out-of-date RC going forward).
> >
> > ===
> > What should happen to JIRA tickets still targeting 2.4.2?
> > ===
> >
> > The current list of open tickets targeted at 2.4.2 can be found at:
> > https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 2.4.2
> >
> > Committers should look at those and triage. Extremely important bug
> > fixes, documentation, and API tweaks that impact compatibility should
> > be worked on immediately. Everything else please retarget to an
> > appropriate release.
> >
> > ==
> > But my bug isn't fixed?
> > ==
> >
> > In order to make timely releases, we will typically not hold the
> > release unless the bug in question is a regression from the previous
> > release. That being said, if there is something which is a regression
> > that has not been correctly targeted please ping me or a committer to
> > help target the issue.
>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-21 Thread Sean Owen
One minor comment: for 2.4.1 we had a couple JIRAs marked 'release-notes':
https://issues.apache.org/jira/browse/SPARK-27198?jql=project%20%3D%20SPARK%20and%20fixVersion%20%20in%20(2.4.1%2C%202.4.2)%20and%20labels%20%3D%20%27release-notes%27

They should be mentioned in
https://spark.apache.org/releases/spark-release-2-4-1.html possibly
like "Changes of behavior" in
https://spark.apache.org/releases/spark-release-2-4-0.html

I can retroactively update that page; is this part of the notes for
the release process though? I missed this one for sure as it's easy to
overlook with all the pages being updated per release.

On Thu, Apr 18, 2019 at 9:51 PM Wenchen Fan  wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 
> 2.4.2.
>
> The vote is open until April 23 PST and passes if a majority +1 PMC votes are 
> cast, with
> a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 2.4.2
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v2.4.2-rc1 (commit 
> a44880ba74caab7a987128cb09c4bee41617770a):
> https://github.com/apache/spark/tree/v2.4.2-rc1
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1322/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-docs/
>
> The list of bug fixes going into 2.4.1 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12344996
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 2.4.2?
> ===
>
> The current list of open tickets targeted at 2.4.2 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target 
> Version/s" = 2.4.2
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-21 Thread vaquar khan
+1

Regards,
Vaquar khan

On Sun, Apr 21, 2019, 11:19 PM Felix Cheung 
wrote:

> +1
>
> R tests, package tests on r-hub. Manually check commits under R, doc etc
>
>
> --
> *From:* Sean Owen 
> *Sent:* Saturday, April 20, 2019 11:27 AM
> *To:* Wenchen Fan
> *Cc:* Spark dev list
> *Subject:* Re: [VOTE] Release Apache Spark 2.4.2
>
> +1 from me too.
>
> It seems like there is support for merging the Jackson change into
> 2.4.x (and, I think, a few more minor dependency updates) but this
> doesn't have to go into 2.4.2. That said, if there is another RC for
> any reason, I think we could include it. Otherwise can wait for 2.4.3.
>
> On Thu, Apr 18, 2019 at 9:51 PM Wenchen Fan  wrote:
> >
> > Please vote on releasing the following candidate as Apache Spark version
> 2.4.2.
> >
> > The vote is open until April 23 PST and passes if a majority +1 PMC
> votes are cast, with
> > a minimum of 3 +1 votes.
> >
> > [ ] +1 Release this package as Apache Spark 2.4.2
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > The tag to be voted on is v2.4.2-rc1 (commit
> a44880ba74caab7a987128cb09c4bee41617770a):
> > https://github.com/apache/spark/tree/v2.4.2-rc1
> >
> > The release files, including signatures, digests, etc. can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/
> >
> > Signatures used for Spark RCs can be found in this file:
> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1322/
> >
> > The documentation corresponding to this release can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-docs/
> >
> > The list of bug fixes going into 2.4.1 can be found at the following URL:
> > https://issues.apache.org/jira/projects/SPARK/versions/12344996
> >
> > FAQ
> >
> > =
> > How can I help test this release?
> > =
> >
> > If you are a Spark user, you can help us test this release by taking
> > an existing Spark workload and running on this release candidate, then
> > reporting any regressions.
> >
> > If you're working in PySpark you can set up a virtual env and install
> > the current RC and see if anything important breaks, in the Java/Scala
> > you can add the staging repository to your projects resolvers and test
> > with the RC (make sure to clean up the artifact cache before/after so
> > you don't end up building with an out-of-date RC going forward).
> >
> > ===
> > What should happen to JIRA tickets still targeting 2.4.2?
> > ===
> >
> > The current list of open tickets targeted at 2.4.2 can be found at:
> > https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 2.4.2
> >
> > Committers should look at those and triage. Extremely important bug
> > fixes, documentation, and API tweaks that impact compatibility should
> > be worked on immediately. Everything else please retarget to an
> > appropriate release.
> >
> > ==
> > But my bug isn't fixed?
> > ==
> >
> > In order to make timely releases, we will typically not hold the
> > release unless the bug in question is a regression from the previous
> > release. That being said, if there is something which is a regression
> > that has not been correctly targeted please ping me or a committer to
> > help target the issue.
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-21 Thread Felix Cheung
+1

R tests, package tests on r-hub. Manually checked commits under R, docs, etc.




-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-20 Thread Sean Owen
+1 from me too.

It seems like there is support for merging the Jackson change into
2.4.x (and, I think, a few more minor dependency updates) but this
doesn't have to go into 2.4.2. That said, if there is another RC for
any reason, I think we could include it. Otherwise can wait for 2.4.3.


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.2

2019-04-19 Thread shane knapp
-1, as I'd like to be sure that the Python test infra change for Jenkins is
included (https://github.com/apache/spark/pull/24379)


-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


Re: [VOTE] Release Apache Spark 2.4.2

2019-04-19 Thread Michael Armbrust
+1 (binding), we've tested this and it LGTM.



[VOTE] Release Apache Spark 2.4.2

2019-04-18 Thread Wenchen Fan
Please vote on releasing the following candidate as Apache Spark version
2.4.2.

The vote is open until April 23 PST and passes if a majority +1 PMC votes
are cast, with
a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 2.4.2
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v2.4.2-rc1 (commit
a44880ba74caab7a987128cb09c4bee41617770a):
https://github.com/apache/spark/tree/v2.4.2-rc1

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS
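For anyone new to RC verification, a minimal sketch of checking a downloaded
artifact against the KEYS file above (the artifact name in the example is an
assumption; pass whichever file you actually pulled from the RC bin directory):

```shell
# Sketch: verify a downloaded RC artifact's signature against the KEYS file.
# The artifact name is an assumption -- use the file you downloaded from
# the v2.4.2-rc1-bin/ directory.
verify_rc() {
  artifact="$1"
  base="https://dist.apache.org/repos/dist/dev/spark"
  curl -sO "${base}/KEYS"
  curl -sO "${base}/v2.4.2-rc1-bin/${artifact}"
  curl -sO "${base}/v2.4.2-rc1-bin/${artifact}.asc"
  gpg --import KEYS                             # import release managers' keys
  gpg --verify "${artifact}.asc" "${artifact}"  # expect "Good signature"
}
# Example: verify_rc spark-2.4.2-bin-hadoop2.7.tgz
```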

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1322/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-docs/

The list of bug fixes going into 2.4.2 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12344996

FAQ

=
How can I help test this release?
=

If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env and install
the current RC to see if anything important breaks; in Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
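The steps above can be sketched roughly as follows. This is one possible
workflow, not an official one: the pyspark tarball name and the local cache
paths are assumptions based on typical layouts, and the staging repository
URL is the one given earlier in this email.

```shell
# Sketch of one way to smoke-test the RC locally (paths/names assumed).
rc_smoke_test() {
  # PySpark: isolated virtual env + the RC's pyspark tarball from the bin dir
  python3 -m venv rc-env
  . rc-env/bin/activate
  pip install "https://dist.apache.org/repos/dist/dev/spark/v2.4.2-rc1-bin/pyspark-2.4.2.tar.gz"

  # Java/Scala: clear cached Spark artifacts so the staged RC is actually used
  rm -rf ~/.ivy2/cache/org.apache.spark ~/.m2/repository/org/apache/spark
  # ...then point your build at the staging repo, e.g. in build.sbt:
  #   resolvers += "spark-rc" at
  #     "https://repository.apache.org/content/repositories/orgapachespark-1322/"
}
# Example: rc_smoke_test   (repeat the cache cleanup after testing, too)
```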

===
What should happen to JIRA tickets still targeting 2.4.2?
===

The current list of open tickets targeted at 2.4.2 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target
Version/s" = 2.4.2

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==
But my bug isn't fixed?
==

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted please ping me or a committer to
help target the issue.