If we could work on this quickly, it might get into future RCs.


________________________________
From: Stavros Kontopoulos <stavros.kontopou...@lightbend.com>
Sent: Monday, September 17, 2018 2:35 PM
To: Yinan Li
Cc: Xiao Li; eerla...@redhat.com; van...@cloudera.com.invalid; Sean Owen; 
Wenchen Fan; dev
Subject: Re: [VOTE] SPARK 2.4.0 (RC1)

Hi Xiao,

I just tested it and it seems OK. There are some open questions about which 
properties we should keep when restoring the config; otherwise it looks fine to me.
The reason this should go into 2.4 is that streaming on K8s is something people 
will want to try on day one (or at least it is cool to try), and since 2.4 ships 
with the K8s support heavily refactored,
it would be disappointing not to have it in... IMHO.

Best,
Stavros

On Mon, Sep 17, 2018 at 11:13 PM, Yinan Li 
<liyinan...@gmail.com> wrote:
We can merge the PR and get SPARK-23200 resolved if the whole point is to make 
streaming on k8s work first. But given that this is not a blocker for 2.4, I 
think we can take a bit more time here and get it right. With that being said, 
I would expect it to be resolved soon.

On Mon, Sep 17, 2018 at 11:47 AM Xiao Li 
<gatorsm...@gmail.com> wrote:
Hi, Erik and Stavros,

The bug fix SPARK-23200 is not a blocker for the 2.4 release, but it sounds 
important for streaming on K8s. Could the K8s-oriented committers speed up 
the reviews?

Thanks,

Xiao

On Mon, Sep 17, 2018 at 11:04 AM Erik Erlandson 
<eerla...@redhat.com> wrote:

I have no binding vote, but I second Stavros’ recommendation for SPARK-23200.

Per the parallel threads on Py2 support, I would also like to propose deprecating 
Py2 starting with this 2.4 release.

On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin <van...@cloudera.com.invalid> 
wrote:
You can log in to https://repository.apache.org and see what's wrong.
Just find that staging repo and look at the messages. In your case it
seems related to your signature.

failureMessage: No public key: Key with id: (xxxx) was not able to be
located on http://gpg-keyserver.de/. Upload your public key and try
the operation again.
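
As the message suggests, publishing the signing public key to a keyserver and
retrying should clear that check. A rough sketch, assuming gpg is installed and
<KEY_ID> is the key id from the message (the second keyserver is just an example
of a widely mirrored alternative):

  gpg --keyserver hkp://gpg-keyserver.de --send-keys <KEY_ID>
  # optionally also publish to another public keyserver
  gpg --keyserver hkp://keyserver.ubuntu.com --send-keys <KEY_ID>
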
On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan 
<cloud0...@gmail.com> wrote:
>
> I confirmed that 
> https://repository.apache.org/content/repositories/orgapachespark-1285 is not 
> accessible. I ran it via ./dev/create-release/do-release-docker.sh -d 
> /my/work/dir -s publish; I'm not sure what went wrong, and I didn't see any 
> error message during the process.
>
> Any insights are appreciated, so that I can fix it in the next RC. Thanks!
>
> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen 
> <sro...@apache.org> wrote:
>>
>> I think one build is enough, but haven't thought it through. The
>> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>> Really, whatever's the easy thing to do.
>> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
>> <cloud0...@gmail.com> wrote:
>> >
>> > Ah, I missed the Scala 2.12 build. Do you mean we should publish a Scala 
>> > 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with Hadoop 
>> > 2.7, with Hadoop 2.6, and without Hadoop. Shall we do the same thing for Scala 
>> > 2.12?
>> >
>> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen 
>> > <sro...@apache.org> wrote:
>> >>
>> >> A few preliminary notes:
>> >>
>> >> Wenchen, for some weird reason, when I import your key with gpg --import, it
>> >> asks for a passphrase. When I skip it, it's fine; gpg can still verify
>> >> the signature. No real issue there.
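>> >>
>> >> For reference, a minimal signature-check sketch (the KEYS and artifact URLs
>> >> are the ones from the vote mail below; the tarball name is illustrative):
>> >>
>> >>   curl -O https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >>   gpg --import KEYS
>> >>   gpg --verify spark-2.4.0-bin-hadoop2.7.tgz.asc spark-2.4.0-bin-hadoop2.7.tgz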
>> >>
>> >> The staging repo gives a 404:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>> >> [id=orgapachespark-1285] exists but is not exposed.
>> >>
>> >> The (revamped) licenses are OK, though there are some minor glitches
>> >> in the final release tarballs (my fault): there's an extra directory,
>> >> and the source release has both binary and source licenses. I'll fix
>> >> that. Not strictly necessary to reject the release over those.
>> >>
>> >> Last, when I check the staging repo I'll get my answer, but were you
>> >> able to build 2.12 artifacts as well?
>> >>
>> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
>> >> <cloud0...@gmail.com> wrote:
>> >> >
>> >> > Please vote on releasing the following candidate as Apache Spark 
>> >> > version 2.4.0.
>> >> >
>> >> > The vote is open until September 20 PST and passes if a majority of +1 PMC 
>> >> > votes are cast, with
>> >> > a minimum of 3 +1 votes.
>> >> >
>> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>> >> > [ ] -1 Do not release this package because ...
>> >> >
>> >> > To learn more about Apache Spark, please see http://spark.apache.org/
>> >> >
>> >> > The tag to be voted on is v2.4.0-rc1 (commit 
>> >> > 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>> >> >
>> >> > The release files, including signatures, digests, etc. can be found at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>> >> >
>> >> > Signatures used for Spark RCs can be found in this file:
>> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >> >
>> >> > The staging repository for this release can be found at:
>> >> > https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> >
>> >> > The documentation corresponding to this release can be found at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>> >> >
>> >> > The list of bug fixes going into 2.4.0 can be found at the following 
>> >> > URL:
>> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>> >> >
>> >> > FAQ
>> >> >
>> >> > =========================
>> >> > How can I help test this release?
>> >> > =========================
>> >> >
>> >> > If you are a Spark user, you can help us test this release by taking
>> >> > an existing Spark workload and running it on this release candidate, then
>> >> > reporting any regressions.
>> >> >
>> >> > If you're working in PySpark you can set up a virtual env and install
>> >> > the current RC to see if anything important breaks; in Java/Scala, you
>> >> > can add the staging repository to your project's resolvers and test
>> >> > with the RC (make sure to clean up the artifact cache before/after so
>> >> > you don't end up building with an out-of-date RC going forward).
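>> >> >
>> >> > A minimal sketch of the PySpark route, assuming a POSIX shell and Python 3
>> >> > (the tarball name is assumed; check the v2.4.0-rc1-bin/ listing above for
>> >> > the exact file):
>> >> >
>> >> >   python -m venv spark-rc1 && source spark-rc1/bin/activate
>> >> >   # assumed filename -- verify it in the -bin/ directory
>> >> >   pip install https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz
>> >> >   python -c "import pyspark; print(pyspark.__version__)"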
>> >> >
>> >> > ===========================================
>> >> > What should happen to JIRA tickets still targeting 2.4.0?
>> >> > ===========================================
>> >> >
>> >> > The current list of open tickets targeted at 2.4.0 can be found at:
>> >> > https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> >> > Version/s" = 2.4.0
>> >> >
>> >> > Committers should look at those and triage. Extremely important bug
>> >> > fixes, documentation, and API tweaks that impact compatibility should
>> >> > be worked on immediately. Everything else please retarget to an
>> >> > appropriate release.
>> >> >
>> >> > ==================
>> >> > But my bug isn't fixed?
>> >> > ==================
>> >> >
>> >> > In order to make timely releases, we will typically not hold the
>> >> > release unless the bug in question is a regression from the previous
>> >> > release. That being said, if there is something which is a regression
>> >> > that has not been correctly targeted, please ping me or a committer to
>> >> > help target the issue.



--
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: 
dev-unsubscr...@spark.apache.org




--
Stavros Kontopoulos
Senior Software Engineer
Lightbend, Inc.
p:  +30 6977967274
e: stavros.kontopou...@lightbend.com

