Re: Publishing official docker images for KubernetesSchedulerBackend

2017-12-14 Thread Erik Erlandson
Currently the containers are based off alpine, which pulls in BSD2 and MIT licensing: https://github.com/apache/spark/pull/19717#discussion_r154502824. To the best of my understanding, neither of those poses a problem. If we based the image off of centos, I'd also expect the licensing of any image

Re: spark-tests.appspot status?

2017-12-14 Thread Josh Rosen
Yep, it turns out that there was a problem with the Jenkins job. I've restarted it and it should be backfilling now (this might take a while).

Re: Publishing official docker images for KubernetesSchedulerBackend

2017-12-14 Thread Mark Hamstra
What licensing issues come into play?

Re: Publishing official docker images for KubernetesSchedulerBackend

2017-12-14 Thread Erik Erlandson
We've been discussing the topic of container images a bit more. The kubernetes back-end operates by executing some specific CMD and ENTRYPOINT logic, which is different from mesos, and which is probably not practical to unify at this level. However, these CMD and ENTRYPOINT configurations are
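To make the point about back-end-specific CMD and ENTRYPOINT logic concrete, here is a minimal sketch of the kind of role-dispatching entrypoint script a Kubernetes-oriented image might bake in, where the scheduler back-end selects the container's behavior by passing a role as CMD. The script, role names, and messages are purely illustrative assumptions, not Spark's actual image contents:

```shell
#!/bin/sh
# Hypothetical sketch: a single ENTRYPOINT script; the scheduler back-end
# chooses driver vs. executor behavior via the CMD argument it passes.
run_role() {
  case "$1" in
    driver)   echo "launching driver JVM" ;;
    executor) echo "launching executor JVM" ;;
    *)        echo "unknown role: $1" >&2; return 1 ;;
  esac
}

# Default to "driver" if no role argument is supplied.
run_role "${1:-driver}"
```

In a Dockerfile this would pair with something like `ENTRYPOINT ["/opt/entrypoint.sh"]`, with the back-end supplying the role argument; a mesos image would not need this dispatch, which is why unifying the two at the image level is awkward.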

Re: Timeline for Spark 2.3

2017-12-14 Thread Erik Erlandson
I wanted to check in on the state of the 2.3 freeze schedule. The original proposal was "late Dec", which is a bit open to interpretation. We are working to get some refactoring done on the integration testing for the Kubernetes back-end in preparation for testing upcoming release candidates,

Re: spark-tests.appspot status?

2017-12-14 Thread Xin Lu
Most likely the job that uploads this stuff at Databricks is broken.

spark-tests.appspot status?

2017-12-14 Thread Imran Rashid
Hi, I was trying to look at some flaky tests and old jiras, and noticed that spark-tests.appspot.com is still live but hasn't updated with any builds from the last 2 months. I was curious what the status is -- intentionally deprecated? Just needs a restart? More dev work required? It's pretty

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

2017-12-14 Thread Holden Karau
I think final publishing being PMC-only (with maybe the person who set it up as an exception) is generally the case for each of the languages (e.g. Maven requires a PMC member to do the final push, the dist download requires a final svn mv by a PMC member, etc.).

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

2017-12-14 Thread Felix Cheung
;) The credential for the user to publish to PyPI is PMC-only. +Holden: we had discussed this in the other thread I sent to private@ last week.

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

2017-12-14 Thread Sean Owen
On the various access questions here -- what do you need in order to have that access? We definitely need to give you all necessary access if you're the release manager!

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

2017-12-14 Thread Felix Cheung
And I don’t have access to publish python. On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <shiva...@eecs.berkeley.edu> wrote: > The R artifacts have some issue that Felix and I are debugging. Let's not block the announcement for that. > Thanks, Shivaram