Re: Signing releases with pwendell or release manager's key?

2017-09-19 Thread Holden Karau
Another option is that I can just run the build locally; this might be a better approach since it will help make sure we have the dependencies documented for the eventual transition to dockerized builds? On Tue, Sep 19, 2017 at 9:53 AM, Holden Karau wrote: > Thanks for the

Re: A little Scala 2.12 help

2017-09-19 Thread Jacek Laskowski
Hi, Nice catch, Sean! Learnt this today. They did say you could learn a lot with Spark! :) Regards, Jacek Laskowski https://about.me/JacekLaskowski Spark Structured Streaming (Apache Spark 2.2+) https://bit.ly/spark-structured-streaming Mastering Apache Spark 2

Re: Signing releases with pwendell or release manager's key?

2017-09-19 Thread Holden Karau
Thanks for the reminder :) On Tue, Sep 19, 2017 at 9:02 AM Luciano Resende wrote: > Manually signing seems like a good compromise for now, but note that there are > two places where this needs to happen: the artifacts that go to dist.a.o > as well as the ones that are

Re: Signing releases with pwendell or release manager's key?

2017-09-19 Thread Luciano Resende
Manually signing seems like a good compromise for now, but note that there are two places where this needs to happen: the artifacts that go to dist.a.o as well as the ones that are published to Maven. On Tue, Sep 19, 2017 at 8:53 AM, Ryan Blue wrote: > +1. Thanks for

Re: Signing releases with pwendell or release manager's key?

2017-09-19 Thread Ryan Blue
+1. Thanks for coming up with a solution, everyone! I think the manually signed RC as a workaround will work well, and it will be an improvement once the rest is updated. On Mon, Sep 18, 2017 at 8:25 PM, Patrick Wendell wrote: > Sounds good - thanks Holden! > > On

Re: A little Scala 2.12 help

2017-09-19 Thread Sean Owen
I figured this out. It's another effect of a new behavior in 2.12: "Eta-expansion of zero-argument method values is deprecated". Imagine: def f(): String = "foo" and def g(fn: () => String) = ???. Calling g(f) works in 2.11 without warning. It generates a warning in 2.12, because it wants you to explicitly make
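
A minimal, self-contained sketch of what Sean describes, reusing his f/g names; g's body and the explicit alternatives shown are illustrative additions, not taken from the thread:

    object EtaExpansionExample {
      def f(): String = "foo"
      def g(fn: () => String): String = fn()

      def main(args: Array[String]): Unit = {
        // Compiles cleanly on 2.11; under 2.12 with -deprecation this line warns
        // that eta-expansion of the zero-argument method f is deprecated.
        println(g(f))

        // Explicit forms that avoid the warning:
        println(g(f _))       // explicit eta-expansion
        println(g(() => f())) // explicit function literal
      }
    }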

Re: [Spark Core] Custom Catalog. Integration between Apache Ignite and Apache Spark

2017-09-19 Thread Nikolay Izhikov
Guys, Has anyone had a chance to look at my message? On 15.09.2017 15:50, Nikolay Izhikov wrote: Hello, guys. I'm a contributor to the Apache Ignite project, which is self-described as an in-memory computing platform. It has Data Grid features: distributed, transactional key-value store [1], Distributed
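
For context, a small sketch (not from the thread) of the Ignite key-value Data Grid usage the message refers to; the cache name and values are illustrative, and it assumes the org.apache.ignite.Ignition and IgniteCache APIs:

    import org.apache.ignite.{Ignite, Ignition}

    object IgniteKeyValueSketch {
      def main(args: Array[String]): Unit = {
        // Start an Ignite node with default configuration (illustrative only).
        val ignite: Ignite = Ignition.start()

        // Get or create a distributed key-value cache and use it like a map.
        val cache = ignite.getOrCreateCache[Int, String]("example-cache")
        cache.put(1, "one")
        println(cache.get(1)) // prints "one"

        Ignition.stop(true)
      }
    }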

About 2.2.1 release

2017-09-19 Thread sujith71955
Hi Folks, Just wanted to know about the Spark 2.2.1 release date; please let me know the expected release date for this version. Thanks, Sujith -- Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/ - To