Re: [VOTE] SPARK 2.4.0 (RC2)

2018-09-27 Thread Wenchen Fan
Yes, that was proposed by Sean. This time we should publish a Scala 2.12 build, both in Maven and on the download page. On Fri, Sep 28, 2018 at 11:34 AM Saisai Shao wrote: > Only the "without-hadoop" profile has a 2.12 binary, is that expected? > > Thanks > Saisai > > Wenchen Fan wrote on Friday, September 28, 2018 at 11:08 AM: >

Re: [VOTE] SPARK 2.4.0 (RC2)

2018-09-27 Thread Saisai Shao
Only "without-hadoop" profile has 2.12 binary, is it expected? Thanks Saisai Wenchen Fan 于2018年9月28日周五 上午11:08写道: > I'm adding my own +1, since all the problems mentioned in the RC1 voting > email are all resolved. And there is no blocker issue for 2.4.0 AFAIK. > > On Fri, Sep 28, 2018 at

Re: [VOTE] SPARK 2.4.0 (RC2)

2018-09-27 Thread Wenchen Fan
I'm adding my own +1, since all the problems mentioned in the RC1 voting email have been resolved, and there is no blocker issue for 2.4.0 AFAIK. On Fri, Sep 28, 2018 at 10:59 AM Wenchen Fan wrote: > Please vote on releasing the following candidate as Apache Spark version > 2.4.0. > > The vote is

[VOTE] SPARK 2.4.0 (RC2)

2018-09-27 Thread Wenchen Fan
Please vote on releasing the following candidate as Apache Spark version 2.4.0. The vote is open until October 1 PST and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes. [ ] +1 Release this package as Apache Spark 2.4.0 [ ] -1 Do not release this package because ... To

[DISCUSS] SPIP: Native support of session window

2018-09-27 Thread Jungtaek Lim
Hi all, I would like to initiate a discussion thread on "Native support of session window". The original issue is filed as SPARK-10816 [1], but I can file another one to represent the SPIP if necessary. A WIP but working PR is available as well, so we can even test it directly or see the difference
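To make the proposal concrete, here is a hypothetical sketch of what a native session-window aggregation could look like from the DataFrame API; the session_window helper, gap duration, and column names are assumptions for illustration only, not an existing 2.x API:

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions._

    // events: a streaming DataFrame with columns userId and eventTime
    def sessionCounts(events: DataFrame): DataFrame =
      events
        .withWatermark("eventTime", "30 minutes")
        .groupBy(col("userId"),
                 session_window(col("eventTime"), "10 minutes")) // hypothetical gap-based session helper
        .count()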

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Russell Spitzer
Yeah, I had no specific reason; BaseSessionStateBuilder is probably better. I'll Jira it up. On Thu, Sep 27, 2018 at 4:47 PM Herman van Hovell wrote: > Hey Russell, > > I took a quick look at your patch. I think it is more in line with the way > the current extensions work if you call the

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Mark Hamstra
Yes, the "startWithContext" code predates SparkSessions in Thriftserver, so it doesn't really work the way you want it to with Session initiation. On Thu, Sep 27, 2018 at 11:13 AM Russell Spitzer wrote: > While that's easy for some users, we basically want to load up some > functions by default

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Russell Spitzer
I wrote a quick patch and attached it in case anyone wants to think about this in context. I can always rebase it to master. On Thu, Sep 27, 2018 at 1:39 PM Russell Spitzer wrote: > And in case anyone is wondering, the need for this may be avoided with > DataSourceV2, depending on some of the

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Russell Spitzer
And in case anyone is wondering, the need for this may be avoided with DataSourceV2, depending on some of the function pushdown discussions. We want to add functions which work only with the Cassandra DataSource (ttl and writetime); I've done the work to add in the custom expressions and
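For readers unfamiliar with the Cassandra side, ttl and writetime mirror the per-column CQL functions of the same names; the goal described above is to make a query like the following work from Spark SQL against the Cassandra data source (table and column names are illustrative, and it assumes the expressions have been registered for the session, which is exactly the open question in this thread):

    spark.sql("SELECT key, ttl(value), writetime(value) FROM cassandra_table")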

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Russell Spitzer
It would be a @dev internal API, I think. If we wanted to go extremely general with post-session init, it could be added to SparkExtensions as def postSessionInit(session: SparkSession): Unit, which would allow you to do just about anything after sessionState was done initializing. Or if we
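A minimal sketch of how the proposed post-session-init hook might be used, assuming SparkSessionExtensions grew such a hook following its existing injectX callback pattern (it does not exist in Spark 2.4; the extension class name, hook name, and UDF below are all hypothetical):

    import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}

    class CassandraFunctionExtensions extends (SparkSessionExtensions => Unit) {
      override def apply(ext: SparkSessionExtensions): Unit = {
        // Hypothetical hook from the proposal above: invoked once sessionState
        // has finished initializing, for every new session.
        ext.postSessionInit { session: SparkSession =>
          session.udf.register("writetime_stub", (s: String) => s.length) // placeholder logic
        }
      }
    }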

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Reynold Xin
Thoughts on what the API would look like? On Thu, Sep 27, 2018 at 11:13 AM Russell Spitzer wrote: > While that's easy for some users, we basically want to load up some > functions by default into all session catalogues regardless of who made > them. We do this with certain rules and strategies

Re: Adding Extension to Load Custom functions into Thriftserver/SqlShell

2018-09-27 Thread Russell Spitzer
While that's easy for some users, we basically want to load up some functions by default into all session catalogues regardless of who made them. We do this with certain rules and strategies using the SparkExtensions, so all apps that run through our submit scripts get a config parameter added and
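For reference, the wiring described above typically hinges on the spark.sql.extensions setting (a real conf since Spark 2.2) being added by the submit scripts; the extension class name below is hypothetical:

    import org.apache.spark.sql.SparkSession

    // Every app launched through the submit scripts effectively gets this conf added,
    // so the extension class is applied when the SparkSession is built.
    val spark = SparkSession.builder()
      .config("spark.sql.extensions", "com.example.CassandraFunctionExtensions") // hypothetical class
      .getOrCreate()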

SPIP: Support Kafka delegation token in Structured Streaming

2018-09-27 Thread Gabor Somogyi
Hi all, I am writing this e-mail in order to discuss the Kafka delegation token support feature, which is reported in SPARK-25501. I've prepared a SPIP

Re: [Discuss] Datasource v2 support for Kerberos

2018-09-27 Thread Steve Loughran
> On 25 Sep 2018, at 07:52, tigerquoll wrote: > > To give some Kerberos-specific examples, the spark-submit args: > --conf spark.yarn.keytab=path_to_keytab --conf > spark.yarn.principal=princi...@realm.com > > are currently not passed through to the data sources. > > > I'm not sure why