Re: JDBC connector for DataSourceV2

2019-07-12 Thread Shiv Prashant Sood
Thanks all. I can also contribute toward this effort. Regards, Shiv

Re: JDBC connector for DataSourceV2

2019-07-12 Thread Ryan Blue
Sounds great! Ping me on the review, I think this will be really valuable.

Re: JDBC connector for DataSourceV2

2019-07-12 Thread Xianyin Xin
If there’s nobody working on that, I’d like to contribute. Loop in @Gengliang Wang. Xianyin

Re: JDBC connector for DataSourceV2

2019-07-12 Thread Ryan Blue
I'm not aware of a JDBC connector effort. It would be great to have someone build one!

JDBC connector for DataSourceV2

2019-07-12 Thread Shiv Prashant Sood
Can someone please help me understand the current status of the DataSourceV2-based JDBC connector? I see connectors for various file formats in master, but can't find a JDBC implementation or a related JIRA. The DataSourceV2 APIs look to me to be in good enough shape to attempt a JDBC connector for the READ/WRITE path. Thanks
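For reference, a minimal sketch of the existing DataSource V1 JDBC read/write path that a V2 connector would re-implement. This is only an illustration of the current user-facing surface, not anything from this thread; the URL, table names, and credentials are placeholder assumptions.

```scala
// Sketch of the V1 JDBC source usage a DataSourceV2 connector would replace.
// All connection details below are placeholders, not real endpoints.
import org.apache.spark.sql.SparkSession

object JdbcV1Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jdbc-v1-sketch")
      .master("local[*]")
      .getOrCreate()

    // READ path: load a table through the built-in JDBC source.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://host:5432/db") // placeholder URL
      .option("dbtable", "public.users")               // placeholder table
      .option("user", "spark")
      .option("password", "secret")
      .load()

    // WRITE path: append the same rows to another table.
    df.write
      .format("jdbc")
      .option("url", "jdbc:postgresql://host:5432/db")
      .option("dbtable", "public.users_copy")
      .mode("append")
      .save()
  }
}
```

A V2 connector would expose this same `url`/`dbtable` option surface while implementing the newer catalog and scan-builder APIs underneath, which is what enables pushdown and transactional write semantics.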

Re: Release Apache Spark 2.4.4 before 3.0.0

2019-07-12 Thread Dongjoon Hyun
Thank you, Jacek. BTW, I added `@private` since we need the PMC's help to make an Apache Spark release. Can I get more feedback from the other PMC members? Please let me know if you have any concerns (e.g. release date or release manager). As one of the community members, I assumed the following (if

Partition pruning by IDs from another table

2019-07-12 Thread Tomas Bartalos
Hello, I have 2 parquet tables: stored - table of 10M records; data - table of 100K records.

This is fast:

val dataW = data.where("registration_ts in (20190516204l, 20190515143l, 20190510125l, 20190503151l)")
dataW.count
res44: Long = 42 // takes 3 seconds
stored.join(broadcast(dataW), Seq("registr
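One workaround when a broadcast join does not trigger partition pruning is to collect the (small) key set to the driver and filter the large table with a literal `isin`, which the Parquet reader can turn into partition filters. A hedged sketch follows: the table and column names (`stored`, `data`, `registration_ts`) come from the message above, but the file paths and literal values are assumptions.

```scala
// Sketch: turn a small table's keys into a literal IN-list so the scan of
// the large table can prune partitions. Paths below are placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object PartitionPruneSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("prune-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val stored = spark.read.parquet("/path/to/stored") // placeholder path
    val dataW  = spark.read.parquet("/path/to/data")   // placeholder path
      .where($"registration_ts".isin(20190516204L, 20190515143L))

    // Collect the small key set to the driver; a literal IN-list can be
    // pushed into the scan as a partition filter, whereas a join filter
    // generally cannot (before dynamic partition pruning).
    val keys = dataW.select($"registration_ts").as[Long].collect()
    val pruned = stored.where(col("registration_ts").isin(keys: _*))
    pruned.explain() // inspect PartitionFilters in the FileScan node
  }
}
```

This trades a driver-side collect (safe only while the key set stays small) for a scan that reads just the matching partitions of the 10M-row table.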

Re: Help: What's the biggest length of SQL that's supported in SparkSQL?

2019-07-12 Thread Reynold Xin
No, sorry, I'm not at liberty to share other people's code. On Fri, Jul 12, 2019 at 9:33 AM, Gourav Sengupta wrote: > Hi Reynold, > I am genuinely curious about queries which are more than 1 MB and am stunned by tens of MBs. Any samples to share? :)