Re: Release Apache Spark 2.4.4 before 3.0.0

2019-07-11 Thread Jacek Laskowski
Hi, Thanks Dongjoon Hyun for stepping up as a release manager! Much appreciated. If there's a volunteer to cut a release, I'm always happy to support it. In addition, the more frequent the releases, the better for end users, since they get the choice to upgrade and pick up all the latest fixes, or to wait. It's their

Re: Release Apache Spark 2.4.4 before 3.0.0

2019-07-11 Thread Dongjoon Hyun
Additionally, one more correctness patch landed yesterday. - SPARK-28015 Check stringToDate() consumes entire input for the yyyy and yyyy-[m]m formats Bests, Dongjoon. On Tue, Jul 9, 2019 at 10:11 AM Dongjoon Hyun wrote: > Thank you for the reply, Sean. Sure. 2.4.x should be a LTS
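For readers following along, a minimal sketch of the kind of cast that SPARK-28015 tightens; the literals and the NULL-vs-date outcome below are assumptions based on the JIRA title ("consumes entire input"), not test cases taken from the patch itself.

```scala
// Hypothetical illustration of the stringToDate() behaviour SPARK-28015 addresses:
// strings with trailing characters should be rejected (NULL) rather than
// silently parsed as a prefix. Assumed behaviour per the JIRA title.
import org.apache.spark.sql.SparkSession

object DateCastSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-28015-sketch")
      .master("local[*]")
      .getOrCreate()

    // Well-formed yyyy and yyyy-[m]m inputs parse to the first day of the period.
    spark.sql("SELECT CAST('2019' AS DATE) AS y, CAST('2019-7' AS DATE) AS ym").show()

    // Input with trailing garbage: with the correctness fix this is expected
    // to yield NULL instead of a partially parsed date (assumption).
    spark.sql("SELECT CAST('2019-7-11 trailing' AS DATE) AS bad").show()

    spark.stop()
  }
}
```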

unsubscribe

2019-07-11 Thread Bill Bejeck
unsubscribe

Re: Help: What's the biggest length of SQL that's supported in SparkSQL?

2019-07-11 Thread Reynold Xin
There is no explicit limit, but a JVM string cannot be bigger than 2G. At some point Spark will also run out of memory building too big a query plan tree, or become incredibly slow due to query planning complexity. I've seen queries that are tens of MBs in size. On Thu, Jul 11, 2019 at 5:01 AM, 李书明
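A minimal sketch of what a programmatically generated large query might look like; the UNION ALL shape and the branch count are arbitrary assumptions for illustration, not anything recommended in the thread.

```scala
// Sketch: build one large SQL string and hand it to spark.sql().
// The only hard ceiling is the JVM String limit (Integer.MAX_VALUE chars);
// in practice, planning time and plan-tree memory become the limit far earlier.
import org.apache.spark.sql.SparkSession

object BigSqlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("big-sql-sketch")
      .master("local[*]")
      .getOrCreate()

    // 5,000 branches is an arbitrary example size; raising it illustrates how
    // analysis and planning slow down long before the string itself is a problem.
    val bigQuery = (1 to 5000)
      .map(i => s"SELECT $i AS id")
      .mkString(" UNION ALL ")

    println(s"Query length in characters: ${bigQuery.length}")

    // Planning cost, not string length, is the practical constraint.
    println(s"Row count: ${spark.sql(bigQuery).count()}")

    spark.stop()
  }
}
```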

Help: What's the biggest length of SQL that's supported in SparkSQL?

2019-07-11 Thread 李书明
I have a question about the maximum length of a SQL statement that Spark SQL supports. I can't find the answer in the Spark documentation. Is it Integer.MAX_VALUE, or something else?