Re: Help: What's the biggest length of SQL that's supported in SparkSQL?
No, sorry. I'm not at liberty to share other people's code.

On Fri, Jul 12, 2019 at 9:33 AM, Gourav Sengupta <gourav.sengu...@gmail.com> wrote:

> Hi Reynold,
>
> I am genuinely curious about queries which are more than 1 MB and am
> stunned by tens of MBs. Any samples to share :)
>
> Regards,
> Gourav
>
> On Thu, Jul 11, 2019 at 5:03 PM Reynold Xin <rxin@databricks.com> wrote:
>
>> There is no explicit limit, but a JVM string cannot be bigger than 2G.
>> It will also at some point run out of memory with too big of a query
>> plan tree, or become incredibly slow due to query planning complexity.
>> I've seen queries that are tens of MBs in size.
>>
>> On Thu, Jul 11, 2019 at 5:01 AM, 李书明 <alemmontree@126.com> wrote:
>>
>>> I have a question about the maximum length of SQL that is supported
>>> in SparkSQL. I can't find the answer in the documentation of Spark.
>>>
>>> Maybe Integer.MAX_VALUE or not?
Re: Help: What's the biggest length of SQL that's supported in SparkSQL?
Hi Reynold,

I am genuinely curious about queries which are more than 1 MB and am
stunned by tens of MBs. Any samples to share :)

Regards,
Gourav

On Thu, Jul 11, 2019 at 5:03 PM Reynold Xin wrote:

> There is no explicit limit, but a JVM string cannot be bigger than 2G. It
> will also at some point run out of memory with too big of a query plan
> tree, or become incredibly slow due to query planning complexity. I've
> seen queries that are tens of MBs in size.
>
> On Thu, Jul 11, 2019 at 5:01 AM, 李书明 wrote:
>
>> I have a question about the maximum length of SQL that is supported in
>> SparkSQL. I can't find the answer in the documentation of Spark.
>>
>> Maybe Integer.MAX_VALUE or not?
Re: Help: What's the biggest length of SQL that's supported in SparkSQL?
There is no explicit limit, but a JVM string cannot be bigger than 2G. It
will also at some point run out of memory with too big of a query plan
tree, or become incredibly slow due to query planning complexity. I've
seen queries that are tens of MBs in size.

On Thu, Jul 11, 2019 at 5:01 AM, 李书明 <alemmontree@126.com> wrote:

> I have a question about the maximum length of SQL that is supported in
> SparkSQL. I can't find the answer in the documentation of Spark.
>
> Maybe Integer.MAX_VALUE or not?
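
[A minimal sketch, in Scala, of the point above: the practical ceiling is
planning cost, not a parser limit. Multi-MB SQL strings typically arise from
programmatically generated queries like the one built below. The temp view
name `t` and the branch count of 5000 are arbitrary assumptions for
illustration, not anything from the thread.]

import org.apache.spark.sql.SparkSession

object BigSqlProbe {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("big-sql-probe")
      .master("local[*]")
      .getOrCreate()

    // A tiny table to query against.
    spark.range(10).createOrReplaceTempView("t")

    // Build one large SQL string out of many UNION ALL branches. The string
    // itself is bounded only by the JVM's array limit (~Integer.MAX_VALUE
    // chars), but planning time grows with the size of the plan tree.
    val branches = 5000
    val sql = (1 to branches)
      .map(i => s"SELECT id + $i AS v FROM t")
      .mkString(" UNION ALL ")
    println(s"SQL length: ${sql.length} characters")

    val start = System.nanoTime()
    val df = spark.sql(sql)          // parsing and analysis happen here
    df.queryExecution.optimizedPlan  // force optimization as well
    val elapsedMs = (System.nanoTime() - start) / 1e6
    println(f"Planning took $elapsedMs%.0f ms for $branches branches")

    spark.stop()
  }
}

[Raising `branches` makes planning time balloon long before the string comes
anywhere near the JVM limit, which matches the behavior described above.]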