I want to understand: what is the use of the default size for a given datatype?
The following link mentions that it is used for internal size estimation:
https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/types/DataType.html
This behavior is also reflected in the code, where the default value seems to be
used.
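For reference, a minimal sketch of querying `defaultSize` from the Scala API. The comments reflect my understanding of its role (a planner-side size estimate, not an actual serialized size); the exact byte values for variable-width types differ between Spark versions, so treat the String/Array lines as illustrative:

```scala
import org.apache.spark.sql.types._

// defaultSize is the number of bytes Catalyst assumes one value of this
// type occupies when estimating the size of a plan's output. It is an
// optimizer estimate (e.g. feeding broadcast-join decisions), not the
// real serialized size of any particular value.
println(IntegerType.defaultSize)         // fixed-width type: 4 bytes
println(LongType.defaultSize)            // fixed-width type: 8 bytes
println(StringType.defaultSize)          // a fixed guess for variable-width data
println(ArrayType(LongType).defaultSize) // derived from the element type's size
```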
>> ...provided by the Scala language. I am new to programming in
>> Scala, so I don't know whether the Scala ecosystem provides any good tools
>> for reverse-engineering a BNF from a class which extends
>> scala.util.parsing.combinator.syntactical.StandardTokenParsers.
>>
>> Thanks
...at a single place.
Regards,
Vivek
On Fri, Sep 11, 2015 at 4:29 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> You may have seen this:
> https://spark.apache.org/docs/latest/sql-programming-guide.html
>
> Please suggest what should be added.
>
> Cheers
>
> On Fri, Sep 11
Hi all,
I am looking for a reference manual for Spark SQL, something like many
database vendors have. I could find one for Hive QL
(https://cwiki.apache.org/confluence/display/Hive/LanguageManual), but not
anything specific to Spark SQL.
Please suggest. An SQL reference specific to the latest release would be helpful.