100% agreed with Sathish. If I am not offending anyone, these kinds of
questions basically come from individuals who are still in the mindset of
the Java way of solving problems from around 10 years back. Therefore you
will see a lot of user issues from people who are still used to writing
around
Agreed. For the same reason there are DataFrames/Datasets, which are
another DSL used in Spark.
On Wed, Jul 26, 2017 at 1:00 AM Georg Heiler wrote:
Because Spark's DSL partially supports compile-time type safety. E.g. the
compiler will notify you that a SQL function was misspelled when using the
DSL, as opposed to a plain SQL string, which is only parsed at runtime.
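
For instance, a minimal sketch using Spark's Java API (the input path and
the view name "people" are assumptions, not from the original query):

import static org.apache.spark.sql.functions.upper;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder().appName("dsl-vs-sql").getOrCreate();
Dataset<Row> df = spark.read().json("people.json"); // hypothetical input

// DSL: misspelling the function, e.g. functions.uppr(...), is a compile error.
df.select(upper(df.col("name"))).show();

// Plain SQL string: the same typo, "SELECT uppr(name) ...", is only caught
// at runtime, when Spark parses and analyzes the query string.
df.createOrReplaceTempView("people");
spark.sql("SELECT upper(name) FROM people").show();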
Sathish Kumaran Vairavelu wrote on Tue., Jul 25:
Just a thought: SQL itself is a DSL. Why build a DSL on top of another DSL?
On Tue, Jul 25, 2017 at 4:47 AM kant kodali wrote:
Hi All,
I am thinking of expressing Spark SQL using JSON in the following way.
For Example:
*Query using Spark DSL*
df1.filter(col("name").equalTo("john"))
   .groupBy(functions.window(df1.col("TIMESTAMP"), "24 hours", "24 hours"),
            df1.col("hourlyPay"))