Re: some Ideas on expressing Spark SQL using JSON

2017-07-30 Thread Gourav Sengupta
100% agreed with Sathish. In case I am not offending anyone, this kind of question usually comes from individuals who are still in the mindset of the Java way of solving problems from around 10 years back. Therefore you will see a lot of user issues from people who are still used to writing around

Re: some Ideas on expressing Spark SQL using JSON

2017-07-26 Thread Sathish Kumaran Vairavelu
Agreed. For the same reason, DataFrames/Datasets are another DSL used in Spark. On Wed, Jul 26, 2017 at 1:00 AM Georg Heiler wrote: > Because Spark's DSL partially supports compile-time type safety. E.g., the > compiler will notify you that a SQL function was
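(A minimal sketch of the point being made here, not from the original thread: the typed Dataset API is itself a DSL, and with a case class the field names are checked by the Scala compiler. The case class and sample data below are invented for illustration.)

    import org.apache.spark.sql.SparkSession

    case class Employee(name: String, hourlyPay: Double)

    object DatasetDslExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("dataset-dsl").master("local[*]").getOrCreate()
        import spark.implicits._

        val ds = Seq(Employee("john", 25.0), Employee("jane", 30.0)).toDS()

        // `_.name` is ordinary Scala field access: a typo such as `_.nmae`
        // would fail at compile time, not at query time.
        val johns = ds.filter(_.name == "john")
        johns.show()

        spark.stop()
      }
    }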

Re: some Ideas on expressing Spark SQL using JSON

2017-07-26 Thread Georg Heiler
Because Spark's DSL partially supports compile-time type safety. E.g., the compiler will notify you that a SQL function was misspelled when using the DSL, as opposed to a plain SQL string, which is only parsed at runtime. Sathish Kumaran Vairavelu wrote on Tue, Jul 25
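(A minimal sketch of the contrast described above, using a toy DataFrame with the column names from the original post; the object name and temp view name are made up for illustration.)

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object DslVsSqlString {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("dsl-vs-sql").master("local[*]").getOrCreate()
        import spark.implicits._

        val df = Seq(("john", 25.0), ("jane", 30.0)).toDF("name", "hourlyPay")

        // DSL version: `lower` and `col` are ordinary Scala symbols, so a typo
        // such as `lowerr(...)` is rejected by the compiler before anything runs.
        val viaDsl = df.filter(lower(col("name")) === "john")

        // SQL-string version: the same typo inside the string would only be
        // caught at runtime, when Spark parses and analyzes the query.
        df.createOrReplaceTempView("people")
        val viaSql = spark.sql("SELECT * FROM people WHERE lower(name) = 'john'")

        viaDsl.show()
        viaSql.show()
        spark.stop()
      }
    }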

Re: some Ideas on expressing Spark SQL using JSON

2017-07-25 Thread Sathish Kumaran Vairavelu
Just a thought. SQL itself is a DSL. Why a DSL on top of another DSL? On Tue, Jul 25, 2017 at 4:47 AM kant kodali wrote: > Hi All, > > I am thinking of expressing Spark SQL using JSON in the following way. > > For Example: > > *Query using Spark DSL* > >
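(To make the objection concrete, here is a sketch showing that the windowed aggregation from the quoted post can already be written directly in Spark SQL. The column names come from the quoted snippet; the toy data, the view name "events", and the avg() aggregate are assumptions, since the original message is cut off in the archive.)

    import java.sql.Timestamp
    import org.apache.spark.sql.SparkSession

    object SqlIsAlreadyADsl {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("sql-as-dsl").master("local[*]").getOrCreate()
        import spark.implicits._

        // Toy data with the column names from the quoted snippet.
        val df1 = Seq(
          ("john", Timestamp.valueOf("2017-07-25 09:00:00"), 25.0),
          ("john", Timestamp.valueOf("2017-07-25 17:00:00"), 30.0)
        ).toDF("name", "TIMESTAMP", "hourlyPay")

        df1.createOrReplaceTempView("events")

        // The avg() aggregate is assumed; the original post is truncated
        // before the aggregation step.
        val result = spark.sql(
          """
            |SELECT window(`TIMESTAMP`, '24 hours', '24 hours') AS win,
            |       hourlyPay,
            |       avg(hourlyPay) AS avgHourlyPay
            |FROM events
            |WHERE name = 'john'
            |GROUP BY window(`TIMESTAMP`, '24 hours', '24 hours'), hourlyPay
          """.stripMargin)

        result.show(truncate = false)
        spark.stop()
      }
    }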

some Ideas on expressing Spark SQL using JSON

2017-07-25 Thread kant kodali
Hi All, I am thinking of expressing Spark SQL using JSON in the following way. For Example:

*Query using Spark DSL*

DS.filter(col("name").equalTo("john"))
  .groupBy(functions.window(df1.col("TIMESTAMP"), "24 hours", "24 hours"), df1.col("hourlyPay"))
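(The archived message is cut off before the aggregation and before the proposed JSON form. Purely as an illustration, the sketch below completes the DSL query with an assumed avg() aggregate and shows one possible JSON shape for it; neither the aggregate, nor the JSON field names, nor the toy data come from the original post, and DS/df1 are assumed to refer to the same DataFrame.)

    import java.sql.Timestamp
    import org.apache.spark.sql.{SparkSession, functions}
    import org.apache.spark.sql.functions.col

    object JsonQueryIdeaSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("json-query-idea").master("local[*]").getOrCreate()
        import spark.implicits._

        // Stand-in for DS / df1 from the post.
        val df1 = Seq(
          ("john", Timestamp.valueOf("2017-07-25 09:00:00"), 25.0),
          ("john", Timestamp.valueOf("2017-07-25 17:00:00"), 30.0)
        ).toDF("name", "TIMESTAMP", "hourlyPay")

        // The truncated DSL query, completed with an assumed avg() aggregate.
        val result = df1.filter(col("name").equalTo("john"))
          .groupBy(
            functions.window(df1.col("TIMESTAMP"), "24 hours", "24 hours"),
            df1.col("hourlyPay"))
          .agg(functions.avg("hourlyPay").as("avgHourlyPay"))

        result.show(truncate = false)

        // One *possible* JSON shape for the same query. The field names are
        // invented for illustration; the encoding proposed in the original
        // message is not preserved in the archive.
        val queryAsJson =
          """
            |{
            |  "filter":  { "col": "name", "op": "equalTo", "value": "john" },
            |  "groupBy": [
            |    { "window": { "col": "TIMESTAMP", "duration": "24 hours", "slide": "24 hours" } },
            |    { "col": "hourlyPay" }
            |  ],
            |  "agg": { "fn": "avg", "col": "hourlyPay", "as": "avgHourlyPay" }
            |}
          """.stripMargin

        println(queryAsJson)
        spark.stop()
      }
    }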