I think Kant meant time windowing functions. You can use
`window(TIMESTAMP, '24 hours', '24 hours')`
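Putting the pieces together, here is a minimal sketch of how the whole query could look as a raw SQL string. The table name `events` and the `spark` session are assumptions for illustration; in Spark's `window(timeColumn, windowDuration, slideDuration)`, passing equal window and slide durations gives non-overlapping (tumbling) windows:

```scala
// Hypothetical: assumes a SparkSession `spark` and a registered view `events`
// with columns TIMESTAMP, name, and hourlyPay (mirroring the DSL snippet below).
val result = spark.sql("""
  SELECT window(TIMESTAMP, '24 hours', '24 hours') AS day,
         hourlyPay,
         sum(hourlyPay) AS total
  FROM events
  WHERE name = 'john'
  GROUP BY window(TIMESTAMP, '24 hours', '24 hours'), hourlyPay
""")
```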
On Tue, Jul 25, 2017 at 9:26 AM, Keith Chapman wrote:
Here is an example of a window lead function:

select *, lead(someColumn1) over (partition by someColumn2 order by
someColumn13 asc nulls first) as someName from someTable
Regards,
Keith.
http://keith-chapman.com
On Tue, Jul 25, 2017 at 9:15 AM, kant kodali wrote:
How do I specify windowInterval and slideInterval using a raw SQL string?
On Tue, Jul 25, 2017 at 8:52 AM, Keith Chapman wrote:
You could issue a raw SQL query to Spark; there is no particular advantage
or disadvantage to doing so. Spark builds a logical plan from the raw SQL
(or the DSL) and optimizes that. Ideally you end up with the same physical
plan, irrespective of whether it was written in raw SQL or the DSL.
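As a sketch of how one might check this claim, the lead-function example above can be written both ways and the resulting physical plans compared with `explain()`. The view name `someTable` and its columns follow the earlier example; the session `spark` is an assumption:

```scala
// Sketch only: assumes a SparkSession `spark` and a registered view `someTable`
// with columns someColumn1, someColumn2, someColumn13.
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window

// Raw SQL formulation
val viaSql = spark.sql(
  """SELECT *, lead(someColumn1) OVER (PARTITION BY someColumn2
     ORDER BY someColumn13 ASC NULLS FIRST) AS someName FROM someTable""")

// Equivalent DSL formulation
val viaDsl = spark.table("someTable")
  .withColumn("someName",
    lead(col("someColumn1"), 1).over(
      Window.partitionBy("someColumn2")
            .orderBy(col("someColumn13").asc_nulls_first)))

viaSql.explain()  // prints the physical plan for the SQL version
viaDsl.explain()  // should match the plan printed above
```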
Regards,
Keith.
http://keith-chapman.com
Hi All,
I just want to run a Spark Structured Streaming job similar to this:
DS.filter(col("name").equalTo("john"))
  .groupBy(functions.window(df1.col("TIMESTAMP"), "24 hours", "24 hours"),
           df1.col("hourlyPay"))
  .agg(sum("hourlyPay").as("total"));
I am wondering if I can