[ 
https://issues.apache.org/jira/browse/SPARK-16195?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dilip Biswal updated SPARK-16195:
---------------------------------
    Description: 
In SQL, it is allowed to specify an empty OVER clause in a window expression:

{code}
select area, sum(product) over () as c from windowData
where product > 3 group by area, product
having avg(month) > 0 order by avg(month), product
{code}

In this case, the analytic function sum is computed over all the rows of
the result set, so every output row carries the same grand total.

Currently, this is not allowed through the Dataset API.
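
As a point of comparison, the closest workaround through the Dataset API is a WindowSpec with no partitioning or ordering, since Column.over currently requires an explicit WindowSpec argument. A minimal Scala sketch, assuming a DataFrame named windowData with the same columns as the table above:

{code}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, sum}

// No partitioning and no ordering: a single frame spanning the whole
// result set, mirroring an empty OVER () clause in SQL.
val emptyWindow = Window.partitionBy()

val result = windowData.select(col("area"), sum(col("product")).over(emptyWindow).as("c"))
{code}

Allowing over() with no arguments would let the Dataset API express this as directly as the SQL form.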


  was:
In SQL, it is allowed to specify an empty OVER clause in a window expression.
 
select area, sum(product) over () as c from windowData
where product > 3 group by area, product
having avg(month) > 0 order by avg(month), product

In this case, the analytic function sum is computed over all the rows of
the result set, so every output row carries the same grand total.

Currently, this is not allowed through the Dataset API.



> Allow users to specify empty over clause in window expressions through 
> dataset API
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-16195
>                 URL: https://issues.apache.org/jira/browse/SPARK-16195
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Dilip Biswal
>            Priority: Minor
>
> In SQL, it is allowed to specify an empty OVER clause in a window expression.
> {code}
> select area, sum(product) over () as c from windowData
> where product > 3 group by area, product
> having avg(month) > 0 order by avg(month), product
> {code}
> In this case, the analytic function sum is computed over all the rows of
> the result set, so every output row carries the same grand total.
> Currently, this is not allowed through the Dataset API.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
