[ https://issues.apache.org/jira/browse/SPARK-34296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17275514#comment-17275514 ]

Yuming Wang commented on SPARK-34296:
-------------------------------------

PostgreSQL accepts the same frame clause (rank functions simply ignore it):
{noformat}
postgres@d40c1bcc8f5a:~$ psql
psql (11.3 (Debian 11.3-1.pgdg90+1))
Type "help" for help.

postgres=# create table test_unboundedpreceding(SELLER_ID int);
CREATE TABLE
postgres=#     SELECT
postgres-#       DENSE_RANK() OVER (
postgres(#         ORDER BY SELLER_ID ROWS BETWEEN 10 PRECEDING
postgres(#         AND CURRENT ROW
postgres(#       ) AS SELLER_RANK
postgres-#     FROM
postgres-#       test_unboundedpreceding;
 seller_rank
-------------
(0 rows)
{noformat}
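
A possible workaround in Spark (a sketch, not verified against every affected version): drop the explicit frame clause and let DENSE_RANK use its implicit required frame, since rank functions do not honor a user-specified frame anyway:
{code:sql}
-- Same query without the ROWS BETWEEN ... clause; DENSE_RANK falls back
-- to its built-in frame (UNBOUNDED PRECEDING AND CURRENT ROW).
SELECT
  DENSE_RANK() OVER (ORDER BY SELLER_ID) AS SELLER_RANK
FROM
  test_unboundedpreceding;
{code}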


> AggregateWindowFunction#frame should not always use UnboundedPreceding
> ----------------------------------------------------------------------
>
>                 Key: SPARK-34296
>                 URL: https://issues.apache.org/jira/browse/SPARK-34296
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.2, 3.1.0, 3.2.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> How to reproduce this issue:
> {code:sql}
> CREATE TABLE test_unboundedpreceding(SELLER_ID INT) USING parquet;
> SELECT
>   DENSE_RANK() OVER (
>     ORDER BY SELLER_ID ROWS BETWEEN 10 PRECEDING
>     AND CURRENT ROW
>   ) AS SELLER_RANK
> FROM
>   test_unboundedpreceding
> {code}
> It will throw:
> {noformat}
> Error: Error running query: org.apache.spark.sql.AnalysisException: Window 
> Frame specifiedwindowframe(RowFrame, -10, currentrow$()) must match the 
> required frame specifiedwindowframe(RowFrame, unboundedpreceding$(), 
> currentrow$()); (state=,code=0)
> {noformat}
> Related code:
> https://github.com/apache/spark/blob/cde697a479a2f67c6bc4281f39a1ab2ff6a9d17d/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L514
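> For context, {{AggregateWindowFunction}} hard-codes its required frame, which is why any differing user-specified frame fails analysis. A sketch of the relevant definition, inferred from the error message above (the exact source may differ):
> {code:scala}
> // In AggregateWindowFunction (windowExpressions.scala): the frame is
> // fixed, so the analyzer rejects any OVER clause whose frame does not
> // match SpecifiedWindowFrame(RowFrame, UnboundedPreceding, CurrentRow).
> override lazy val frame: WindowFrame =
>   SpecifiedWindowFrame(RowFrame, UnboundedPreceding, CurrentRow)
> {code}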



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
