[ https://issues.apache.org/jira/browse/SPARK-41898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-41898:
------------------------------------

    Assignee: Apache Spark

> Window.rowsBetween should handle `float("-inf")` and `float("+inf")` as arguments
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-41898
>                 URL: https://issues.apache.org/jira/browse/SPARK-41898
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Assignee: Apache Spark
>            Priority: Major
>
> {code:java}
> df = self.spark.createDataFrame([(1, "1"), (2, "2"), (1, "2"), (1, "2")], ["key", "value"])
> w = Window.partitionBy("value").orderBy("key")
> from pyspark.sql import functions as F
> sel = df.select(
>     df.value,
>     df.key,
>     F.max("key").over(w.rowsBetween(0, 1)),
>     F.min("key").over(w.rowsBetween(0, 1)),
>     F.count("key").over(w.rowsBetween(float("-inf"), float("inf"))),
>     F.row_number().over(w),
>     F.rank().over(w),
>     F.dense_rank().over(w),
>     F.ntile(2).over(w),
> )
> rs = sorted(sel.collect()){code}
> {code:java}
> Traceback (most recent call last):
>   File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_functions.py", line 821, in test_window_functions
>     F.count("key").over(w.rowsBetween(float("-inf"), float("inf"))),
>   File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/window.py", line 152, in rowsBetween
>     raise TypeError(f"start must be a int, but got {type(start).__name__}")
> TypeError: start must be a int, but got float
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
