[ 
https://issues.apache.org/jira/browse/SPARK-26860?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16767387#comment-16767387
 ] 

Jagadesh Kiran N edited comment on SPARK-26860 at 2/13/19 5:09 PM:
-------------------------------------------------------------------

I will add the statements below to differentiate the two and raise a PR:

ROWS BETWEEN doesn't care about the exact values. It cares only about the order 
of rows, and takes a fixed number of preceding and following rows when computing 
the frame.
RANGE BETWEEN considers the values when computing the frame.
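
For reference, a minimal PySpark sketch of the difference (the data, column name, and frame bounds here are only illustrative, not taken from the docs):

{code:python}
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Illustrative data with a gap in the ordering values: 1, 2, 4
df = spark.createDataFrame([(1,), (2,), (4,)], ["id"])

# rowsBetween(-1, 0): the frame is the previous physical row plus the
# current row, regardless of the values they hold.
rows_win = Window.orderBy("id").rowsBetween(-1, 0)

# rangeBetween(-1, 0): the frame is every row whose "id" value falls
# within [current id - 1, current id].
range_win = Window.orderBy("id").rangeBetween(-1, 0)

df.select(
    "id",
    F.sum("id").over(rows_win).alias("sum_rows"),
    F.sum("id").over(range_win).alias("sum_range"),
).show()
# For id = 4: sum_rows = 2 + 4 = 6 (the preceding row is included),
# while sum_range = 4 (no other row has an id in [3, 4]).
{code}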


> RangeBetween docs appear to be wrong 
> -------------------------------------
>
>                 Key: SPARK-26860
>                 URL: https://issues.apache.org/jira/browse/SPARK-26860
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Shelby Vanhooser
>            Priority: Major
>              Labels: docs, easyfix, python
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> The docs describing 
> [RangeBetween|http://spark.apache.org/docs/2.4.0/api/python/_modules/pyspark/sql/window.html#Window.rangeBetween]
>  for PySpark appear to be duplicates of 
> [RowsBetween|http://spark.apache.org/docs/2.4.0/api/python/_modules/pyspark/sql/window.html#Window.rowsBetween]
>  even though these are functionally different windows. rowsBetween 
> references preceding and succeeding rows, whereas rangeBetween is based on 
> the values in those rows.


