[ https://issues.apache.org/jira/browse/SPARK-12095?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15083847#comment-15083847 ]
Yin Huai commented on SPARK-12095:
----------------------------------

Does the error message that you got when using the DataFrame API mention anything? If not, feel free to create a JIRA (a PR is welcome) to add a doc to the 1.6 branch (relevant parts are https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/expressions/Window.scala, https://github.com/apache/spark/blob/branch-1.6/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L557-L768, https://github.com/apache/spark/blob/branch-1.6/python/pyspark/sql/functions.py#L151-L183, and https://github.com/apache/spark/blob/branch-1.6/python/pyspark/sql/functions.py#L600-L649). https://issues.apache.org/jira/browse/SPARK-8641 added native window function support, so starting from 2.0 you can use window functions with SQLContext.

> Window function rowsBetween throws exception
> --------------------------------------------
>
>                 Key: SPARK-12095
>                 URL: https://issues.apache.org/jira/browse/SPARK-12095
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>            Reporter: Irakli Machabeli
>
> From pyspark:
>
> windowSpec = Window.partitionBy('A', 'B').orderBy('A', 'B', 'C').rowsBetween('UNBOUNDED PRECEDING', 'CURRENT')
>
> Py4JError: An error occurred while calling o1107.rowsBetween. Trace:
> py4j.Py4JException: Method rowsBetween([class java.lang.String, class java.lang.Long]) does not exist
>
> From SQL, the query parser fails immediately:
>
> Py4JJavaError: An error occurred while calling o18.sql.
> : java.lang.RuntimeException: [1.20] failure: ``union'' expected but `(' found
> select rank() OVER (PARTITION BY c1 ORDER BY c2 ) as rank from tbl
>                    ^
> at scala.sys.package$.error(package.scala:27)
> at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
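For readers hitting this before 2.0, a minimal sketch of the DataFrame-API fix for the reported snippet: {{rowsBetween}} takes long frame boundaries, not the SQL keyword strings, which is exactly what the Py4JException above is complaining about. A pre-existing DataFrame {{df}} with columns A, B, C (and, on 1.5/1.6, a HiveContext behind it) is assumed:

{code:python}
import sys

from pyspark.sql import Window
from pyspark.sql import functions as F

# rowsBetween expects long offsets; on 1.5/1.6 the documented stand-ins
# are -sys.maxsize for UNBOUNDED PRECEDING and 0 for CURRENT ROW
# (the Window.unboundedPreceding/currentRow constants arrive in later releases).
window_spec = (Window.partitionBy('A', 'B')
               .orderBy('A', 'B', 'C')
               .rowsBetween(-sys.maxsize, 0))

# Illustrative aggregate over the frame: a running sum of column C.
running = df.withColumn('running_sum', F.sum('C').over(window_spec))
{code}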
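And the SQL route the comment points to: with the native window functions from SPARK-8641, the plain SQLContext parser accepts the reporter's OVER clause on 2.0+. A sketch, assuming a DataFrame {{df}} with columns c1 and c2 and a {{sqlContext}} already in scope:

{code:python}
# Spark 2.0+: no HiveContext needed, plain SQLContext parses OVER (...).
# createOrReplaceTempView is the 2.0+ API for registering a temp view.
df.createOrReplaceTempView('tbl')
sqlContext.sql(
    "SELECT rank() OVER (PARTITION BY c1 ORDER BY c2) AS rank FROM tbl"
).show()
{code}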