[ https://issues.apache.org/jira/browse/SPARK-24575?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Herman van Hovell resolved SPARK-24575.
---------------------------------------
    Resolution: Fixed
      Assignee: Anton Okolnychyi

> Prohibit window expressions inside WHERE and HAVING clauses
> -----------------------------------------------------------
>
>                 Key: SPARK-24575
>                 URL: https://issues.apache.org/jira/browse/SPARK-24575
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Anton Okolnychyi
>            Assignee: Anton Okolnychyi
>            Priority: Minor
>             Fix For: 2.4.0
>
> Why window functions inside WHERE and HAVING clauses should be prohibited is described [here|https://stackoverflow.com/questions/13997177/why-no-windowed-functions-in-where-clauses]. Spark, on the other hand, does not handle this explicitly and will fail with non-descriptive exceptions.
> {code}
> val df = Seq((1, 2), (1, 3), (2, 4), (5, 5)).toDF("a", "b")
> df.createTempView("t1")
> spark.sql("SELECT t1.a FROM t1 WHERE RANK() OVER(ORDER BY t1.b) = 1").show(false)
> {code}
> {noformat}
> Exception in thread "main" java.lang.UnsupportedOperationException: Cannot evaluate expression: rank(input[1, int, false]) windowspecdefinition(input[1, int, false] ASC NULLS FIRST, specifiedwindowframe(RowFrame, unboundedpreceding$(), currentrow$()))
> 	at org.apache.spark.sql.catalyst.expressions.Unevaluable$class.doGenCode(Expression.scala:261)
> 	at org.apache.spark.sql.catalyst.expressions.WindowExpression.doGenCode(windowExpressions.scala:278)
> 	at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$genCode$2.apply(Expression.scala:108)
> 	at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$genCode$2.apply(Expression.scala:105)
> 	at scala.Option.getOrElse(Option.scala:121)
> ...
> {noformat}
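For reference, a minimal sketch of the standard workaround: since window functions are evaluated after WHERE and HAVING, compute the rank in a subquery (or as a derived column via the DataFrame API) and filter on its result in an outer query. The data and the t1 view come from the example above; the alias rnk and the subquery name ranked are illustrative additions, not part of the issue or the fix.

{code}
// Sketch: evaluate RANK() first, then filter on it in an outer query.
// The names `rnk` and `ranked` are illustrative.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.rank

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq((1, 2), (1, 3), (2, 4), (5, 5)).toDF("a", "b")
df.createTempView("t1")

// SQL form: the window function lives in a subquery, the filter outside.
spark.sql(
  """SELECT a
    |FROM (SELECT a, RANK() OVER (ORDER BY b) AS rnk FROM t1) ranked
    |WHERE rnk = 1""".stripMargin).show(false)

// Equivalent DataFrame API form.
df.withColumn("rnk", rank().over(Window.orderBy($"b")))
  .where($"rnk" === 1)
  .select("a")
  .show(false)
{code}

Both forms succeed where the original query fails, because the filter now references an ordinary column produced by an already-evaluated window expression.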