[ https://issues.apache.org/jira/browse/SPARK-35568?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-35568:
--------------------------------
    Description: 
How to reproduce:
{code:scala}
sql(
  """
    |SELECT s.store_id, f.product_id
    |FROM (SELECT DISTINCT * FROM fact_sk) f
    |  JOIN (SELECT
    |          *,
    |          ROW_NUMBER() OVER (PARTITION BY store_id ORDER BY state_province DESC) AS rn
    |        FROM dim_store) s
    |    ON f.store_id = s.store_id
    |WHERE s.country = 'DE' AND s.rn = 1
    |""".stripMargin).show
{code}
{noformat}
Caused by: java.lang.UnsupportedOperationException: WholeStageCodegen (3) does not implement doExecuteBroadcast
	at org.apache.spark.sql.execution.SparkPlan.doExecuteBroadcast(SparkPlan.scala:297)
	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecuteBroadcast(AdaptiveSparkPlanExec.scala:323)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeBroadcast$1(SparkPlan.scala:192)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:217)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:214)
	at org.apache.spark.sql.execution.SparkPlan.executeBroadcast(SparkPlan.scala:188)
{noformat}

  was: identical to the description above, except that the reproduce snippet was tagged {code:sql} instead of {code:scala}.


> UnsupportedOperationException: WholeStageCodegen (3) does not implement doExecuteBroadcast
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-35568
>                 URL: https://issues.apache.org/jira/browse/SPARK-35568
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> How to reproduce: see the {code:scala} snippet and {noformat} stack trace in the updated description above.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional
commands, e-mail: issues-h...@spark.apache.org
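
[Editor's note, not part of the original report: a hedged workaround sketch. The stack trace passes through org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecuteBroadcast, so, as an assumption rather than a confirmed fix, disabling adaptive query execution may avoid the failing code path while the bug is open. spark.sql.adaptive.enabled is a real Spark SQL configuration key; the fix itself is speculative.]

```scala
// Hedged workaround sketch, assuming the failure is tied to AQE's
// doExecuteBroadcast path (an assumption, not a confirmed fix).
// `spark` is an existing SparkSession.
spark.conf.set("spark.sql.adaptive.enabled", "false")

// Re-run the reproduce query afterwards to check whether it still throws
// UnsupportedOperationException: ... does not implement doExecuteBroadcast.
```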