MaxGekk commented on code in PR #41578:
URL: https://github.com/apache/spark/pull/41578#discussion_r1233874129


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala:
##########
@@ -509,7 +509,24 @@ object UnsupportedOperationChecker extends Logging {
          throwError("Sampling is not supported on streaming DataFrames/Datasets")
 
         case Window(_, _, _, child) if child.isStreaming =>
-          throwError("Non-time-based windows are not supported on streaming DataFrames/Datasets")
+          val windowExpression = subPlan.asInstanceOf[Window].windowExpressions
+          val windowFuncs = windowExpression.flatMap { e =>
+            e.collect {
+              case we: WindowExpression =>
+                s"'${we.windowFunction}' as column '${e.toAttribute.sql}'"
+            }
+          }.mkString(", ")
+          val windowSpec = windowExpression.flatMap { e =>
+            e.collect {
+              case we: WindowExpression => we.windowSpec.sql
+            }
+          }.mkString(", ")
+          throw new AnalysisException(
+            s"Window function is not supported in $windowFuncs on streaming DataFrames/Datasets. " +
+            "Structured Streaming only supports time-window aggregation using the `window` " +
+            s"function. (window specification: '$windowSpec')",

Review Comment:
   Could you introduce an error class with this error format and put it in `error-classes.json`, please? Also cc @HeartSaVioR to check the semantics.
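   
   For illustration only, an entry in `error-classes.json` might look roughly like the sketch below. The class name `NON_TIME_WINDOW_NOT_SUPPORTED_IN_STREAMING` and the parameter names are placeholders I made up to show the shape, not a concrete proposal:
   
   ```json
   {
     "NON_TIME_WINDOW_NOT_SUPPORTED_IN_STREAMING" : {
       "message" : [
         "Window function is not supported in <windowFunc> (as column <columnName>) on streaming DataFrames/Datasets.",
         "Structured Streaming only supports time-window aggregation using the WINDOW function. (window specification: <windowSpec>)"
       ]
     }
   }
   ```
   
   The `throw new AnalysisException(...)` in the diff would then pass the error class name plus a message-parameters map instead of a preformatted string, following the pattern used by the other error classes in that file.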



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

