AngersZhuuuu commented on a change in pull request #30212:
URL: https://github.com/apache/spark/pull/30212#discussion_r531392644



##########
File path: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4
##########
@@ -587,13 +587,24 @@ fromClause
     ;
 
 aggregationClause
-    : GROUP BY groupingExpressions+=expression (',' groupingExpressions+=expression)* (
+    : GROUP BY groupingExpressionWithGroupingAnalytics+=groupByClause
+      (',' groupingExpressionWithGroupingAnalytics+=groupByClause)*
+    | GROUP BY groupingExpressions+=expression (',' groupingExpressions+=expression)* (
       WITH kind=ROLLUP
     | WITH kind=CUBE
     | kind=GROUPING SETS '(' groupingSet (',' groupingSet)* ')')?

Review comment:
      > > GROUP BY A, B , grouping sets(a, (a, b))
   > 
   > Ah, you mean `GROUP BY A, B grouping sets(a, (a, b))`? It seems the parser cannot accept `GROUP BY A, B , grouping sets(a, (a, b))`.
   
   I mean we need
   ```
   | kind=GROUPING SETS '(' groupingSet (',' groupingSet)* ')')?
   ```
   to support `group by a, b grouping sets(a, (a, b))`,
   and since we add `GROUPING SETS` to `groupingExpressionWithGroupingAnalytics`,
   we also support `group by a, b, grouping sets(a, (a, b))`. Also, the line
   `| GROUP BY kind=GROUPING SETS '(' groupingSet (',' groupingSet)* ')'`
   can be removed, since we add `GROUPING SETS` to `groupingExpressionWithGroupingAnalytics`.
   
   Currently we only support this in the grammar, but it can't be executed because of https://github.com/apache/spark/pull/30144
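   
   For illustration, here is a minimal sketch of the two spellings this grammar change is meant to cover (table and column names are hypothetical; per the note above, the comma form parses but is not yet executable until the linked PR lands):
   
   ```sql
   -- Trailing analytics clause (already accepted by the old grammar):
   SELECT a, b, COUNT(*) FROM t GROUP BY a, b GROUPING SETS (a, (a, b));
   
   -- GROUPING SETS as a comma-separated group-by item
   -- (enabled by the new groupingExpressionWithGroupingAnalytics alternative):
   SELECT a, b, COUNT(*) FROM t GROUP BY a, b, GROUPING SETS (a, (a, b));
   ```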




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


