srowen commented on a change in pull request #27283: [SPARK-30574][DOC] Document GROUP BY Clause of SELECT statement in SQL Reference.
URL: https://github.com/apache/spark/pull/27283#discussion_r368613485
 
 

 ##########
 File path: docs/sql-ref-syntax-qry-select-groupby.md
 ##########
 @@ -18,5 +18,206 @@ license: |
   See the License for the specific language governing permissions and
   limitations under the License.
 ---
+The <code>GROUP BY</code> clause is used to group the rows based on a set of specified grouping expressions and compute aggregations on
+the group of rows based on one or more specified aggregate functions. Spark also supports advanced aggregations to do multiple
+aggregations for the same input record set via `GROUPING SETS`, `CUBE`, `ROLLUP` clauses.
 
-**This page is under construction**
+### Syntax
+{% highlight sql %}
+GROUP BY [ GROUPING SETS grouping_sets ] group_expression [ , group_expression [ , ... ] ]
+    [ ( WITH ROLLUP | WITH CUBE | GROUPING SETS grouping_sets ) ]
+{% endhighlight %}
+
+### Parameters
+<dl>
+  <dt><code><em>GROUPING SETS</em></code></dt>
+  <dd>
+    Groups the rows for each subset of the expressions specified in the grouping sets. For example,
+    <code>GROUP BY GROUPING SETS (warehouse, product)</code> is semantically equivalent
+    to union of results of <code>GROUP BY warehouse</code> and <code>GROUP BY product</code>. This clause
+    is a shorthand for a <code>UNION ALL</code> where each leg of the <code>UNION ALL</code>
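
To make the quoted equivalence concrete, here is a small sketch using Python's `sqlite3` module. Since SQLite has no `GROUPING SETS`, it hand-writes the `UNION ALL` that `GROUP BY GROUPING SETS (warehouse, product)` is shorthand for; the `dealer` table and its rows are made up for illustration.

```python
import sqlite3

# Toy data: a hypothetical dealer table (not from the Spark docs).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dealer (warehouse TEXT, product TEXT, quantity INT)")
conn.executemany(
    "INSERT INTO dealer VALUES (?, ?, ?)",
    [("SF", "Honda", 10), ("SF", "Toyota", 5), ("LA", "Honda", 7)],
)

# Emulate GROUP BY GROUPING SETS (warehouse, product):
# each leg groups by one expression; the column not being
# grouped is filled with NULL, as GROUPING SETS does.
rows = conn.execute("""
    SELECT warehouse, NULL AS product, SUM(quantity) FROM dealer GROUP BY warehouse
    UNION ALL
    SELECT NULL, product, SUM(quantity) FROM dealer GROUP BY product
""").fetchall()

for row in rows:
    print(row)
```

Each output row comes from one leg: either a per-warehouse total with `product` NULL, or a per-product total with `warehouse` NULL.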
 
 Review comment:
  I'd say "is shorthand", but either way be consistent as "short-hand" appears below

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
