funcpp opened a new pull request, #2289:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/2289

   ## Summary
   
   Add support for parenthesized multi-column aliases in SELECT items, as 
defined in the [Spark SQL 
grammar](https://github.com/apache/spark/blob/master/sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4):
   
   ```
   namedExpression
       : expression (AS? (name=errorCapturingIdentifier | identifierList))?
       ;
   
   identifierList
       : LEFT_PAREN identifierSeq RIGHT_PAREN
       ;
   ```
   
   This enables syntax like:
   ```sql
   SELECT stack(2, 'a', 'b', 'c', 'd') AS (col1, col2)
   ```
   
   ### Changes
   - Add `SelectItem::ExprWithAliases` variant for multi-column aliases
   - Add `Dialect::supports_select_item_multi_column_alias()`, enabled for 
Databricks and Generic dialects
   - Parse `AS (ident, ident, ...)` when the dialect supports it
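To illustrate the shape of the new parsing step, here is a minimal, self-contained sketch of consuming a parenthesized identifier list after `AS`. This is not the PR's actual code: it operates on a raw string rather than the crate's token stream, and `parse_alias_list` is an illustrative name, not part of the sqlparser API.

```rust
// Hedged sketch: parse "(col1, col2)" into a list of identifiers,
// mimicking the `AS (ident, ident, ...)` grammar rule. The real
// implementation works over the parser's token stream instead.
fn parse_alias_list(input: &str) -> Option<Vec<String>> {
    // Expect a parenthesized, comma-separated identifier list.
    let inner = input.trim().strip_prefix('(')?.strip_suffix(')')?;
    let idents: Vec<String> = inner
        .split(',')
        .map(|s| s.trim().to_string())
        .collect();
    // Reject empty identifiers, e.g. "()" or "(a,,b)".
    if idents.iter().any(|s| s.is_empty()) {
        return None;
    }
    Some(idents)
}

fn main() {
    assert_eq!(
        parse_alias_list("(col1, col2)"),
        Some(vec!["col1".to_string(), "col2".to_string()])
    );
    assert_eq!(parse_alias_list("()"), None);
    println!("ok");
}
```

In the actual parser, this step would only run when `Dialect::supports_select_item_multi_column_alias()` returns true, so non-supporting dialects keep their existing error behavior.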
   
   ### Context
   While not documented in the [Databricks SQL 
reference](https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-syntax-qry-select.html),
 this syntax is part of the Spark SQL grammar that Databricks implements. 
Verified to execute successfully on Databricks Runtime:
   ```sql
   SELECT stack(2, 'a', 'b', 'c', 'd') AS (col1, col2)
   -- Returns:
   -- col1  col2
   -- a     b
   -- c     d
   ```
   
   ## Test plan
   - [x] Round-trip test: `SELECT stack(...) AS (col1, col2)` with and without 
FROM
   - [x] Negative test: non-supporting dialects reject the syntax
   - [x] Existing tests unaffected
   - [x] `cargo fmt`, `cargo clippy`, full test suite pass
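The round-trip tests above hinge on the new variant serializing back to the exact input SQL. A minimal sketch of that display logic, using a toy `SelectItem` enum whose field names are illustrative rather than the crate's actual definitions:

```rust
use std::fmt;

// Toy model of the single- and multi-column alias variants; field
// names here are assumptions for illustration, not the crate's API.
#[allow(dead_code)]
enum SelectItem {
    ExprWithAlias { expr: String, alias: String },
    ExprWithAliases { expr: String, aliases: Vec<String> },
}

impl fmt::Display for SelectItem {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            SelectItem::ExprWithAlias { expr, alias } => {
                write!(f, "{expr} AS {alias}")
            }
            // Multi-column aliases round-trip with the parentheses intact.
            SelectItem::ExprWithAliases { expr, aliases } => {
                write!(f, "{expr} AS ({})", aliases.join(", "))
            }
        }
    }
}

fn main() {
    let item = SelectItem::ExprWithAliases {
        expr: "stack(2, 'a', 'b', 'c', 'd')".into(),
        aliases: vec!["col1".into(), "col2".into()],
    };
    assert_eq!(
        item.to_string(),
        "stack(2, 'a', 'b', 'c', 'd') AS (col1, col2)"
    );
    println!("ok");
}
```

Keeping `Display` byte-for-byte faithful to the parsed input is what lets `parse -> to_string -> parse` round-trip tests pass without normalization.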


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to