[ https://issues.apache.org/jira/browse/SPARK-52007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mihailo Aleksic updated SPARK-52007:
------------------------------------
    Description: 
In the current version, the following query works:
{code:java}
select * from values (1, 2)
group by grouping sets (col1, col2, col1 + col2)
order by `(col1#x + col2#y)`
{code}

where #x and #y are expression IDs generated for the corresponding attributes.
This is not right, as it leads to nondeterministic behavior: a query that
passes once may fail the next time, because the generated expression IDs
change between runs. In this issue I propose that we fix that.
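
For reference, a deterministic way to express the same ordering (a sketch, not
taken from this issue) is to alias the grouping expression and order by the
alias instead of the internally generated name:
{code:java}
-- Sketch: alias the grouping expression so the ORDER BY does not depend on
-- generated expression IDs; the alias name col_sum is illustrative only.
select col1, col2, col1 + col2 as col_sum
from values (1, 2)
group by grouping sets (col1, col2, col1 + col2)
order by col_sum
{code}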

  was:
In the current version, the following query works:
{code:java}
select * from values (1, 2)
group by grouping sets (col1, col2, col1 + col2)
order by `(col1#x + col2#y)`
{code}

where #x and #y are expression IDs generated for the corresponding attributes.
This is not right, as it leads to nondeterministic behavior: a query that
passes once may fail the next time, because the generated expression IDs
change between runs. In this issue I propose that we fix that.


> Expression IDs shouldn't be present in grouping expressions when using 
> grouping sets
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-52007
>                 URL: https://issues.apache.org/jira/browse/SPARK-52007
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 4.1.0
>            Reporter: Mihailo Aleksic
>            Priority: Major
>
> In the current version, the following query works:
> {code:java}
> select * from values (1, 2)
> group by grouping sets (col1, col2, col1 + col2)
> order by `(col1#x + col2#y)`
> {code}
> where #x and #y are expression IDs generated for the corresponding attributes.
> This is not right, as it leads to nondeterministic behavior: a query that
> passes once may fail the next time, because the generated expression IDs
> change between runs. In this issue I propose that we fix that.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
