[ https://issues.apache.org/jira/browse/SPARK-19035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15787416#comment-15787416 ]

Sean Owen commented on SPARK-19035:
-----------------------------------

The difference in behavior does sound like at least a cosmetic bug, but it may 
be that the query is correctly rejected in both forms. Why not group by b here? 
Grouping on a different non-deterministic value doesn't sound reasonable 
anyway.
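
A subquery that evaluates the expression once and then groups on the alias 
sidesteps the mismatched rand() seeds. A minimal sketch, assuming yuanfeng1_a 
has an integer column a (untested against the reporter's data):

        select b, count(1)
        from (
            select
                case when a = 1 then '1'
                     else concat(a, cast(rand() as string))
                end as b
            from yuanfeng1_a
        ) t
        group by b;

Here rand() appears only in the inner projection, so the analyzer never has to 
match two independently seeded rand() expressions between the select list and 
the GROUP BY clause.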

> rand() function in CASE WHEN causes failure
> -------------------------------------------
>
>                 Key: SPARK-19035
>                 URL: https://issues.apache.org/jira/browse/SPARK-19035
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0, 2.0.1, 2.0.2
>            Reporter: Feng Yuan
>
> *In this case:*
>     select
>         case when a=1 then 1 else concat(a, cast(rand() as string)) end b,
>         count(1)
>     from
>         yuanfeng1_a
>     group by
>         case when a=1 then 1 else concat(a, cast(rand() as string)) end;
> *It throws this error:*
> Error in query: expression 'yuanfeng1_a.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
> Aggregate [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(519367429988179997) as string)) END], [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(8090243936131101651) as string)) END AS b#2074]
> +- MetastoreRelation default, yuanfeng1_a
> select case when a=1 then 1 else rand() end b, count(1) from yuanfeng1_a group by case when a=1 then rand() end also produces this error.
> *Notice*:
> If rand() is replaced with 1, it works.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
