[ https://issues.apache.org/jira/browse/SPARK-18504?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15685461#comment-15685461 ]

Nattavut Sutyanyong edited comment on SPARK-18504 at 11/22/16 2:32 AM:
-----------------------------------------------------------------------

Revise the reproduction script to be more concise

{code}
Seq((1,1),(1,2)).toDF("c1","c2").createOrReplaceTempView("t")
sql("select (select sum(-1) from t t2 where t1.c2=t2.c1 group by t2.c2) from t 
t1").show

+---------------------------+
|scalarsubquery((c2 = c1#5))|
+---------------------------+
|                         -1|
|                         -1|
|                       null|
+---------------------------+
{code}

The result should return only one "-1" row, not two.
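To see where the second "-1" comes from, here is a minimal sketch (assuming the same temp view "t" created above) that runs the inner query stand-alone with the outer value c2 = 1 substituted in:

{code}
// Sketch only: evaluate the correlated subquery for the outer row where t1.c2 = 1.
// GROUP BY t2.c2 produces two groups (c2 = 1 and c2 = 2), each with sum(-1) = -1,
// so the scalar subquery effectively yields two rows for that single outer row.
sql("select sum(-1) from t t2 where t2.c1 = 1 group by t2.c2").show
{code}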

This JIRA will block this type of query, where the column(s) in the GROUP BY clause of the correlated subquery are not part of the correlated predicates in its WHERE clause.
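
For comparison, a minimal sketch (again assuming the same temp view) of a shape that should remain allowed, because the GROUP BY column is exactly the column bound by the correlated predicate:

{code}
// Sketch only: grouping on t2.c1, which is equated with t1.c2 in the WHERE
// clause, so each outer row sees at most one group and the scalar subquery
// stays single-valued.
sql("select (select sum(-1) from t t2 where t1.c2=t2.c1 group by t2.c1) from t t1").show
{code}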



> Scalar subquery with extra group by columns returning incorrect result
> ----------------------------------------------------------------------
>
>                 Key: SPARK-18504
>                 URL: https://issues.apache.org/jira/browse/SPARK-18504
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Nattavut Sutyanyong
>



