[ https://issues.apache.org/jira/browse/SPARK-39167?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17538193#comment-17538193 ]

Apache Spark commented on SPARK-39167:
--------------------------------------

User 'panbingkun' has created a pull request for this issue:
https://github.com/apache/spark/pull/36580

> Throw an exception w/ an error class for multiple rows from a subquery used as an expression
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39167
>                 URL: https://issues.apache.org/jira/browse/SPARK-39167
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Max Gekk
>            Priority: Major
>
> Users can trigger an illegal state exception with the following SQL statement, in which the scalar subquery returns two rows instead of the expected single row:
> {code:sql}
> > select (select a from (select 1 as a union all select 2 as a) t) as b
> {code}
> {code:java}
> Caused by: java.lang.IllegalStateException: more than one row returned by a subquery used as an expression:
> Subquery subquery#242, [id=#100]
> +- AdaptiveSparkPlan isFinalPlan=true
>    +- == Final Plan ==
>       Union
>       :- *(1) Project [1 AS a#240]
>       :  +- *(1) Scan OneRowRelation[]
>       +- *(2) Project [2 AS a#241]
>          +- *(2) Scan OneRowRelation[]
>    +- == Initial Plan ==
>       Union
>       :- Project [1 AS a#240]
>       :  +- Scan OneRowRelation[]
>       +- Project [2 AS a#241]
>          +- Scan OneRowRelation[]
>       at org.apache.spark.sql.execution.ScalarSubquery.updateResult(subquery.scala:83)
> {code}
> but such exceptions are not supposed to be visible to users. We need to introduce an error class (or re-use an existing one) and replace the IllegalStateException with an exception that carries it.
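> A minimal sketch of the intended direction, not the actual patch: the error class name MULTI_VALUE_SUBQUERY_ERROR, the helper name checkSingleRow, and the SparkException constructor shape below are assumptions for illustration. The idea is to raise a SparkException tagged with a registered error class instead of an IllegalStateException:
> {code:scala}
> import org.apache.spark.SparkException
> import org.apache.spark.sql.catalyst.InternalRow
> import org.apache.spark.sql.execution.SparkPlan
>
> // Hedged sketch of the check inside ScalarSubquery.updateResult
> // (subquery.scala:83). "MULTI_VALUE_SUBQUERY_ERROR" is a hypothetical
> // error class name; a real one would have to be registered in
> // error-classes.json.
> def checkSingleRow(rows: Array[InternalRow], plan: SparkPlan): Unit = {
>   if (rows.length > 1) {
>     // Throw an error-classed exception instead of IllegalStateException,
>     // carrying the offending plan as a message parameter.
>     throw new SparkException(
>       errorClass = "MULTI_VALUE_SUBQUERY_ERROR",
>       messageParameters = Map("plan" -> plan.toString),
>       cause = null)
>   }
> }
> {code}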


