[ https://issues.apache.org/jira/browse/SPARK-30998?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-30998:
----------------------------------
    Affects Version/s:     (was: 2.4.6)
                       2.0.2
                       2.1.3
                       2.2.3
                       2.3.4
                       2.4.5

> ClassCastException when a generator having nested inner generators
> ------------------------------------------------------------------
>
>                 Key: SPARK-30998
>                 URL: https://issues.apache.org/jira/browse/SPARK-30998
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.3, 2.2.3, 2.3.4, 2.4.5, 3.0.0, 3.1.0
>            Reporter: Takeshi Yamamuro
>            Assignee: Takeshi Yamamuro
>            Priority: Major
>             Fix For: 3.0.0, 2.4.6
>
>
> The query below fails on master/branch-3.0/branch-2.4:
> {code}
> scala> sql("select array(array(1, 2), array(3)) 
> ar").select(explode(explode($"ar"))).show()
> 20/03/01 13:51:56 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)/ 
> 1]
> java.lang.ClassCastException: scala.collection.mutable.ArrayOps$ofRef cannot 
> be cast to org.apache.spark.sql.catalyst.util.ArrayData
>       at 
> org.apache.spark.sql.catalyst.expressions.ExplodeBase.eval(generators.scala:313)
>       at 
> org.apache.spark.sql.execution.GenerateExec.$anonfun$doExecute$8(GenerateExec.scala:108)
>       at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
>       at scala.collection.Iterator$ConcatIterator.hasNext(Iterator.scala:222)
>       at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
>     ...
> {code}
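>
> Nested generators are not supported in Spark SQL, so the flattening above has to be expressed as two separate explode steps. A minimal workaround sketch against the same example data (the "inner" alias is illustrative, not part of the original report):
> {code}
> // Workaround sketch: flatten in two steps instead of nesting explode calls
> // (explode and $ are in scope by default in spark-shell).
> val df = sql("select array(array(1, 2), array(3)) ar")
> df.select(explode($"ar").as("inner"))  // one row per inner array: [1, 2], [3]
>   .select(explode($"inner"))           // one row per element: 1, 2, 3
>   .show()
> {code}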


