[ 
https://issues.apache.org/jira/browse/SPARK-16792?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15736773#comment-15736773
 ] 

Apache Spark commented on SPARK-16792:
--------------------------------------

User 'michalsenkyr' has created a pull request for this issue:
https://github.com/apache/spark/pull/16240

> Dataset containing a Case Class with a List type causes a CompileException 
> (converting sequence to list)
> --------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16792
>                 URL: https://issues.apache.org/jira/browse/SPARK-16792
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Jamie Hutton
>            Priority: Critical
>
> The issue occurs when we run a .map over a Dataset containing a case class with
> a List in it. A self-contained test case is below:
>
> case class TestCC(key: Int, letters: List[String]) // List causes the issue - a Seq/Array works fine
>
> /* simple test data */
> val ds1 = sc.makeRDD(Seq(
>   List("D"),
>   List("S", "H"),
>   List("F", "H"),
>   List("D", "L", "L")
> )).map(x => (x.length, x)).toDF("key", "letters").as[TestCC]
>
> // This will fail
> val test1 = ds1.map(_.key)
> test1.show
> Error: 
> Caused by: org.codehaus.commons.compiler.CompileException: File 
> 'generated.java', Line 72, Column 70: No applicable constructor/method found 
> for actual parameters "int, scala.collection.Seq"; candidates are: 
> "TestCC(int, scala.collection.immutable.List)"
> It seems to be internally converting the List to a Seq, and then it can't
> convert it back...
> If you change the List[String] to Seq[String] or Array[String], the issue
> doesn't appear.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
