Zach Kull created SPARK-18139:
---------------------------------

             Summary: Dataset mapGroups with return type Seq[Product] produces scala.ScalaReflectionException: object $line262.$read not found
                 Key: SPARK-18139
                 URL: https://issues.apache.org/jira/browse/SPARK-18139
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.0.1
            Reporter: Zach Kull
mapGroups on a Dataset fails if the return type is a plain Seq[Product]. It succeeds if the return type is more complex, e.g. Seq[(Int, Product)]. See the following code sample:

{code}
case class A(b: Int, c: Int)

// Sample Dataset[A]
val ds = ss.createDataset(Seq(A(1,2), A(2,2)))

// The following aggregation should produce a Dataset[Seq[A]], but FAILS with
// scala.ScalaReflectionException: object $line262.$read not found.
val ds2 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (i.toSeq) }

// Produces Dataset[(Int, Seq[A])] -> OK
val ds1 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (g, i.toSeq) }

// Reproducible when manually creating the corresponding encoder
val e = newProductSeqEncoder[A]
{code}

Full exception:

{code}
scala.ScalaReflectionException: object $line262.$read not found.
  at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:162)
  at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:22)
  at $typecreator4$1.apply(<console>:116)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
  at org.apache.spark.sql.SQLImplicits$$typecreator9$1.apply(SQLImplicits.scala:125)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:49)
  at org.apache.spark.sql.SQLImplicits.newProductSeqEncoder(SQLImplicits.scala:125)
  ... 75 elided
{code}
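For reference, a standalone sketch of the two derivations follows. It is an illustration, not part of the original report: the object wrapper, `local[*]` master, and app name are assumptions, and it presumes Spark 2.0.x on the classpath. Since the exception references `$line262.$read` (a spark-shell synthetic wrapper object), the failure may be specific to the REPL's class wrapping, so a compiled reproduction is a useful point of comparison:

```scala
import org.apache.spark.sql.SparkSession

// Same case class as in the report; defined at top level here
// (not inside the shell) for this compiled sketch.
case class A(b: Int, c: Int)

object Spark18139Repro {
  def main(args: Array[String]): Unit = {
    val ss = SparkSession.builder()
      .master("local[*]")          // assumption: local run for the repro
      .appName("SPARK-18139")
      .getOrCreate()
    import ss.implicits._

    val ds = ss.createDataset(Seq(A(1, 2), A(2, 2)))

    // Works: the result is wrapped in a tuple, so the tuple encoder
    // is derived and the Seq[A] is handled as one of its fields.
    val ok = ds.groupByKey(_.b).mapGroups { case (g, i) => (g, i.toSeq) }
    ok.collect()

    // Fails in the shell with ScalaReflectionException; the implicit
    // newProductSeqEncoder[A] derivation is what throws there.
    val failing = ds.groupByKey(_.b).mapGroups { case (_, i) => i.toSeq }
    failing.collect()

    ss.stop()
  }
}
```

Both `mapGroups` calls require an implicit `Encoder` for their result type; the difference is only which implicit in `SQLImplicits` is selected (`newProductSeqEncoder` for the bare `Seq[A]` case).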