[ https://issues.apache.org/jira/browse/SPARK-8288?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16181716#comment-16181716 ]
Drew Robb commented on SPARK-8288:
----------------------------------

I do not yet have a fully working fix. I think the best approach might instead be to change things on the Scrooge end.

> ScalaReflection should also try apply methods defined in companion objects when inferring schema from a Product type
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8288
>                 URL: https://issues.apache.org/jira/browse/SPARK-8288
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Cheng Lian
>
> This ticket is derived from PARQUET-293 (which actually describes a Spark SQL issue).
> My comment on that issue is quoted below:
> {quote}
> ... The reason for this exception is that the Scala code Scrooge generates is actually a trait extending {{Product}}:
> {code}
> trait Junk
>   extends ThriftStruct
>   with scala.Product2[Long, String]
>   with java.io.Serializable
> {code}
> while Spark expects a case class, something like:
> {code}
> case class Junk(junkID: Long, junkString: String)
> {code}
> The key difference here is that the latter case class version has a constructor whose arguments can be transformed into fields of the DataFrame schema. The exception was thrown because Spark can't find such a constructor in trait {{Junk}}.
> {quote}
> We can make {{ScalaReflection}} try {{apply}} methods in companion objects, so that trait types generated by Scrooge can also be used for Spark SQL schema inference.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
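As a rough illustration of the proposal in the ticket, the sketch below (not Spark's actual {{ScalaReflection}} code; the {{Junk}} trait and its companion are hypothetical stand-ins for Scrooge output) shows how a companion object's {{apply}} method can be located via Scala runtime reflection and its parameter names and types read off, which is exactly the information schema inference needs when no case-class constructor exists:

```scala
import scala.reflect.runtime.{universe => ru}

// Hypothetical Scrooge-style type: a trait extending Product2, not a case class.
trait Junk extends Product2[Long, String] with Serializable {
  def junkID: Long = _1
  def junkString: String = _2
}

// Companion object with an apply method carrying the field names and types.
object Junk {
  def apply(junkID: Long, junkString: String): Junk = new Junk {
    val _1 = junkID
    val _2 = junkString
    def canEqual(that: Any): Boolean = that.isInstanceOf[Junk]
  }
}

object ApplyInference {
  // Sketch of the idea: find the companion's apply method and extract
  // (parameter name, parameter type) pairs, analogous to reading a
  // case-class constructor's arguments.
  def applyParams[T: ru.TypeTag]: List[(String, ru.Type)] = {
    val companion   = ru.typeOf[T].companion
    val applyMethod = companion.decl(ru.TermName("apply")).asMethod
    applyMethod.paramLists.head.map(p => (p.name.toString, p.typeSignature))
  }
}
```

With this in hand, `ApplyInference.applyParams[Junk]` yields the pairs `(junkID, Long)` and `(junkString, String)`, from which a schema could be derived just as it is for a case class today.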