[ https://issues.apache.org/jira/browse/SPARK-1845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-1845.
-------------------------------------
    Resolution: Fixed

> Use AllScalaRegistrar for SparkSqlSerializer to register serializers of Scala collections.
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-1845
>                 URL: https://issues.apache.org/jira/browse/SPARK-1845
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Takuya Ueshin
>            Assignee: Takuya Ueshin
>
> When I execute {{orderBy}} or {{limit}} on a {{SchemaRDD}} containing {{ArrayType}} or {{MapType}} columns, {{SparkSqlSerializer}} throws one of the following exceptions:
> {quote}
> com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): scala.collection.immutable.$colon$colon
> {quote}
> or
> {quote}
> com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): scala.collection.immutable.Vector
> {quote}
> or
> {quote}
> com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): scala.collection.immutable.HashMap$HashTrieMap
> {quote}
> and so on.
> This happens because {{SparkSqlSerializer}} does not register serializers for the concrete Scala collection classes.
> I believe it should use {{AllScalaRegistrar}}, which registers serializers for the concrete {{Seq}} and {{Map}} classes backing {{ArrayType}} and {{MapType}}.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
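The fix described above can be sketched as follows. This is a hedged, simplified illustration (not the actual Spark patch): it assumes the Twitter chill library is on the classpath and shows how applying its {{AllScalaRegistrar}} to a Kryo instance registers serializers for concrete Scala collection classes such as {{scala.collection.immutable.$colon$colon}}, so Kryo no longer requires a no-arg constructor for them. The method name {{newKryo}} mirrors the usual Spark serializer convention but is used here only for illustration.

```scala
import com.esotericsoftware.kryo.Kryo
import com.twitter.chill.AllScalaRegistrar

// Hypothetical simplification of a Kryo factory like the one in
// SparkSqlSerializer. The key line is the AllScalaRegistrar call:
// it registers serializers for the concrete immutable/mutable Seq,
// Map, and Set implementations (::, Vector, HashTrieMap, ...),
// which otherwise fail with "missing no-arg constructor".
def newKryo(): Kryo = {
  val kryo = new Kryo()
  // ... register Spark SQL-specific classes here ...
  new AllScalaRegistrar().apply(kryo)
  kryo.setReferences(false)
  kryo
}
```

With this registration in place, serializing a row value such as {{List(1, 2, 3)}} (a {{::}} instance) or a {{Vector}} goes through chill's collection serializers instead of Kryo's default instantiation path, which is what eliminates the exceptions quoted in the issue.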