Are you registering multiple RDDs of case classes as tables concurrently?
You may be hitting SPARK-2178
<https://issues.apache.org/jira/browse/SPARK-2178>, which is caused by
SI-6240 <https://issues.scala-lang.org/browse/SI-6240>.
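
If that's the case, one temporary workaround until the fix lands is to make
sure only one thread registers tables at a time.  A rough, untested sketch
(the record type, table name, and lock are placeholders for whatever your
code does):

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.SQLContext

    // Hypothetical record type standing in for your protobuf-derived classes.
    case class Person(id: Long, name: String)

    object SafeRegistration {
      // One JVM-wide lock, so only a single thread at a time runs the
      // reflective schema inference that SI-6240 makes thread-unsafe.
      private val lock = new Object

      def register(sqlContext: SQLContext, rdd: RDD[Person], name: String): Unit =
        lock.synchronized {
          import sqlContext.createSchemaRDD  // implicit conversion to SchemaRDD
          rdd.registerAsTable(name)
        }
    }

This only serializes the registrations themselves; once SPARK-2178 is fixed
it should be unnecessary.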


On Tue, Jul 15, 2014 at 10:49 AM, Keith Simmons <keith.simm...@gmail.com>
wrote:

> Hi folks,
>
> I'm running into the following error when trying to perform a join in my
> code:
>
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.spark.sql.catalyst.types.LongType$
>
> I see similar errors for StringType$ and also:
>
>  scala.reflect.runtime.ReflectError: value apache is not a package.
>
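> The join itself is nothing exotic.  It looks roughly like this (table and
> field names changed, so treat it as a sketch):
>
>     people.registerAsTable("people")
>     orders.registerAsTable("orders")
>     val joined = sqlContext.sql(
>       "SELECT p.name, o.amount FROM people p JOIN orders o ON p.id = o.personId")
>     joined.collect().foreach(println)
>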
> Strangely, if I just work with a single table, everything is fine. I can
> iterate through the records in both tables and print them out without a
> problem.
>
> Furthermore, this code worked without an exception in Spark 1.0.0 (though
> the join caused some field corruption, possibly related to
> https://issues.apache.org/jira/browse/SPARK-1994).
> The data comes from a custom protocol-buffer-based format on HDFS and is
> mapped into the individual record types without a problem.
>
> The immediate cause seems to be a task trying to deserialize one or more
> SQL case classes before loading the Spark uber jar, but I have no idea why
> this is happening, or why it only happens when I do a join. Ideas?
>
> Keith
>
> P.S. If it's relevant, we're using the Kryo serializer.
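>
> For reference, Kryo is enabled the standard way (the app name and the
> registrator class below are just stand-ins for ours):
>
>     import org.apache.spark.{SparkConf, SparkContext}
>
>     val conf = new SparkConf()
>       .setAppName("join-test")
>       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>       .set("spark.kryo.registrator", "com.example.MyKryoRegistrator")
>     val sc = new SparkContext(conf)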
>
