Those funny class names come from Scala's specialization -- it's compiling a
different version of OpenHashMap for each primitive you stick in the type
parameter.  Here's a super simple example:

➜  ~  more Foo.scala
class Foo[@specialized X]

➜  ~  scalac Foo.scala

➜  ~  ls Foo*.class
Foo$mcB$sp.class Foo$mcC$sp.class Foo$mcD$sp.class Foo$mcF$sp.class
Foo$mcI$sp.class Foo$mcJ$sp.class Foo$mcS$sp.class Foo$mcV$sp.class
Foo$mcZ$sp.class Foo.class
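
Each suffix letter names the primitive that variant is specialized for (I = Int,
J = Long, D = Double, Z = Boolean, and so on). Here's a minimal sketch of how one
of those variants shows up at runtime, assuming the Foo class compiled above is
on the classpath:

    // Instantiating Foo with a statically-known primitive type argument
    // makes the compiler pick the specialized subclass.
    object SpecializationDemo {
      def main(args: Array[String]): Unit = {
        val f = new Foo[Int]
        println(f.getClass.getName)  // should print Foo$mcI$sp
      }
    }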

Sadly, I'm not sure of a foolproof way of getting all those specialized
versions registered except for registering them by these strange names.
Here's an example of how it's done by chill for Tuples (which is what Spark
relies on for its own registration of tuples):

https://github.com/twitter/chill/blob/6d03f6976f33f6e2e16b8e254fead1625720c281/chill-scala/src/main/scala/com/twitter/chill/TupleSerializers.scala#L861
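
If you just need the one class from your error message registered, the simplest
thing is probably to look it up by its mangled name. A rough sketch, using
com.example.base.OpenHashMap as a placeholder for your actual fully-qualified
class (substitute the name reported in the "Class is not registered" error):

    import org.apache.spark.SparkConf

    object KryoRegistration {
      def buildConf(): SparkConf =
        new SparkConf().registerKryoClasses(Array(
          // the generic (unspecialized) class
          Class.forName("com.example.base.OpenHashMap"),
          // the Int-specialized variant named in the error
          Class.forName("com.example.base.OpenHashMap$mcI$sp")
        ))
    }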

On Mon, Mar 30, 2015 at 3:59 PM, Arun Lists <lists.a...@gmail.com> wrote:

> I am trying to register classes with KryoSerializer. I get the following
> error message:
>
> How do I find out what class is being referred to by: OpenHashMap$mcI$sp ?
>
> com.esotericsoftware.kryo.KryoException:
> java.lang.IllegalArgumentException: Class is not registered:
> com.comp.common.base.OpenHashMap$mcI$sp
>
> Note: To register this class use:
> kryo.register(com.dtex.common.base.OpenHashMap$mcI$sp.class);
>
> I have registered other classes with it by using:
>
> sparkConf.registerKryoClasses(Array(
>
>   classOf[MyClass]
>
> ))
>
>
> Thanks,
>
> arun
>
>
>
