Hello All,

If Kryo serialization is enabled, doesn't Spark take care of registering its
built-in classes, i.e., are we supposed to register only our own custom
classes?
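
For context, Kryo is enabled in my conf in the usual way, and I believe
spark.kryo.registrationRequired is what makes unregistered classes throw
rather than just being serialized with their full class names (a sketch of
my setup):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // with this set, Kryo throws on any class that has not been registered
      .set("spark.kryo.registrationRequired", "true")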

When using DataFrames, this does not seem to be the case. I had to register
the following classes:

conf.registerKryoClasses(Array(
  classOf[org.apache.spark.sql.types.StructType],
  classOf[org.apache.spark.sql.types.StructField],
  classOf[Array[org.apache.spark.sql.types.StructField]],
  classOf[org.apache.spark.sql.types.LongType$],
  classOf[org.apache.spark.sql.types.Metadata],
  classOf[scala.collection.immutable.Map$EmptyMap$],
  classOf[org.apache.spark.sql.catalyst.InternalRow],
  classOf[Array[org.apache.spark.sql.catalyst.InternalRow]],
  classOf[org.apache.spark.sql.catalyst.expressions.UnsafeRow],
  classOf[Array[org.apache.spark.sql.catalyst.expressions.UnsafeRow]],
  Class.forName("org.apache.spark.sql.execution.joins.UnsafeHashedRelation"),
  Class.forName("java.util.HashMap"),
  classOf[scala.reflect.ClassTag$$anon$1],
  Class.forName("java.lang.Class"),
  Class.forName("org.apache.spark.sql.execution.columnar.CachedBatch")))

Even with these registered, I got the following exception:

com.esotericsoftware.kryo.KryoException:
java.lang.IllegalArgumentException: Class is not registered: byte[][]

But byte is a primitive, not a class, so I couldn't register it the same
way; the compiler complains that byte is not a class. How can I register
byte[][] in Scala?
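
My current guess, assuming Scala's Array[Array[Byte]] compiles down to
Java's byte[][], is that it can be registered like this (it compiles, but
I am not sure it is the right mapping):

    // assumption: Array[Array[Byte]] erases to Java's byte[][]
    conf.registerKryoClasses(Array(classOf[Array[Array[Byte]]]))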

Does this point to some other issue?

In some other posts, I noticed the use of kryo.register(). In that case, how
do we pass the Kryo object to the SparkContext?
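
From the documentation, my understanding is that we never hand the Kryo
object to the SparkContext ourselves; instead we point
spark.kryo.registrator at a KryoRegistrator subclass, and Spark calls it
with its own Kryo instance. A sketch, where the class and package names are
just placeholders:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    // hypothetical registrator; Spark instantiates this on the driver and executors
    class MyKryoRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[Array[Array[Byte]]])
      }
    }

    conf.set("spark.kryo.registrator", "com.example.MyKryoRegistrator")

Is that the intended mechanism?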

Thanks in advance.

Regards,
Raghava.