I already set .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
to enable Kryo and .set("spark.kryo.registrationRequired", "true") to force
registration. Strangely, I still see the error about the missing DataType[].
Registering regular classes like Date via
.registerKryoClasses(Array(classOf[Date])) works just fine, but registering
the Spark-internal DataType[] does not work, and as far as I read the docs it
should be handled by Spark.

Vadim Semenov <vadim.seme...@datadoghq.com> wrote on Wed, Dec 21, 2016 at 17:12:

> To enable the Kryo serializer you just need to pass
> `spark.serializer=org.apache.spark.serializer.KryoSerializer`.
>
> `spark.kryo.registrationRequired` controls the following behavior:
>
>   Whether to require registration with Kryo. If set to 'true', Kryo will
>   throw an exception if an unregistered class is serialized. If set to
>   false (the default), Kryo will write unregistered class names along with
>   each object. Writing class names can cause significant performance
>   overhead, so enabling this option can enforce strictly that a user has
>   not omitted classes from registration.
>
> as described here: http://spark.apache.org/docs/latest/configuration.html
>
> If it's set to `true`, you need to manually register classes as described
> here: http://spark.apache.org/docs/latest/tuning.html#data-serialization
>
> On Wed, Dec 21, 2016 at 8:49 AM, geoHeil <georg.kf.hei...@gmail.com> wrote:
>
> > To force Spark to use Kryo serialization I set
> > spark.kryo.registrationRequired to true.
> >
> > Now Spark complains that: Class is not registered:
> > org.apache.spark.sql.types.DataType[] is not registered.
> > How can I fix this? So far I could not successfully register this class.
> >
> > --
> > View this message in context:
> > http://apache-spark-user-list.1001560.n3.nabble.com/Spark-kryo-serialization-register-Datatype-tp28243.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.