[ https://issues.apache.org/jira/browse/SPARK-32283?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Lorenz Bühmann updated SPARK-32283:
-----------------------------------
    Priority: Minor  (was: Major)

> Multiple Kryo registrators can't be used anymore
> ------------------------------------------------
>
>                 Key: SPARK-32283
>                 URL: https://issues.apache.org/jira/browse/SPARK-32283
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Lorenz Bühmann
>            Priority: Minor
>
> This is a regression in Spark 3.0; it works in Spark 2.
> According to the docs, it should be possible to register multiple Kryo registrators via the Spark config option spark.kryo.registrator.
> In Spark 3.0 the code to parse Kryo config options has been refactored into the Scala class [Kryo|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/config/Kryo.scala]. The code to parse the registrators is in [Line 29-32|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/config/Kryo.scala#L29-L32]:
> {code:scala}
> val KRYO_USER_REGISTRATORS = ConfigBuilder("spark.kryo.registrator")
>   .version("0.5.0")
>   .stringConf
>   .createOptional
> {code}
> but it should be
> {code:scala}
> val KRYO_USER_REGISTRATORS = ConfigBuilder("spark.kryo.registrator")
>   .version("0.5.0")
>   .stringConf
>   .toSequence
>   .createOptional
> {code}
> to split the comma-separated list.
> In Spark 2.x this was done differently, directly in [KryoSerializer Line 77-79|https://github.com/apache/spark/blob/branch-2.4/core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala#L77-L79]:
> {code:scala}
> private val userRegistrators = conf.get("spark.kryo.registrator", "")
>   .split(',').map(_.trim)
>   .filter(!_.isEmpty)
> {code}
> Hope this helps.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
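For illustration, here is a minimal standalone sketch (not actual Spark code; object and method names are made up) of the Spark 2.x-style parsing that this issue asks to restore: split the configured value on commas, trim whitespace, and drop empty entries, so that a value like "com.example.RegA,com.example.RegB" yields two registrator class names.

{code:scala}
// Hypothetical standalone sketch of the Spark 2.x parsing behavior
// quoted above (KryoSerializer lines 77-79): comma-split, trim, drop empties.
object RegistratorParsing {
  def parseRegistrators(raw: String): Seq[String] =
    raw.split(',')
      .map(_.trim)         // tolerate spaces around commas
      .filter(_.nonEmpty)  // an empty or blank config yields no registrators
      .toSeq

  def main(args: Array[String]): Unit = {
    val parsed = parseRegistrators("com.example.RegA, com.example.RegB,")
    println(parsed.mkString("|"))
  }
}
{code}

With this behavior, a trailing comma or an empty spark.kryo.registrator value is harmless, which matches what users relied on in Spark 2.x.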