[ https://issues.apache.org/jira/browse/SPARK-38088?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michał Wieleba updated SPARK-38088:
-----------------------------------
    Attachment:     (was: image-2022-02-02-10-08-51-144.png)

> Kryo DataWritingSparkTaskResult registration error
> --------------------------------------------------
>
>                 Key: SPARK-38088
>                 URL: https://issues.apache.org/jira/browse/SPARK-38088
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.2
>            Reporter: Michał Wieleba
>            Priority: Major
>         Attachments: image-2022-02-02-10-09-14-858.png
>
>
> Spark 3.1.2, Scala 2.12
> I'm registering classes with the _sparkConf.registerKryoClasses(Array( ..._ 
> method. The application contains Spark Structured Streaming code. The 
> following settings are added as well:
> sparkConf.set("spark.serializer", 
> "org.apache.spark.serializer.KryoSerializer")
> sparkConf.set("spark.kryo.registrationRequired", "true")
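> A minimal sketch of this setup (the application class _MyEvent_ and the app 
> name are placeholders, not taken from the original report):
> {code:scala}
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
>
> case class MyEvent(id: Long, payload: String) // placeholder application class
>
> val sparkConf = new SparkConf()
>   .setAppName("kryo-registration-repro") // hypothetical app name
>   .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .set("spark.kryo.registrationRequired", "true")
>
> // Only classes visible to application code can be listed via classOf[...];
> // Spark-internal private classes cannot appear here.
> sparkConf.registerKryoClasses(Array(classOf[MyEvent]))
>
> val spark = SparkSession.builder().config(sparkConf).getOrCreate()
> {code}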
> Unfortunately, during execution the following error is thrown:
> Caused by: java.lang.IllegalArgumentException: Class is not registered: 
> org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTaskResult
> Note: To register this class use: 
> kryo.register(org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTaskResult.class);
>  
> As far as I can see in 
> [https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/WriteToDataSourceV2Exec.scala]
>  
> the class 
> org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTaskResult is 
> private (private[v2] case class DataWritingSparkTaskResult) and therefore 
> cannot be registered from application code.
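> A possible workaround (a sketch, not verified against 3.1.2): 
> _registerKryoClasses_ needs only runtime Class objects, not compile-time 
> visibility, so the private class can be loaded reflectively by name:
> {code:scala}
> // Class.forName can load the class even though it is private[v2] in Scala;
> // the Scala access restriction does not stop the JVM from loading it.
> sparkConf.registerKryoClasses(Array(
>   Class.forName("org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTaskResult")
> ))
> {code}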
>  
> !image-2022-02-02-10-09-14-858.png!
>  


