[ 
https://issues.apache.org/jira/browse/FLINK-35596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Venkata krishnan Sowrirajan updated FLINK-35596:
------------------------------------------------
    Affects Version/s: 1.18.0
                           (was: 1.19.0)

> Flink application fails with The implementation of the BlockElement is not 
> serializable
> ---------------------------------------------------------------------------------------
>
>                 Key: FLINK-35596
>                 URL: https://issues.apache.org/jira/browse/FLINK-35596
>             Project: Flink
>          Issue Type: Bug
>    Affects Versions: 1.18.0
>            Reporter: Venkata krishnan Sowrirajan
>            Priority: Major
>              Labels: pull-request-available
>
>  
> A Flink application fails with
> _org.apache.flink.api.common.InvalidProgramException: The implementation of
> the BlockElement is not serializable. The object probably contains or
> references non serializable fields._
> As part of [[FLINK-33058][formats] Add encoding option to Avro
> format|https://github.com/apache/flink/pull/23395/files#top], a new
> _AvroEncoding_ enum was introduced; it uses _TextElement_ to format its
> description for the documentation.
> The enum is used internally by _AvroRowDataSerializationSchema_ and
> _AvroRowDataDeserializationSchema_, which need to be serialized, but
> _BlockElement_ is not serializable, so the serializability check fails.
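> 
> For illustration, here is a minimal, self-contained sketch of the failing
> shape (hypothetical stand-in types, not the actual Flink classes): an enum
> constant eagerly stores a non-serializable description element in an instance
> field, and a recursive field check in the spirit of Flink's ClosureCleaner
> then rejects any serializable object that references the enum.
> {code:java}
> // Hypothetical stand-ins; only the structure mirrors the reported problem.
> import java.io.Serializable;
> import java.lang.reflect.Field;
> import java.lang.reflect.Modifier;
> 
> public class NonSerializableEnumFieldSketch {
> 
>     // Stand-in for a documentation element such as TextElement/BlockElement.
>     static final class DescriptionElement { // does NOT implement Serializable
>         final String text;
>         DescriptionElement(String text) { this.text = text; }
>     }
> 
>     // Stand-in for AvroEncoding: each constant eagerly stores its description.
>     enum Encoding {
>         BINARY("Binary Avro encoding"),
>         JSON("JSON Avro encoding");
> 
>         private final DescriptionElement description; // non-serializable field
> 
>         Encoding(String text) { this.description = new DescriptionElement(text); }
>     }
> 
>     // Stand-in for AvroRowDataSerializationSchema: serializable, references the enum.
>     static final class SerializationSchema implements Serializable {
>         private final Encoding encoding = Encoding.BINARY;
>     }
> 
>     // Rough imitation of a recursive serializability check (no cycle handling).
>     static void ensureFieldsSerializable(Object obj) throws Exception {
>         for (Field f : obj.getClass().getDeclaredFields()) {
>             if (Modifier.isStatic(f.getModifiers()) || f.getType().isPrimitive()) {
>                 continue;
>             }
>             f.setAccessible(true);
>             Object value = f.get(obj);
>             if (value == null) {
>                 continue;
>             }
>             if (!(value instanceof Serializable)) {
>                 throw new IllegalStateException(
>                         "The implementation of the " + value.getClass().getSimpleName()
>                                 + " is not serializable.");
>             }
>             ensureFieldsSerializable(value);
>         }
>     }
> 
>     public static void main(String[] args) throws Exception {
>         // Fails on Encoding.BINARY's 'description' field, mirroring the reported error.
>         ensureFieldsSerializable(new SerializationSchema());
>     }
> }
> {code}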
> {code:java}
> org.apache.flink.api.common.InvalidProgramException: The implementation of the BlockElement is not serializable. The object probably contains or references non serializable fields.
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:164)
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
>     at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
>     at org.apache.flink.connector.kafka.sink.KafkaSinkBuilder.setRecordSerializer(KafkaSinkBuilder.java:152)
>     at org.apache.flink.streaming.connectors.kafka.table.KafkaDynamicSink.getSinkRuntimeProvider(KafkaDynamicSink.java:207)
>     at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.createSinkTransformation(CommonExecSink.java:150)
>     at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:176)
>     at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:159)
>     at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToPlan$1(StreamPlanner.scala:85)
>     at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
>     at scala.collection.Iterator.foreach(Iterator.scala:937)
>     at scala.collection.Iterator.foreach$(Iterator.scala:937)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
>     at scala.collection.IterableLike.foreach(IterableLike.scala:70)
>     at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>     at scala.collection.TraversableLike.map(TraversableLike.scala:233)
>     at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
>     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>     at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:84)
>     at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:197)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1733)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:825)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:918)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:730)
>     at org.apache.flink.streaming.connectors.kafka.table.KafkaTableITCase.testKafkaSourceSink(KafkaTableITCase.java:140)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at org.apache.flink.util.TestNameProvider$1.evaluate(TestNameProvider.java:45)
>     at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>     at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
> {code}
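> 
> One possible way to avoid tripping the check (a sketch only, under the
> assumption that the enum does not actually need to hold the element; not
> necessarily what the linked pull request does) is to keep only serializable
> state in the enum and build the description element on demand:
> {code:java}
> // Hypothetical rework of the stand-in enum above: the constant keeps a plain
> // String and creates the non-serializable element only when it is asked for,
> // so serializable objects referencing the enum stay serializable.
> public class LazyDescriptionSketch {
> 
>     static final class DescriptionElement { // still not Serializable, but never stored in a field
>         final String text;
>         DescriptionElement(String text) { this.text = text; }
>     }
> 
>     enum Encoding {
>         BINARY("Binary Avro encoding"),
>         JSON("JSON Avro encoding");
> 
>         private final String descriptionText; // serializable enum state only
> 
>         Encoding(String descriptionText) { this.descriptionText = descriptionText; }
> 
>         // The element exists only while documentation is being rendered.
>         DescriptionElement getDescription() {
>             return new DescriptionElement(descriptionText);
>         }
>     }
> 
>     public static void main(String[] args) {
>         System.out.println(Encoding.BINARY.getDescription().text);
>     }
> }
> {code}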
>  


