charlespnh opened a new issue, #35190:
URL: https://github.com/apache/beam/issues/35190

   ### What happened?
   
   ```
   pipeline:
     type: chain
     transforms:
       - type: ReadFromKafka
         name: ReadFromMyTopic
         config:
           format: STRING
           topic: test
           bootstrap_servers: kafka:9092 # 10.128.0.13:9092
           auto_offset_reset_config: earliest
   
       - type: LogForTesting
   ```
   
   Running the pipeline (Beam 2.65.0) gives this unexpected error:
   ```
  File "/home/beam/scripts/kafka-iceberg-spark/local/env/lib/python3.10/site-packages/apache_beam/yaml/yaml_transform.py", line 509, in expand_leaf_transform
      raise ValueError(
  ValueError: Error applying transform "ReadFromMyTopic" at line 4: java.lang.NullPointerException: To read from Kafka in AVRO format, you must provide a schema.
          at org.apache.beam.vendor.guava.v32_1_2_jre.com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
          at org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.validate(KafkaReadSchemaTransformConfiguration.java:83)
          at org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider$KafkaReadSchemaTransform.expand(KafkaReadSchemaTransformProvider.java:156)
          at org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider$KafkaReadSchemaTransform.expand(KafkaReadSchemaTransformProvider.java:133)
          at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:559)
          at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
          at org.apache.beam.sdk.values.PCollectionRowTuple.apply(PCollectionRowTuple.java:215)
          at org.apache.beam.sdk.managed.ManagedSchemaTransformProvider$ManagedSchemaTransform.expand(ManagedSchemaTransformProvider.java:184)
          at org.apache.beam.sdk.managed.ManagedSchemaTransformProvider$ManagedSchemaTransform.expand(ManagedSchemaTransformProvider.java:155)
          at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:559)
          at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:507)
          at org.apache.beam.sdk.expansion.service.TransformProvider.apply(TransformProvider.java:121)
          at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:657)
          at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:758)
          at org.apache.beam.model.expansion.v1.ExpansionServiceGrpc$MethodHandlers.invoke(ExpansionServiceGrpc.java:306)
          at org.apache.beam.vendor.grpc.v1p69p0.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)
          at org.apache.beam.vendor.grpc.v1p69p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:356)
          at org.apache.beam.vendor.grpc.v1p69p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:861)
          at org.apache.beam.vendor.grpc.v1p69p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
          at org.apache.beam.vendor.grpc.v1p69p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
          at java.base/java.lang.Thread.run(Thread.java:840)


  java.lang.NullPointerException: To read from Kafka in AVRO format, you must provide a schema.
   ```
   
   The validation in `KafkaReadSchemaTransformConfiguration` doesn't handle the STRING format here: https://github.com/apache/beam/blob/master/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaReadSchemaTransformConfiguration.java#L50-L85. A STRING read falls through to the AVRO branch, which requires a schema, hence the NullPointerException above.
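   For illustration, here is a minimal, self-contained sketch of the suspected gap (hypothetical class and method names; this is not the actual Beam source): treating STRING, like RAW, as a schema-less format in the validation step would avoid the NPE.

   ```java
   // Sketch of the suspected validation gap, modeled loosely on
   // KafkaReadSchemaTransformConfiguration.validate(). Hypothetical, not Beam source.
   public class KafkaReadValidationSketch {

     // Formats whose payloads need no user-provided schema.
     static boolean isSchemaless(String format) {
       return "RAW".equals(format) || "STRING".equals(format);
     }

     static void validate(String format, String schema) {
       if (!isSchemaless(format) && schema == null) {
         // Without the STRING branch above, this is the path a STRING read hits today.
         throw new NullPointerException(
             "To read from Kafka in AVRO format, you must provide a schema.");
       }
     }

     public static void main(String[] args) {
       // No longer throws once STRING is treated as schema-less.
       validate("STRING", null);
       System.out.println("STRING without schema: ok");
     }
   }
   ```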
   
   ### Issue Priority
   
   Priority: 2 (default / most bugs should be filed as P2)
   
   ### Issue Components
   
   - [ ] Component: Python SDK
   - [ ] Component: Java SDK
   - [ ] Component: Go SDK
   - [ ] Component: Typescript SDK
   - [ ] Component: IO connector
   - [x] Component: Beam YAML
   - [ ] Component: Beam examples
   - [ ] Component: Beam playground
   - [ ] Component: Beam katas
   - [ ] Component: Website
   - [ ] Component: Infrastructure
   - [ ] Component: Spark Runner
   - [ ] Component: Flink Runner
   - [ ] Component: Samza Runner
   - [ ] Component: Twister2 Runner
   - [ ] Component: Hazelcast Jet Runner
   - [ ] Component: Google Cloud Dataflow Runner

