boyuanzz commented on a change in pull request #11749:
URL: https://github.com/apache/beam/pull/11749#discussion_r464667015



##########
File path: sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
##########
@@ -906,19 +926,89 @@ public void setValueDeserializer(String valueDeserializer) {
       Coder<K> keyCoder = getKeyCoder(coderRegistry);
       Coder<V> valueCoder = getValueCoder(coderRegistry);
 
-      // Handles unbounded source to bounded conversion if maxNumRecords or maxReadTime is set.
-      Unbounded<KafkaRecord<K, V>> unbounded =
-          org.apache.beam.sdk.io.Read.from(
-              toBuilder().setKeyCoder(keyCoder).setValueCoder(valueCoder).build().makeSource());
+      // The Read will be expanded into SDF transform when "beam_fn_api" is enabled and
+      // "beam_fn_api_use_deprecated_read" is not enabled.
+      if (!ExperimentalOptions.hasExperiment(input.getPipeline().getOptions(), "beam_fn_api")
+          || ExperimentalOptions.hasExperiment(
+              input.getPipeline().getOptions(), "beam_fn_api_use_deprecated_read")) {
+        // Handles unbounded source to bounded conversion if maxNumRecords or maxReadTime is set.
+        Unbounded<KafkaRecord<K, V>> unbounded =
+            org.apache.beam.sdk.io.Read.from(
+                toBuilder().setKeyCoder(keyCoder).setValueCoder(valueCoder).build().makeSource());
+
+        PTransform<PBegin, PCollection<KafkaRecord<K, V>>> transform = unbounded;
+
+        if (getMaxNumRecords() < Long.MAX_VALUE || getMaxReadTime() != null) {
+          transform =
+              unbounded.withMaxReadTime(getMaxReadTime()).withMaxNumRecords(getMaxNumRecords());
+        }
 
-      PTransform<PBegin, PCollection<KafkaRecord<K, V>>> transform = unbounded;
+        return input.getPipeline().apply(transform);
+      } else {
+        ReadViaSDF<K, V, Manual> readTransform =
+            ReadViaSDF.<K, V, Manual>read()
+                .withConsumerConfigOverrides(getConsumerConfig())
+                .withOffsetConsumerConfigOverrides(getOffsetConsumerConfig())
+                .withConsumerFactoryFn(getConsumerFactoryFn())
+                .withKeyDeserializerProvider(getKeyDeserializerProvider())
+                .withValueDeserializerProvider(getValueDeserializerProvider())
+                .withManualWatermarkEstimator()
+                .withTimestampPolicyFactory(getTimestampPolicyFactory());
+        if (isCommitOffsetsInFinalizeEnabled()) {
+          readTransform = readTransform.commitOffsets();
+        }
 
-      if (getMaxNumRecords() < Long.MAX_VALUE || getMaxReadTime() != null) {
-        transform =
-            unbounded.withMaxReadTime(getMaxReadTime()).withMaxNumRecords(getMaxNumRecords());
+        return input
+            .getPipeline()
+            .apply(Impulse.create())
+            .apply(ParDo.of(new GenerateKafkaSourceDescription(this)))
+            .setCoder(SerializableCoder.of(KafkaSourceDescription.class))
+            .apply(readTransform)
+            .setCoder(KafkaRecordCoder.of(keyCoder, valueCoder));

Review comment:
       The coder needs to be set explicitly for the cross-language (x-lang) case. It seems like something is not working correctly when x-lang expands the transform, so pinning the coder here avoids relying on coder inference; a rough sketch is below.
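
       For context, a minimal sketch (not part of this change; the broker address, topic name, and class name are placeholders) of a pipeline that would take the new SDF branch once "beam_fn_api" is enabled. Because expand() sets KafkaRecordCoder on the output explicitly, the resulting PCollection carries the same coder whether it is expanded in-process or through a cross-language expansion service:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.io.kafka.KafkaRecord;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaSdfReadSketch {
  public static void main(String[] args) {
    // Enabling "beam_fn_api" (without "beam_fn_api_use_deprecated_read")
    // makes the expand() above take the SDF-based branch.
    PipelineOptions opts =
        PipelineOptionsFactory.fromArgs("--experiments=beam_fn_api").create();
    Pipeline p = Pipeline.create(opts);

    PCollection<KafkaRecord<Long, String>> records =
        p.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("localhost:9092") // placeholder broker address
                .withTopic("my-topic")                  // placeholder topic name
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class));

    // The expand() change above sets KafkaRecordCoder on `records` explicitly,
    // so downstream (including cross-language expansion) never needs to infer it.
    p.run().waitUntilFinish();
  }
}
```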




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

