Hi Amir,

BEAM-103 affects the runtime execution of the Beam sources (they are now
parallelized), whereas the error above occurs during the translation phase of
the Beam program into the Flink API. I couldn't reproduce the error; I ran the
following program and it printed incoming Kafka records:

// Configure the pipeline to run in streaming mode on the Flink runner.
FlinkPipelineOptions options =
    PipelineOptionsFactory.fromArgs(args).as(FlinkPipelineOptions.class);
options.setStreaming(true);
options.setRunner(FlinkPipelineRunner.class);

final Pipeline p = Pipeline.create(options);

// Read from the "test" topic on a local broker, dropping the Kafka metadata.
final PCollection<KV<String, byte[]>> input = p.apply(KafkaIO.read()
    .withTopics(ImmutableList.of("test"))
    .withKeyCoder(StringUtf8Coder.of())
    .withBootstrapServers("localhost:9092")
    .withoutMetadata());

// Print every incoming record.
input.apply(ParDo.of(new DoFn<KV<String, byte[]>, String>() {
  @Override
  public void processElement(ProcessContext c) throws Exception {
    System.out.println("element: " + c.element());
  }
}));

p.run();


Note that this uses Kafka 0.9.
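
If it helps with reproducing, here is a minimal sketch of a plain Kafka 0.9
producer that pushes a few records into the "test" topic used above (the class
name and record contents are just placeholders):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TestProducer {
  public static void main(String[] args) {
    // Point the producer at the same broker as the pipeline above.
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");

    try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
      // One record per iteration; the ParDo above should print each of them.
      for (int i = 0; i < 10; i++) {
        producer.send(new ProducerRecord<>("test", "key-" + i, ("value-" + i).getBytes()));
      }
    }
  }
}

With that running against a local broker, the pipeline should print one
"element: ..." line per record.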

Cheers,
Max



On Mon, May 16, 2016 at 6:48 AM, Frances Perry <[email protected]> wrote:

> This might have been https://issues.apache.org/jira/browse/BEAM-103 ?
>
> But if so, the good news is it was fixed four days ago ;-)
>
> On Mon, May 9, 2016 at 3:11 PM, amir bahmanyari <[email protected]>
> wrote:
>
>> Hi Colleagues,
>> I get this exception at runtime...
>> Any ideas pls?
>> Thanks
>>
>>
>> java.lang.UnsupportedOperationException: The transform KafkaIO.Read
>> [KafkaIO.TypedWithoutMetadata] is currently not supported.
>>         at
>> org.apache.beam.runners.flink.translation.FlinkStreamingPipelineTranslator.visitTransform(FlinkStreamingPipelineTranslator.java:106)
>>
>>
>
