Kafka Avro Schema Registry Support

2018-09-27 Thread rahul patwari
Hi, We have a use case to read data from Kafka that was serialized with KafkaAvroSerializer, with the schema stored in Schema Registry. When we try to set the value deserializer to io.confluent.kafka.serializers.KafkaAvroDeserializer to get a GenericRecord, we are seeing errors. Does KafkaIO.read() support

Re: Kafka Avro Schema Registry Support

2018-09-27 Thread Raghu Angadi
You can set key/value deserializers: https://github.com/apache/beam/blob/master/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java#L101 What are the errors you see? Also note that Beam includes AvroCoder for handling Avro records. On Thu, Sep 27, 2018 at 6:05 AM r
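For reference, a minimal sketch of how key/value deserializers plug into KafkaIO.read() in the Java SDK; the broker address and topic name are placeholders:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.KafkaRecord;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // The deserializer classes must line up with the key/value type parameters of read().
        PCollection<KafkaRecord<Long, String>> records =
            p.apply(
                KafkaIO.<Long, String>read()
                    .withBootstrapServers("broker:9092") // placeholder
                    .withTopic("my_topic")               // placeholder
                    .withKeyDeserializer(LongDeserializer.class)
                    .withValueDeserializer(StringDeserializer.class));

        p.run().waitUntilFinish();
      }
    }

Matching the deserializer classes to the read() type parameters is exactly where the Schema Registry question below runs into trouble.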

Re: Modular IO presentation at Apachecon

2018-09-27 Thread Pablo Estrada
I'll take this chance to plug my little directory of Beam tools/materials: https://github.com/pabloem/awesome-beam Please feel free to send PRs : ) On Wed, Sep 26, 2018 at 10:29 PM Ankur Goenka wrote: > Thanks for sharing. Great slides and looking forward to the recorded session. > > Do we have a c

Re: Kafka Avro Schema Registry Support

2018-09-27 Thread Vishwas Bm
Hi Raghu, The deserializer is provided by the Confluent *io.confluent.kafka.serializers* package. When we set the value deserializer to KafkaAvroDeserializer, we get the error below: The method withValueDeserializer(Class>) in the type KafkaIO.Read is not applicable for the arguments (Class) >F

Re: Modular IO presentation at Apachecon

2018-09-27 Thread Eugene Kirpichov
Thanks Ismael and everyone else! Unfortunately I do not believe that this session was recorded on video :( Juan - yes, this is some of the important future work, and I think it's not hard to add to many connectors; contributions would be welcome. In terms of a "per-key" Wait transform, yeah, that d

Re: Advice for piping many CSVs with different column names to one BigQuery table

2018-09-27 Thread OrielResearch Eila Arich-Landkof
Thank you! Probably around 50. Best, Eila On Thu, Sep 27, 2018 at 1:23 AM Ankur Goenka wrote: > Hi Eila, > > That seems reasonable to me. > > Here is a reference on writing to BQ > https://github.com/apache/beam/blob/1ffba44f7459307f5a134b8f4ea47ddc5ca8affc/sdks/python/apache_beam/examples/comp

Re: Kafka Avro Schema Registry Support

2018-09-27 Thread Raghu Angadi
It is a compilation error due to a type mismatch on the value type. Please match the key and value types for the KafkaIO reader, i.e. if you have KafkaIO.read(), 'withValueDeserializer()' needs a class object which extends 'Deserializer'. Since KafkaAvroDeserializer extends 'Deserializer', your ValueType
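For later readers, a hedged sketch of one possible workaround, not the resolution confirmed in this thread: because KafkaAvroDeserializer implements Deserializer<Object> rather than Deserializer<GenericRecord>, the class reference can be cast (unchecked) and paired with an explicit AvroCoder via withValueDeserializerAndCoder. The broker, topic, Schema Registry URL and inline Avro schema are all placeholders:

    import io.confluent.kafka.serializers.KafkaAvroDeserializer;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.AvroCoder;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.KafkaRecord;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.Deserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaAvroReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Value schema; in practice this would come from your .avsc file or the registry.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"MyRecord\","
                + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

        // KafkaAvroDeserializer implements Deserializer<Object>, so an unchecked cast is
        // needed to satisfy the Deserializer<GenericRecord> bound on the value type.
        @SuppressWarnings({"unchecked", "rawtypes"})
        Class<? extends Deserializer<GenericRecord>> valueDeserializer =
            (Class) KafkaAvroDeserializer.class;

        // KafkaAvroDeserializer reads "schema.registry.url" from the consumer config.
        Map<String, Object> consumerProps = new HashMap<>();
        consumerProps.put("schema.registry.url", "http://registry:8081"); // placeholder

        PCollection<KafkaRecord<String, GenericRecord>> records =
            p.apply(
                KafkaIO.<String, GenericRecord>read()
                    .withBootstrapServers("broker:9092") // placeholder
                    .withTopic("avro_topic")             // placeholder
                    .withKeyDeserializer(StringDeserializer.class)
                    .withValueDeserializerAndCoder(valueDeserializer, AvroCoder.of(schema))
                    .updateConsumerProperties(consumerProps));

        p.run().waitUntilFinish();
      }
    }

The unchecked cast is the crux: it tells the compiler to treat the Deserializer<Object> class as a Deserializer<GenericRecord>, and the explicit AvroCoder gives Beam a coder for the resulting GenericRecord values.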

Agenda for the Beam Summit London 2018

2018-09-27 Thread Griselda Cuevas
Hi Beam Community, We have finalized the agenda for the Beam Summit London 2018; it's here: https://www.linkedin.com/feed/update/urn:li:activity:6450125487321735168/ We had a great number of talk proposals, thank you so much to everyone who submitted one! We also sold out the event, so we're ver

Re: Agenda for the Beam Summit London 2018

2018-09-27 Thread Jean-Baptiste Onofré
Great!! Thanks Gris. Looking forward to seeing you all next Monday in London. Regards JB On 27 Sept. 2018 at 18:03, Griselda Cuevas wrote: >Hi Beam Community, > >We have finalized the agenda for the Beam Summit London 2018, it's >here: >https://www.linkedin.com/feed/update/urn:li:activi

Re: Agenda for the Beam Summit London 2018

2018-09-27 Thread Pablo Estrada
Very exciting. I will have to miss it, but I'm excited to see what comes out of it :) Thanks to Gris, Matthias and the other organizers. Best -P. On Thu, Sep 27, 2018, 4:26 PM Jean-Baptiste Onofré wrote: > Great !! Thanks Gris. > > Looking forward to see you all next Monday in London. > > Regards > >

Re: Agenda for the Beam Summit London 2018

2018-09-27 Thread Andrew Psaltis
This is great. Any chance it will be recorded, or at a minimum the slides made available afterwards? Unfortunately, I won't be able to make it to London next week. Best, Andrew On Fri, Sep 28, 2018 at 10:11 AM Pablo Estrada wrote: > Very exciting. I will have to miss it, but I'm excited to see what c