Hello Bryan,

In my case the schemas are not compatible. In schema A I have a map type, and I want to flatten this map. I also want to cast a string to an integer.
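
To make that concrete, the transformation is roughly the following, sketched in Python with invented field names (the real schemas are different):

    # Incoming record (schema A): a map field plus a numeric value
    # stored as a string. Field names are made up for illustration.
    record_a = {
        "id": "42",                                    # string in schema A
        "attributes": {"color": "red", "size": "XL"},  # map type in schema A
    }

    # Target record (schema B): flat, with a real integer.
    record_b = {
        "id": int(record_a["id"]),   # cast string -> integer
        # flatten the map: each entry becomes a top-level field
        **record_a["attributes"],
    }

    print(record_b)  # {'id': 42, 'color': 'red', 'size': 'XL'}
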
Chris

On Wed, Oct 11, 2017 at 5:03 PM, Bryan Bende <[email protected]> wrote:
> Chris,
>
> Any processor that uses a record reader and record writer can
> inherently do schema conversion by using schema A for the reader, and
> schema B for the writer, assuming the schemas are compatible.
>
> Compatible in this sense would mean one of the following...
>
> - Schema B has the same field names as schema A, but with some of the
> field types being different
> - Schema B has a subset of the fields in schema A, possibly some
> changing type as above
> - Schema B has additional fields and they have default values, since
> they won't exist in the records coming from schema A
>
> If you have Avro data in Kafka that already has the schema embedded in
> it, then you can use ConsumeKafkaRecord with an AvroReader and set the
> Schema Access Strategy to "Embedded Avro", and then use an
> AvroRecordSetWriter and set the Schema Access Strategy to one of the
> other options, like Schema Name (which needs a schema registry) or
> Schema Text, which allows you to enter a full schema.
>
> You could also do the same thing anywhere else in your flow using
> ConvertRecord.
>
> Thanks,
>
> Bryan
>
>
> On Wed, Oct 11, 2017 at 7:55 AM, Chris Herssens
> <[email protected]> wrote:
> > Hello All,
> >
> > I would like to convert an Avro schema to another Avro schema. Since
> > NiFi reads the Avro data from Kafka, I can't use the ConvertAvroSchema
> > processor. Which processor can I use?
> > Is it possible to use the ConsumeKafkaRecord processor for that? If yes,
> > how do we specify the dynamic properties? If possible, can you give me
> > an example?
> >
> > Regards,
> >
> > Chris
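
(A concrete illustration of the compatibility rules Bryan lists above, as plain Avro schema resolution. This is a minimal sketch using the Python avro library; the schemas and field names are invented, not the ones from this thread.)

    import io
    import avro.io
    import avro.schema

    # Schema A (writer): what was written to Kafka. Invented for
    # illustration.
    schema_a = avro.schema.parse("""
    {
      "type": "record", "name": "User",
      "fields": [
        {"name": "id", "type": "int"},
        {"name": "name", "type": "string"},
        {"name": "email", "type": "string"}
      ]
    }
    """)

    # Schema B (reader): a subset of A's fields, one type changed
    # (int -> long is a legal Avro promotion), plus a new field with
    # a default value.
    schema_b = avro.schema.parse("""
    {
      "type": "record", "name": "User",
      "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
        {"name": "country", "type": "string", "default": "unknown"}
      ]
    }
    """)

    # Write one record with schema A ...
    buf = io.BytesIO()
    avro.io.DatumWriter(schema_a).write(
        {"id": 1, "name": "Chris", "email": "[email protected]"},
        avro.io.BinaryEncoder(buf))

    # ... and read it back with schema B: "email" is dropped, "id" is
    # promoted to long, and "country" gets its default.
    buf.seek(0)
    print(avro.io.DatumReader(schema_a, schema_b)
              .read(avro.io.BinaryDecoder(buf)))
    # {'id': 1, 'name': 'Chris', 'country': 'unknown'}

NiFi's record readers and writers apply the same kind of coercion between the reader and writer schemas, which is why a pure reader/writer schema swap only works when the schemas line up this way, and why flattening a map needs an actual transformation step rather than just a different writer schema.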
