> challenging things such as feeding Kafka and maybe also
> Hadoop. So I am experimenting a lot and want to find the best possible
> setup.
>
> Greetings and thanks again.
>
> Uwe
>
>
> Sent: Tuesday, 12 September 2017 at 03:05
> From: "Koji Kawamura"
> To: users@nifi.apache.org
> Subject: Re: QueryDatabaseTable - Schema
> Hi Uwe,
> I had a similar expectation when I was using QueryDatabaseTable or any
> other processor creating an Avro FlowFile with its schema embedded,
> combined with the new record reader/writer controllers.
One thing about its limitations had to do with timing: the record-aware
capabilities came after QDT. It would be great to have QDT use a record
writer; then, depending on the writer, you could choose your schema output
strategy as Koji outlined.
I'm not sure if there is a JIRA for this or not (or any o
Hi Uwe,
I had a similar expectation when I was using QueryDatabaseTable or any
other processor creating an Avro FlowFile with its schema embedded,
combined with the new record reader/writer controllers.
Now, NiFi has an "Inherit Record Schema" option as the "Schema Access
Strategy" of a RecordWriter, already me
Hello,
I was wondering: if the QueryDatabaseTable processor internally creates an Avro schema, why is that schema not available as an attribute or saved to the registry?
If it were, one could reuse the schema, e.g. if I use the ConvertRecord processor and specify an AvroReader as
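For what it's worth, the schema QueryDatabaseTable derives is ordinary Avro schema JSON, so the same text could be supplied to a record reader or registered in a schema registry rather than being rediscovered on every conversion. The sketch below, again using the fastavro Python library and an entirely made-up table layout (a "users" record with id, name and created_at fields), only shows what such reusable schema text looks like and checks that it parses.

import json

from fastavro import parse_schema  # assumption: fastavro is installed

# Hypothetical schema text, similar in shape to what QueryDatabaseTable
# derives from a relational table; the record and field names are made up.
SCHEMA_TEXT = json.dumps({
    "type": "record",
    "name": "users",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": ["null", "string"], "default": None},
        {"name": "created_at", "type": ["null", "string"], "default": None},
    ],
})

# parse_schema raises an error if the text is not a valid Avro schema,
# so this doubles as a quick validation before reusing the schema elsewhere.
parse_schema(json.loads(SCHEMA_TEXT))
print(SCHEMA_TEXT)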