[ https://issues.apache.org/jira/browse/NIFI-3739?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990955#comment-15990955 ]

Mark Payne commented on NIFI-3739:
----------------------------------

I've pushed a new commit that I believe handles the cases outlined above. To 
test the publish side, I used a schema that marks a field as non-nullable, 
even though I know that many of the records don't contain that field, and then 
attempted to write the data as Avro in PublishKafkaRecord. This routed to 
'failure' as expected (before the patch, I confirmed the previous behavior: 
the session was rolled back because the old version of the FlowFile was 
referenced). On the consume side, I attempted to parse JSON data as Avro and 
verified that the data went to the 'parse.failure' relationship.
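
As a reference for the publish-side scenario, here is a minimal standalone 
sketch using plain Avro outside NiFi (the "User" and "name" names are 
illustrative, not from the ticket) of how a record that omits a non-nullable 
field fails schema validation:

{code:java}
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class NonNullableFieldSketch {
    public static void main(String[] args) {
        // Schema with a required (non-nullable) "name" field and an
        // optional "age" field.
        Schema schema = SchemaBuilder.record("User")
                .fields()
                .requiredString("name")
                .optionalInt("age")
                .endRecord();

        // A record that, like many of the records described above,
        // does not contain the non-nullable field.
        GenericRecord record = new GenericData.Record(schema);
        record.put("age", 42);

        // validate() returns false because "name" is missing; an
        // analogous mismatch is what routes the FlowFile to 'failure'
        // in PublishKafkaRecord.
        boolean valid = GenericData.get().validate(schema, record);
        System.out.println("record valid: " + valid); // prints: record valid: false
    }
}
{code}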

> Create Processors for publishing records to and consuming records from Kafka
> ----------------------------------------------------------------------------
>
>                 Key: NIFI-3739
>                 URL: https://issues.apache.org/jira/browse/NIFI-3739
>             Project: Apache NiFi
>          Issue Type: New Feature
>          Components: Extensions
>            Reporter: Mark Payne
>            Assignee: Mark Payne
>             Fix For: 1.2.0
>
>
> With the record readers & writers that have now been added, it would be 
> good to allow records to be pushed to and pulled from Kafka. Currently, we 
> support demarcated data, but sometimes we can't correctly demarcate data in a 
> way that keeps the format valid (JSON is a good example; see the sketch 
> below). We should have processors that use the record readers and writers for 
> this.
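
To make the demarcation problem named in the quoted description concrete, 
here is a minimal sketch using Jackson (an illustrative choice, not part of 
the ticket): two JSON records joined by a newline demarcator do not form a 
single valid JSON document, while a record-aware writer can emit a 
well-formed array instead.

{code:java}
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DemarcationSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_TRAILING_TOKENS);

        // Two records joined by a newline demarcator: not valid as a
        // single JSON document, so a strict parse of the payload fails.
        String demarcated = "{\"id\":1}\n{\"id\":2}";
        try {
            mapper.readValue(demarcated, JsonNode.class);
        } catch (Exception e) {
            System.out.println("not one valid document: " + e.getMessage());
        }

        // A record writer can instead emit the records as one valid array.
        String asArray = "[{\"id\":1},{\"id\":2}]";
        JsonNode array = mapper.readValue(asArray, JsonNode.class);
        System.out.println("records: " + array.size()); // prints: records: 2
    }
}
{code}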



