That's great to hear! I'll give it a try. What about performing
different operations (such as inserts for one set of records, updates for
another set, and deletes for a third) within a single transaction? Is that
possible with PutDatabaseRecord?
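
For illustration, this is the kind of mixed batch I'd like to commit
atomically (table and column names are just examples):

  INSERT INTO customers (id, name) VALUES (1, 'Alice');
  UPDATE customers SET name = 'Bob' WHERE id = 2;
  DELETE FROM customers WHERE id = 3;
  -- all three should succeed or fail together, in one transaction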

On Fri, Feb 10, 2023 at 2:29 PM Matt Burgess <mattyb...@apache.org>
wrote:

> I agree with Chris here about using PutDatabaseRecord instead of the
> Split and PutSQL. PutDatabaseRecord will process all records in a
> FlowFile as a transaction, so in PutDatabaseRecord you can set an
> AvroReader (to read the records coming out of ExecuteSQL) and the
> statement type (such as INSERT) and it will process all those records
> as one transaction.
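>
> For example, with Statement Type set to INSERT and Table Name set to
> "customers", each incoming record gets bound to a parameterized statement
> roughly like this (table and column names are illustrative):
>
>   INSERT INTO customers (id, name) VALUES (?, ?)
>   -- executed once per record; all records in the FlowFile are
>   -- committed as a single transaction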
>
> Regards,
> Matt
>
> On Fri, Feb 10, 2023 at 12:04 PM João Marcos Polo Junior
> <jpolojun...@gmail.com> wrote:
> >
> > Thanks for the reply. I'm afraid this approach does not work with
> transactions. How can I process all records in the same transaction?
> >
> > On Fri, Feb 10, 2023 at 1:05 PM Chris Sampson
> <chris.samp...@naimuri.com> wrote:
> >>
> >> I don't use the database/SQL processors much, but I see questions about
> these on the Apache NiFi Slack channels quite regularly - you might have
> better luck using ExecuteSQLRecord (it can output Avro, JSON, etc. using
> your chosen RecordSetWriter Controller Service) and then feeding that into
> PutDatabaseRecord (it can read Avro, JSON, etc. depending upon your
> configured RecordReader).
> >>
> >> If you want to change the data in between, then consider other
> Record-based processors such as UpdateRecord or QueryRecord.
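> >>
> >> For example, QueryRecord runs SQL against the incoming records (the
> table is always named FLOWFILE; the column names here are invented):
> >>
> >>   SELECT id, UPPER(name) AS name
> >>   FROM FLOWFILE
> >>   WHERE status = 'ACTIVE'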
> >>
> >> On Fri, 10 Feb 2023, 15:57 João Marcos Polo Junior
> <jpolojun...@gmail.com> wrote:
> >>>
> >>> I’m trying to create a flow (NiFi 1.18) that queries a database
> (ExecuteSQL), transforms its records to JSON (ConvertAvroToJSON), splits
> them into individual JSON objects (SplitJson), and then performs the
> necessary actions against another database (PutSQL). All JSON objects
> split from the same original FlowFile need to be processed in a single
> transaction, and for that I’m using PutSQL with Support Fragmented
> Transactions set to true.
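> >>>
> >>> For example, each split ultimately reaches PutSQL as a single statement
> like this one (illustrative; the real statements vary per record):
> >>>
> >>>   UPDATE accounts SET balance = balance - 10 WHERE id = 42;
> >>>   -- every split from the same source FlowFile shares the same
> >>>   -- fragment.identifier attribute, which PutSQL uses to group the
> >>>   -- splits into one transaction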
> >>>
> >>> First problem: I can’t set the Transaction Timeout to more than “30
> sec”, because my FlowFiles (waiting in the upstream queue) never get
> processed and never reach the failure connection. They stay stuck in the
> upstream connection and get penalized, but they are never processed or
> routed to failure once the timeout (more than 30 sec) expires.
> >>>
> >>>
> >>>
> >>> Second problem: I want to combine the Transaction Timeout property
> with the Penalty Duration, Yield Duration, or maybe the Run Schedule, but
> that’s not working either.
> >>>
> >>> Is there a solution for these problems? Is there something I have to
> configure in the DBCPConnectionPool for this to work?
> >>>
> >>> Here’s a similar problem in version 1.12:
> https://issues.apache.org/jira/browse/NIFI-8733
> >>>
> >>>
> >>>
> >>> Thanks in advance!
> >>>
> >>>
>
