Brian, this is a great question. The answer is: absolutely, you can. You do so by implementing a lower-level interface; I can send you more on that later if needed. For an example of a processor that uses this approach, look at MergeContent.
You will be taking control of session creation and commit. What you will do is have one session for the original object and a series of sessions, as needed, for your resulting items. Keep in mind, of course, that if you roll back the original session, these other sessions are not affected. In short, you can leave the cozy, protected confines of the single-session model and do some powerful stuff.

Have a great Mother's Day all...

Joe

On May 10, 2015 8:00 AM, "Brian Ghigiarelli" <[email protected]> wrote:

> I have a processor that reads in a large FlowFile and outputs multiple
> FlowFiles from its contents (similar to SplitText). Is there any good way
> for me to periodically transfer and commit batches of transferred FlowFiles
> while continuing to read and process the original input FlowFile?
>
> At the moment, the session commit throws an exception based on the
> transactionality of the session that recognizes that the original FlowFile
> hasn't been removed or transferred.
>
> My goal for this is speed of getting data out of the FlowFile and sending
> it on to the rest of our flow as quickly as possible, instead of processing
> the whole thing (and handling any errors along the way) before
> transferring.
>
> Thanks,
>
> --
> Brian Ghigiarelli
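[Editor's note: a minimal sketch of the multi-session approach described above, assuming the standard NiFi API (`AbstractSessionFactoryProcessor`, `ProcessSessionFactory`). The relationship names and the `moreChunks`/`writeNextChunk` helpers are illustrative, not part of any real processor, and provenance/error handling is simplified compared to what MergeContent actually does.]

```java
import java.io.InputStream;

import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractSessionFactoryProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.ProcessSessionFactory;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

// Sketch only. Extending AbstractSessionFactoryProcessor (rather than
// AbstractProcessor) exposes the lower-level onTrigger that receives a
// ProcessSessionFactory, so the processor controls session creation and
// commit itself.
public class SplitAndStreamSketch extends AbstractSessionFactoryProcessor {

    static final Relationship REL_SPLITS =
            new Relationship.Builder().name("splits").build();
    static final Relationship REL_ORIGINAL =
            new Relationship.Builder().name("original").build();

    @Override
    public void onTrigger(final ProcessContext context,
                          final ProcessSessionFactory sessionFactory)
            throws ProcessException {
        // One session owns the original (input) FlowFile for the whole read.
        final ProcessSession originalSession = sessionFactory.createSession();
        final FlowFile original = originalSession.get();
        if (original == null) {
            return;
        }

        try {
            originalSession.read(original, (final InputStream in) -> {
                // Hypothetical batching loop: as each batch is parsed from
                // the input stream, emit it in its own short-lived session so
                // downstream processors receive data immediately.
                while (moreChunks(in)) { // illustrative helper
                    final ProcessSession batchSession = sessionFactory.createSession();
                    FlowFile split = batchSession.create();
                    split = batchSession.write(split, out -> writeNextChunk(in, out));
                    batchSession.transfer(split, REL_SPLITS);
                    batchSession.commit(); // committed independently of the original
                }
            });
            originalSession.transfer(original, REL_ORIGINAL);
            originalSession.commit();
        } catch (final Exception e) {
            // As noted above: rolling back the original session does NOT
            // undo batches already committed by the per-batch sessions.
            originalSession.rollback();
            throw new ProcessException(e);
        }
    }
}
```

The trade-off this buys is exactly the one Brian asked about: splits reach the rest of the flow while the large input is still being read, at the cost of giving up the single-session guarantee that everything succeeds or fails together.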
