Hello,

Did you try setting 'Max Rows Per Flow File' on the ExecuteSQL processor?
If the OOM happens because NiFi writes the entire result set into a single
FlowFile, that property can help by breaking the result set into several
FlowFiles.
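
As a rough sketch, the configuration would look something like this. The
query and the row counts are placeholders to tune for your data, and
'Output Batch Size' only exists in newer NiFi versions:

  ExecuteSQL
    SQL select query       : SELECT ...   (your DB2 query)
    Max Rows Per Flow File : 10000        (rows per output FlowFile)
    Output Batch Size      : 10           (optional; transfers FlowFiles
                                           downstream in batches instead of
                                           holding the whole result set)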

Thanks,
Koji
On Fri, Sep 21, 2018 at 3:56 PM Dnyaneshwar Pawar
<dnyaneshwar_pa...@persistent.com> wrote:
>
> Hi,
>
> How can we execute/process high-volume data with the ExecuteSQL processor?
>
> We tried to execute a query against a DB2 database that returns around 10
> lakh (1 million) records. While executing this query we get an OutOfMemory
> error, and the request (FlowFile) gets stuck in the queue. When we restart
> NiFi it is still stuck in the queue, and as soon as NiFi starts we get the
> same error again because the FlowFile is still queued. Is there any way to
> configure a retry for the queue (the connection to the second processor)?
>
> We also tried changing the FlowFile repository property in nifi.properties
> (nifi.flowfile.repository.implementation) to
> 'org.apache.nifi.controller.repository.VolatileFlowFileRepository'.
>
> This removes the queued FlowFile when NiFi restarts, but it carries a risk
> of data loss for our other flows in the event of a power or machine failure.
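>
> For reference, the change in nifi.properties looks like the following; the
> commented write-ahead line is the default implementation, shown only for
> comparison:
>
>   # default, durable implementation:
>   # nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.WriteAheadFlowFileRepository
>   # volatile, in-memory implementation we tried:
>   nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.VolatileFlowFileRepository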
>
> So please suggest how we can execute a high-volume data query, or whether
> any retry mechanism is available for queued FlowFiles.
>
> Regards,
>
> Dnyaneshwar Pawar
