Thanks, Matt, for the reply. Is there any other way to read data from a Hive table in batches? My table has millions of rows and I am using NiFi 1.3.
I tried the following processors:

1. ExecuteHQL (a custom Kylo processor) -- sends a single file downstream.
2. SelectHiveQL -- the "Max Rows Per Flow File" property is not supported in NiFi 1.3.0.
3. GenerateTableFetch and QueryDatabaseTable.

Sent from: http://apache-nifi-developer-list.39713.n7.nabble.com/
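For context, the batching behaviour I am trying to emulate is roughly the following: issue one paged HiveQL statement per batch instead of one query returning everything. This is only a sketch; the table name, ordering column, and row counts are placeholders, and the `LIMIT offset, rows` syntax assumes a Hive version that supports it (Hive 2.0+).

```python
# Sketch: generate paged HiveQL statements, one per batch of rows.
# Table/column names and counts are placeholders, not my real schema.
def paged_queries(table, order_col, total_rows, batch_size):
    """Yield one LIMIT offset, rows query per batch (Hive 2.0+ syntax)."""
    for offset in range(0, total_rows, batch_size):
        yield (f"SELECT * FROM {table} ORDER BY {order_col} "
               f"LIMIT {offset}, {batch_size}")

# Example: 1M rows in 10k-row batches -> 100 queries.
queries = list(paged_queries("customer", "id", 1_000_000, 10_000))
```

Each generated statement could then be fed to a HiveQL-executing processor so every flow file carries at most one batch of rows.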