Hi All,

I am trying to perform the operation below.

.dat file (input) --> JSON --> SQL --> SQL Server


GetFile --> SplitText --> SplitText --> ExtractText --> ReplaceText --> ConvertJSONToSQL --> PutSQL

My input file (.dat) has 300,000 rows.
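To make the per-row work concrete, here is a minimal Python sketch of the transformation this flow performs on each line: a delimited .dat row is mapped to a JSON object (the ExtractText/ReplaceText step) and then to a parameterized INSERT (the ConvertJSONToSQL step). The delimiter, column names, and table name are assumptions for illustration only, not taken from my actual data.

```python
import json

# Assumed schema -- placeholder column names, not the real .dat layout.
COLUMNS = ["id", "name", "amount"]


def row_to_json(line, delimiter="|"):
    """One delimited row -> JSON object (what ExtractText + ReplaceText do)."""
    values = line.rstrip("\n").split(delimiter)
    return json.dumps(dict(zip(COLUMNS, values)))


def json_to_sql(json_doc, table="my_table"):
    """JSON object -> parameterized INSERT (what ConvertJSONToSQL does)."""
    record = json.loads(json_doc)
    cols = ", ".join(record)            # insertion order is preserved
    params = ", ".join("?" for _ in record)
    sql = f"INSERT INTO {table} ({cols}) VALUES ({params})"
    return sql, list(record.values())


doc = row_to_json("42|alice|9.50")
sql, args = json_to_sql(doc)
print(sql)   # INSERT INTO my_table (id, name, amount) VALUES (?, ?, ?)
print(args)  # ['42', 'alice', '9.50']
```

Each of the 300,000 rows goes through this regex-extract, JSON-build, and SQL-build sequence as a separate flowfile, which is why the per-row overhead dominates the total run time.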

*Objective:* Move the data from the '.dat' file into SQL Server.

I am able to store the data in SQL Server using the combination of processors
above, but it takes almost 4-5 hours to move the complete data set into SQL Server.

The combination of SplitText processors reads the data quickly, but ExtractText
takes a long time to match the data against the user-defined expression. Even
when the input is 107 MB, it sends its output only in KB-sized flowfiles, and
the ReplaceText processor likewise processes data only in KB-sized chunks.

Because of this slow processing, loading the data into SQL Server takes much
longer than it should.


ExtractText, ReplaceText, and ConvertJSONToSQL all send their outgoing flowfiles
in kilobytes only.

If I assign more concurrent tasks to ExtractText, ReplaceText, and
ConvertJSONToSQL, they drive CPU and disk usage to 100%.

It is only 30 MB of data, yet the processors take 6 hours to move it into
SQL Server.

The problems I am facing are:


   1.        It takes almost 6 hours to move the 300,000 rows into SQL Server.
   2.        ExtractText and ReplaceText take a long time to process the data
   (they send output flowfiles in KB sizes only).

Can anyone help me solve the following *requirement*?

I need to reduce the time the processors take to move these hundreds of
thousands of rows into SQL Server.



If I am doing anything wrong, please help me get it right.
