We have a system that constantly imports flat-file data feeds into
normalized tables in a DB warehouse over 10-20 connections. Each feed
row results in a single transaction made up of multiple single-row
writes to multiple normalized tables.
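
Concretely, each row import looks roughly like the sketch below (a toy
stand-in using SQLite, with made-up table and column names; the real
system is the warehouse over many connections):

    import sqlite3

    # Toy stand-in for the warehouse; table/column names are made up.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (feed_id INTEGER, name TEXT);
        CREATE TABLE addresses (feed_id INTEGER, city TEXT);
        CREATE TABLE orders    (feed_id INTEGER, amount REAL);
    """)

    def import_feed_row(feed_id, row):
        # One transaction wrapping multiple single-row writes: the more
        # columns the feed row has, the more of these INSERTs there are
        # and the longer the transaction stays open.
        with conn:  # BEGIN ... COMMIT (ROLLBACK on error)
            conn.execute("INSERT INTO customers VALUES (?, ?)",
                         (feed_id, row["name"]))
            conn.execute("INSERT INTO addresses VALUES (?, ?)",
                         (feed_id, row["city"]))
            conn.execute("INSERT INTO orders VALUES (?, ?)",
                         (feed_id, row["amount"]))

    import_feed_row(1, {"name": "Acme", "city": "Boston", "amount": 19.99})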


The more columns in the feed row, the more write operations and the
longer the transaction.


Operators are noticing that splitting a single feed of, say, 100 columns
into two consecutive feeds of 50 columns improves performance
dramatically. I am wondering whether the multi-threaded and very busy
import environment causes non-linear performance degradation for longer
transactions. Would the operators be well advised to rewrite the feeds
so they produce more, smaller transactions rather than fewer, longer
ones?
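
To make the two shapes concrete, here is the same kind of toy sketch
(SQLite stand-in, hypothetical names; in the real system each "column
write" is a single-row write to one of the normalized tables):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE feed_data (col_name TEXT, col_value TEXT)")

    def write_columns(conn, row, columns):
        # One transaction containing one single-row write per column,
        # standing in for the per-table writes of the real importer.
        with conn:
            for col in columns:
                conn.execute("INSERT INTO feed_data VALUES (?, ?)",
                             (col, row[col]))

    row = {"col%d" % i: str(i) for i in range(100)}
    cols = list(row)

    # Current shape: one long transaction covering all 100 columns.
    write_columns(conn, row, cols)

    # Operators' variant: two consecutive ~50-column transactions, each
    # holding its locks for roughly half as long, which is the effect I
    # suspect matters under 10-20 concurrent import connections.
    write_columns(conn, row, cols[:50])
    write_columns(conn, row, cols[50:])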


Carlo
