>We currently process each rule in its own transaction, so that we can commit
>or roll back depending on the returned values from the other statements. We
>cannot process all the rules in one transaction (unless we use savepoints?).
>Also, the inventory changes constantly, so getting a snapshot at start time
>will not make sense by the time we get to rule 5000+.
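On the savepoint question: savepoints do let you keep one enclosing transaction while still undoing an individual rule whose returned values say it should not apply. A minimal sketch of that pattern, using Python's sqlite3 purely as a stand-in engine (the rule functions and table are hypothetical; your real rules and database will differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # take manual control of transactions
conn.execute("CREATE TABLE inventory (item TEXT PRIMARY KEY, qty INTEGER)")
conn.execute("INSERT INTO inventory VALUES ('widget', 10)")

# Hypothetical rules: each returns True to keep its changes, False to undo them.
def rule_ok(c):
    c.execute("UPDATE inventory SET qty = qty - 1 WHERE item = 'widget'")
    return True

def rule_bad(c):
    c.execute("UPDATE inventory SET qty = qty - 100 WHERE item = 'widget'")
    return False  # e.g. a check on the returned values failed

conn.execute("BEGIN")
for i, rule in enumerate([rule_ok, rule_bad, rule_ok]):
    conn.execute(f"SAVEPOINT rule_{i}")
    if rule(conn):
        conn.execute(f"RELEASE SAVEPOINT rule_{i}")      # keep this rule's work
    else:
        conn.execute(f"ROLLBACK TO SAVEPOINT rule_{i}")  # undo only this rule
conn.execute("COMMIT")

print(conn.execute("SELECT qty FROM inventory").fetchone()[0])  # 8
```

Only the rolled-back rule's update disappears; the two good rules commit together at the end.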

Are the data sensible after, say, 3500 rules have run? If each rule is a
separate unit of work and you do not require all rules to be run for the data
to be ready for further use, then using many transactions is understandable.
On the other hand, if all 6000 rules must be applied before the data are
ready, then I'd say put all rules within one common transaction. Of course,
it is more likely that you have some rules that don't make sense unless
grouped with other rules, and that there are natural points within your rule
system where a break makes sense, so the most sensible thing could well be to
reduce the number of transactions without putting all rules within the same
one.
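That middle ground can be as simple as committing once per group of rules instead of once per rule. A rough sketch, again with sqlite3 standing in for the real database and a made-up batch size (the `apply_in_batches` name and fixed-size grouping are my invention; your natural break points in the rule system would define the groups instead):

```python
import sqlite3

def apply_in_batches(conn, rules, batch_size=100):
    """Apply rules in groups, one transaction per group."""
    for start in range(0, len(rules), batch_size):
        # sqlite3's connection context manager commits on success
        # and rolls back the whole group on error.
        with conn:
            for rule in rules[start:start + batch_size]:
                rule(conn)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
# Ten trivial stand-in rules, each inserting one row.
rules = [lambda c, i=i: c.execute("INSERT INTO t VALUES (?)", (i,))
         for i in range(10)]
apply_in_batches(conn, rules, batch_size=3)
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 10
```

With 6000 rules and a batch size of 100 you'd get 60 commits instead of 6000, while a failure only costs you one group's work.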

Typically, a transaction sees all changes done within itself, but not changes
done in other, concurrent transactions. I think that putting more rules
within the same transaction could make deadlocks more likely to occur. Of
course, deadlocks do not occur unless there are simultaneous transactions
that update the same records in a different order, so I'm far from saying
that this will happen, but it is worth thinking about when grouping rules
into transactions.
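The "different order" condition is the key: if every transaction acquires its locks in one agreed global order, the deadlock cycle cannot form. A toy illustration with Python threads and plain locks (my own sketch, not your rule system; `update_both` and the ordering-by-id trick are just stand-ins for "lock records in a consistent order"):

```python
import threading

lock_a, lock_b = threading.Lock(), threading.Lock()
finished = []

def update_both(first, second):
    # Sort by a global key so both threads lock in the same order,
    # regardless of the order the arguments were passed in.
    low, high = sorted((first, second), key=id)
    with low, high:
        pass  # ...update both records here...
    finished.append(1)

# Without the sort, these two calls request the locks in opposite
# orders and could deadlock; with it, they cannot.
t1 = threading.Thread(target=update_both, args=(lock_a, lock_b))
t2 = threading.Thread(target=update_both, args=(lock_b, lock_a))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(finished))  # 2: both workers completed
```

The same idea applies to grouped rules: if each group touches its records in a consistent order (say, by primary key), making the groups bigger need not make deadlocks more likely.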

Set
