Setting max rows is the main way to get around these sorts of problems. Aside from the non-standardness of using an offset, the only time I've really seen a good potential use for it is in list pagination, but even then, if a user is attempting to view page 1000 then I think it's more of a UI issue than anything else. IMO if the user can quickly view any of the first 50 or so pages then that is perfectly fine.
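Scott's point above (cap how deep a user can page rather than supporting arbitrary offsets) can be sketched roughly as follows. This is a minimal illustration, not OFBiz code; the class name, method name, and the cap of 50 pages are all hypothetical:

```java
public class PageClamp {
    // Hypothetical helper: clamp a requested page number to a sane window,
    // so a request for "page 1000" is treated as a UI concern rather than
    // something the query layer must support. Pages are 1-based.
    public static int clampPage(int requestedPage, int maxPage) {
        if (requestedPage < 1) {
            return 1; // nonsense input falls back to the first page
        }
        return Math.min(requestedPage, maxPage);
    }

    public static void main(String[] args) {
        // A request far beyond the allowed window is capped at the last page.
        System.out.println(clampPage(1000, 50)); // prints 50
    }
}
```

Combined with a max-rows limit on the query itself, this keeps the result set bounded no matter what the client asks for.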
Regards
Scott

On 11/11/2009, at 9:41 PM, ian tabangay wrote:
Hi Scott,

Actually, my main concern is more on the processing of a document. This usually involves a lot of inserts and updates at the very least. I think the bottleneck is more in the database inserts/updates. Indeed, scaling up our hardware (and a lot of database load balancing/clustering and indexes) could help speed up these processes.

As for querying a large set of result sets, we had problems dealing with the current implementation of EntityListIterator. There was a linear time difference for requesting a small number of result sets as the total number of rows grew into the hundreds of thousands. We modified the SQL processor to use LIMIT and OFFSET to make requesting time constant regardless of the number of rows the entity would have. The limitation of using a "non-standard" SQL command was acceptable for us since we wouldn't be changing our databases anyway.

--- Ian Tabangay

On Wed, Nov 11, 2009 at 4:02 PM, Scott Gray <scott.g...@hotwaxmedia.com> wrote:

Hi Ian,

I didn't say OFBiz wasn't intended to handle this sort of load, I just said that the purchase order code may not have been optimized for it. I have no doubt that OFBiz can handle what you're describing, but you may find portions of the code could be improved to speed things up a bit, like using the cache where appropriate, using the entity iterator rather than retrieving full lists for large result sets, improving queries to make better use of indexes, etc. After that it really just comes down to your hardware.

Regards
Scott