On 2/20/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> Yes, you do -- and guess how enterprise-class databases do it . . . Those
> suckers are *seriously* optimized, particularly for set operations. You could
> hand-code weather simulations so that they are faster than an equivalent
> system coded in a database because the data is highly localized (and I'm not
> talking about things like massive Fourier transforms), but there isn't
> anything that handles huge quantities of widely and wildly inter-related data
> better than a decent database. With a database, you don't have to think about
> memory (and swapping, etc.) -- it's handled. When you write your own system,
> you end up bleeding serious quantities of time and then end up on the rocks
> when you have to move from 32 to 64 bits (right, Ben?).
I am thinking about things like massive Fourier transforms, so maybe that's where we differ. Specifically, anything aiming in the direction of AGI will have to perform, among other things:

- exaflop-range number-crunching runs for physical/spatial simulation, image analysis, etc.
- searches in spaces like "all possible S-expressions of less than 100 nodes with this set of operators", looking for a function that satisfies some criteria

For that sort of work, if your working set doesn't fit in RAM it'll grind to a halt anyway, so you'd better assume it does; and databases aren't optimized for heavy number crunching, unless I'm missing something. What sort of computation were you thinking of?

Also, why would moving from 32 to 64 bits be a problem, provided you planned for it in advance?
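The second kind of workload above, searching a space of small S-expressions for a function satisfying some criterion, can be sketched concretely. The following is a minimal illustrative sketch, not anything from the thread: the operator set, the terminals, and the toy "matches a target function on test points" criterion are all my own assumptions. It enumerates every S-expression up to a node budget, smallest first, and returns the first match. Even at this toy scale the expression count explodes combinatorially with the node budget, which is why such searches are CPU-bound rather than something a database engine helps with.

```python
# Illustrative sketch: brute-force search over small S-expressions.
# OPS and TERMS are hypothetical choices; real systems would use a
# richer operator set and far smarter enumeration than this.
OPS = {"+": lambda a, b: a + b,
       "*": lambda a, b: a * b,
       "-": lambda a, b: a - b}
TERMS = ["x", 1, 2]  # "x" is the single input variable

def gen_exprs(n):
    """Yield every S-expression with exactly n nodes (all ops binary)."""
    if n == 1:
        yield from TERMS
        return
    # The root op costs 1 node; split the remaining n-1 between subtrees.
    for left in range(1, n - 1):
        right = n - 1 - left
        for op in OPS:
            for l in gen_exprs(left):
                for r in gen_exprs(right):
                    yield (op, l, r)

def evaluate(expr, x):
    """Evaluate an S-expression at input value x."""
    if expr == "x":
        return x
    if isinstance(expr, (int, float)):
        return expr
    op, l, r = expr
    return OPS[op](evaluate(l, x), evaluate(r, x))

def search(target, max_nodes):
    """Return the smallest expression agreeing with target on test points."""
    points = range(-3, 4)
    for n in range(1, max_nodes + 1):
        for e in gen_exprs(n):
            if all(evaluate(e, x) == target(x) for x in points):
                return e
    return None

# Example: search for an expression equivalent to x*x + 1.
found = search(lambda x: x * x + 1, 7)
```

Note that the search returns a 5-node solution such as `('+', 1, ('*', 'x', 'x'))` almost instantly here, but the number of trees grows roughly like the Catalan numbers times the operator/terminal labelings, so pushing the budget toward the 100-node scale mentioned above makes exhaustive enumeration hopeless and the per-candidate evaluation cost dominant.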