>> I am thinking about things like massive Fourier transforms, so maybe that's 
>> where we differ. Specifically, anything aiming in the direction of AGI will 
>> have to perform, among other things: 

Ah.  Well . . . . if that is the direction that your AGI is taking, then you 
should ignore my speechifying . . . .  :-)

Though, if you're talking about searches like "all *previously generated* 
S-expressions of less than 100 nodes with this set of operators", I'll bet that 
my enterprise database will scale further and be faster than your custom 
solution.
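To make that concrete, here's a toy sketch of what I mean (all table and column names invented; sqlite3 standing in for a real enterprise engine): if you index previously generated S-expressions by node count and operator signature at insert time, that search becomes an indexed range scan instead of a re-enumeration.

```python
import sqlite3

# Toy sketch: previously generated S-expressions, indexed by size and by a
# canonical operator signature, so "all stored expressions under 100 nodes
# using operators {+, *}" is a single indexed query.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sexpr (
    id INTEGER PRIMARY KEY,
    body TEXT,          -- the serialized S-expression
    node_count INTEGER, -- computed once, at insert time
    op_sig TEXT         -- sorted, space-joined operator set
)""")
conn.execute("CREATE INDEX idx_ops_size ON sexpr (op_sig, node_count)")

rows = [
    ("(+ x (* y y))", 5, "* +"),
    ("(+ x 1)",       3, "+"),
    ("(sin (* x x))", 4, "* sin"),
]
conn.executemany(
    "INSERT INTO sexpr (body, node_count, op_sig) VALUES (?, ?, ?)", rows)

hits = conn.execute(
    "SELECT body FROM sexpr WHERE op_sig = ? AND node_count < ?",
    ("* +", 100)).fetchall()
print([b for (b,) in hits])
```

The point isn't this particular schema -- it's that the database does the memory management, indexing, and scan optimization for you.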

>> exaflop-range number-crunching runs for physical/spatial simulation, image 
>> analysis etc

Human intelligence doesn't do any of this, and you'd want to do it all in 
hardware and localized I/O systems anyway     :-)

>> What sort of computation were you thinking of? 

As I've been saying -- set operations, pattern-matching, fuzzy retrieval, 
cloning of large contexts, non-interwoven functions/operations over *large* sets
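A rough illustration of that workload shape (the relation names are invented): large sets of facts combined by set algebra -- intersection, difference, filtered passes, whole-context cloning -- rather than per-item number crunching.

```python
# Contexts as sets of (relation, subject, object) triples -- the kind of
# widely inter-related data I'm claiming databases handle well.
context_a = {("likes", "alice", "math"), ("likes", "bob", "music"),
             ("knows", "alice", "bob")}
context_b = {("likes", "alice", "math"), ("knows", "bob", "carol")}

shared = context_a & context_b   # set intersection
only_a = context_a - context_b   # set difference
clone = set(context_a)           # cloning a whole context in one operation

# Pattern-matching: every "likes" triple, as a single filtered pass
likes = {t for t in context_a if t[0] == "likes"}
```

Each of these maps directly onto a relational operation (INTERSECT, EXCEPT, SELECT ... WHERE), which is exactly what those engines are optimized for.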

>> Also, why would 32 -> 64 bit be a problem, provided you planned for it in 
>> advance?

Name all the large, long-term projects that you know of that *haven't* gotten 
bitten by something like this.  Now, name all of the large, long-term projects 
that you know of that HAVE gotten bitten repeatedly by the state of the art 
moving past something that they have custom programmed and can't easily 
integrate.  If the second number isn't a lot larger than the first, you're not 
living in my world.    :-)


  ----- Original Message ----- 
  From: Russell Wallace 
  To: agi@v2.listbox.com 
  Sent: Tuesday, February 20, 2007 6:02 PM
  Subject: Re: [agi] Development Environments for AI (a few 
non-religious comments!)


  On 2/20/07, Mark Waser <[EMAIL PROTECTED]> wrote:

    Yes, you do -- and guess how enterprise class databases do it . . . .  
Those suckers are *seriously* optimized, particularly for set operations.  You 
could hand-code weather simulations so that they are faster than an equivalent 
system coded in a database because the data is highly localized (and I'm not 
talking about things like massive Fourier transforms) but there isn't anything 
that handles huge quantities of widely and wildly inter-related data better 
than a decent database.  With a database, you don't have to think about memory 
(and swapping, etc.) -- it's handled.  When you write your own system, you end 
up bleeding serious quantities of time and then end up on the rocks when you 
have to move from 32 to 64-bits (right, Ben?). 

  I am thinking about things like massive Fourier transforms, so maybe that's 
where we differ. Specifically, anything aiming in the direction of AGI will 
have to perform, among other things: 

  exaflop-range number-crunching runs for physical/spatial simulation, image 
analysis etc
  searches in spaces like "all possible S-expressions of less than 100 nodes 
with this set of operators" looking for a function that satisfies some criteria 

  For that sort of work, if your working set doesn't fit in RAM it'll sit down 
anyway, so you'd better assume it does; and databases aren't optimized for 
heavy number crunching unless I'm missing something. What sort of computation 
were you thinking of? 

  Also, why would 32 -> 64 bit be a problem, provided you planned for it in 
advance?


------------------------------------------------------------------------------
  This list is sponsored by AGIRI: http://www.agiri.org/email
  To unsubscribe or change your options, please go to:
  http://v2.listbox.com/member/?list_id=303 
