Eric, can you give an example of a one-way function (such as a cryptographic hash or cipher) produced by evolution or by a genetic algorithm? A one-way function f has the property that y = f(x) is easy to compute, but it is hard to find x given f and y. Other examples might be modular exponentiation in large finite groups, or multiplication of prime numbers with thousands of digits (where the hard inverse is factoring the product).
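To make the easy/hard asymmetry concrete, here is a rough Python sketch using modular exponentiation. The prime, base, and exponent are toy values picked only for illustration; real parameters would be thousands of bits.

import math  # not strictly needed, shown only to keep the sketch self-contained

# Forward direction: y = g^x mod p is cheap (square-and-multiply).
# Inverting it -- recovering x from (g, p, y) -- is the discrete logarithm
# problem, for which no polynomial-time algorithm is known.

p = 2**127 - 1          # a Mersenne prime, far too small for real cryptography
g = 3                   # base chosen only for illustration

def f(x: int) -> int:
    return pow(g, x, p)   # roughly log2(x) modular multiplications

x = 123456789
y = f(x)                  # fast
# Going back from y to x has no known shortcut; brute force would have to
# try on the order of p candidate exponents.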
By "incrementally updatable", I mean that you can make a small change to a system and the result will be a small change in behavior. For example, most DNA mutations have a small effect. We try to design software systems with this property so we can modify them without breaking them. However, as the system gets bigger, there is more interaction between components, until it reaches the point where every change introduces more bugs than it fixes and the code becomes unmaintainable. This is what happens when the system crosses the boundary from stability to chaotic. My argument for Kauffman's observation that complex systems sit on this boundary is that stable systems are less useful, but chaotic systems can't be developed as a long sequence of small steps. We are able to produce cryptosystems only because they are relatively simple, and even then it is hard. I don't dispute that learning some simple grammars is NP-hard. However, I don't believe that natural language is one of these grammars. It certainly is not "simple". The human brain is less powerful than a Turing machine, so it has no special ability to solve NP-hard problems. The fact that humans can learn natural language is proof enough that it can be done. -- Matt Mahoney, [EMAIL PROTECTED] ----- Original Message ---- From: Eric Baum <[EMAIL PROTECTED]> To: agi@v2.listbox.com Sent: Sunday, November 12, 2006 9:29:13 AM Subject: Re: [agi] Natural versus formal AI interface languages Matt wrote: Anyway, my point is that decoding the human genome or natural language is n= ot as hard as breaking encryption. It cannot be because these systems are = incrementally updatable, unlike ciphers. This allows you to use search str= ategies that run in polynomial time. A key search requires exponential tim= e, or else the cipher is broken. Modeling language or the genome in O(n) t= ime or even O(n^2) time with n =3D 10^9 is much faster than brute force cry= ptanalysis in O(2^n) time with n =3D 128. I don't know what you mean by incrementally updateable, but if you look up the literature on language learning, you will find that learning various sorts of relatively simple grammars from examples, or even if memory serves examples and queries, is NP-hard. Try looking for Dana Angluin's papers back in the 80's. If your claim is that evolution can not produce a 1-way function, that's crazy. ----- This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/?list_id=303 ----- This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/?list_id=303