My point is simply that an AGI should be able to think about such
concepts, like we do. It doesn't need to solve them. In this sense I
think it is a fundamental concern: how is it possible to have a form
of knowledge representation that can in principle capture all ideas a
human might express? Intuition suggests that there should be a simple
sufficient representation, like 1st-order logic. But 1st-order logic
isn't enough, and neither are 2nd-order logics, 3rd order...
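
For concreteness, the standard compactness argument shows why: even the
bare notion "this structure is finite" escapes first-order logic.
Suppose some first-order sentence \psi were true in exactly the finite
models. For each n, let

  \varphi_n = \exists x_1 \ldots \exists x_n \bigwedge_{i<j} x_i \neq x_j

("there exist at least n distinct elements"). Every finite subset of
{\psi} \cup {\varphi_n : n \ge 1} is satisfiable in a large enough
finite model, so by compactness the whole set has a model -- an
infinite model of \psi, a contradiction.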

Yes, there is a simple sufficient representation -- one that nature has put *a lot* of effort into evolving and continues to evolve every day (and not only that: there are numerous variants of it to study, to determine what is core and what can be varied).

It's simple, infinitely expandable, and you see it and use it every day.

Can you guess what it is?

<scroll down to see the answer>










Yes, it's ordinary human language -- whether written or spoken; English or Spanish or Chinese or whatever.





----- Original Message -----
From: "Abram Demski" <[EMAIL PROTECTED]>
To: <agi@v2.listbox.com>
Sent: Wednesday, June 18, 2008 2:20 PM
Subject: Re: [agi] the uncomputable


On Wed, Jun 18, 2008 at 9:54 AM, Benjamin Johnston
<[EMAIL PROTECTED]> wrote:
[...]
In any case, this whole conversation bothers me. It seems like we're
focussing on the wrong problems; like using the Theory of Relativity
to decide on an appropriate speed limit for cars in school zones. If
it could take 1,000 years of thought and creativity to go from BB(n)
to BB(n+1) for some n, we're talking about problems of an incredible
scale, far beyond what most of us have in mind for our first
prototypes. A challenge with the busy beaver problem is that when n
becomes big enough, you start being able to encode long-standing and
very difficult mathematical conjectures.

-Ben
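
To make the busy-beaver point concrete, here is a minimal brute-force
sketch (Python; the names and the step cutoff are my own choices). It
recovers the step-count busy beaver S(2) = 6 by exhaustive search, and
the buried assumption -- that 100 steps is "long enough" to declare a
machine non-halting -- is exactly what stops being justifiable as n
grows:

from itertools import product

STATES = ("A", "B")
CUTOFF = 100  # adequate for n = 2; picking a safe cutoff in general *is* the uncomputable part

def run(table):
    """Simulate one machine on a blank tape; return steps to halt, or None."""
    tape, pos, state = {}, 0, "A"
    for step in range(1, CUTOFF + 1):
        write, move, nxt = table[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if nxt == "H":   # halting transition still writes, moves, and counts
            return step
        state = nxt
    return None          # cut off: treated as non-halting

# Each (state, read-bit) table entry chooses: bit to write, move, next state or halt.
entries = list(product((0, 1), ("L", "R"), STATES + ("H",)))
keys = [(s, b) for s in STATES for b in (0, 1)]

best = 0
for rules in product(entries, repeat=len(keys)):  # 12**4 = 20736 machines
    steps = run(dict(zip(keys, rules)))
    if steps is not None:
        best = max(best, steps)

print("longest halting run over 2-state machines:", best)  # 6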

My point is simply that an AGI should be able to think about such
concepts, like we do. It doesn't need to solve them. In this sense I
think it is a fundamental concern: how is it possible to have a form
of knowledge representation that can in principle capture all ideas a
human might express? Intuition suggests that there should be a simple
sufficient representation, like 1st-order logic. But 1st-order logic
isn't enough, and neither are 2nd-order logics, 3rd order...

