RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-14 Thread John Rose
> -----Original Message-----
> From: Jim Bromer via AGI
>
> There are some complications of the experience of our existence, and those complications may be explained by the complex processes of mind. Since we can think, we can think about the experience of life and interweave the strands

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-14 Thread Jim Bromer via AGI
If you want to talk about user experience, then do so (as you just did). Do not conflate human experience with programming constructs and then find a tangential argument to justify it. Jim Bromer

On Fri, Sep 14, 2018 at 12:21 PM John Rose wrote:
> > -----Original Message-----
> > From: Jim Bromer v

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-14 Thread Nanograte Knowledge Technologies via AGI
I think "Last Mile Engineering" is everything... even for AI. Principles for the survival of technology have not changed over the past five decades. For mankind, the user interface is the only protocol. As translators, we need to bear in mind that chasing the AGI dragon won't exclude our results fro

Re: [agi] Growing Knowledge

2018-09-14 Thread Jim Bromer via AGI
I am not talking about growing knowledge from a domain base. I am talking about growing knowledge from interactions with human users, who will provide the system with facts and relationships, usually expressed in a natural (but austere) form, which the program would then have to integrate and explore
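
A minimal sketch of what such incremental integration might look like, assuming the user-supplied facts and relationships arrive as subject/relation/object triples. The KnowledgeBase class and its add/query methods are my own illustrative assumptions, not a design described in the thread:

    # Hypothetical sketch: grow a knowledge store from user-supplied triples.
    class KnowledgeBase:
        def __init__(self):
            self.facts = set()  # (subject, relation, object) triples

        def add(self, subject, relation, obj):
            """Integrate a new fact stated by a human user."""
            self.facts.add((subject, relation, obj))

        def query(self, subject=None, relation=None, obj=None):
            """Explore stored facts matching a partial pattern (None = wildcard)."""
            return [f for f in self.facts
                    if (subject is None or f[0] == subject)
                    and (relation is None or f[1] == relation)
                    and (obj is None or f[2] == obj)]

    kb = KnowledgeBase()
    kb.add("whale", "is-a", "mammal")
    kb.add("mammal", "has", "lungs")
    print(kb.query(subject="whale"))   # [('whale', 'is-a', 'mammal')]

The point of the sketch is only that integration here is accumulation plus pattern-matched exploration; anything beyond that (inference, contradiction handling) is left open, as in the original message.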

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-14 Thread John Rose
> -----Original Message-----
> From: Matt Mahoney via AGI
>
> When we say that X is more conscious than Y, we really mean that X is more like a human than Y.
>
> The problem is still there: how to distinguish between a p-zombie and a conscious being.
>
> The definition of a p-zombie makes this i

Re: [agi] Judea Pearl on AGI

2018-09-14 Thread Jim Bromer via AGI
That is not quite true. Each game could be reduced to a conveniently finite number of reactions and principles. So if someone wanted to waste his time, he could create a simple physics-like modelling program that could learn to play the games. The complexities could be refined or reduced to a relati
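
As a toy illustration of what "reduced to a finite number of reactions" could mean in practice: learn a state-to-reaction table for a trivially simplified game. The game, the reaction set, and the learning rule below are all my own hypothetical sketch, not anything proposed in the thread:

    # Hypothetical sketch: learn a state -> reaction table for a reduced game.
    import random
    from collections import defaultdict

    ACTIONS = ["left", "stay", "right"]        # the finite set of reactions

    def play(state, action):
        """Toy 'physics': the paddle should track the ball's position."""
        return 1.0 if action == ("left", "stay", "right")[state + 1] else 0.0

    value = defaultdict(float)                 # (state, action) -> running score
    for _ in range(5000):
        state = random.choice([-1, 0, 1])      # ball left of / at / right of paddle
        action = random.choice(ACTIONS)
        reward = play(state, action)
        value[(state, action)] += 0.1 * (reward - value[(state, action)])

    policy = {s: max(ACTIONS, key=lambda a: value[(s, a)]) for s in (-1, 0, 1)}
    print(policy)   # converges to {-1: 'left', 0: 'stay', 1: 'right'}

Once the state and reaction spaces are made finite like this, "learning the game" collapses to filling in a table, which is the sense in which the complexities can be refined or reduced.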

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-14 Thread John Rose
> -----Original Message-----
> From: Matt Mahoney via AGI
>
> It's relevant if consciousness is the secret sauce, and if it applies to the complexity problem.
>
> Jim is right. I don't believe in magic.

A Recipe for a Theory of Mind: Three pints of AIT (Algorithmic Information Theory) Ale
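
Since the snippet invokes Algorithmic Information Theory, here is one standard (and admittedly crude) way AIT quantities get operationalized: approximating Kolmogorov complexity by compressed size. The example and the strings in it are my own illustration, not part of the recipe above:

    # Crude upper bound on algorithmic information: compressed length.
    import os
    import zlib

    def K_approx(s: bytes) -> int:
        """Approximate K(s) by the length of s after zlib compression."""
        return len(zlib.compress(s, 9))

    regular = b"ab" * 500        # highly patterned: short description exists
    noisy = os.urandom(1000)     # incompressible: no short description

    print(K_approx(regular), K_approx(noisy))
    # The regular string compresses far smaller than the random one,
    # mirroring the AIT claim that structure = low complexity.
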