Hi Bruno Marchal,

Good.
[Roger Clough], [rclo...@verizon.net]
1/12/2013
"Forever is a long time, especially near the end." - Woody Allen

----- Receiving the following content -----
From: Bruno Marchal
Receiver: everything-list
Time: 2013-01-12, 07:02:42
Subject: Re: Subjective states can be somehow extracted from brains via a computer

On 12 Jan 2013, at 11:53, Roger Clough wrote:

> Hi meekerdb
>
> Complexity need not have anything to do with intelligence.
> The critical requirement is autonomy of choice.

But that autonomy itself requires a minimal amount of complexity. The math
then shows that it is not a lot: universality is cheap. Intelligence
requires the same very minimal, but not null, complexity.

Bruno

> [Roger Clough], [rclo...@verizon.net]
> 1/12/2013
> "Forever is a long time, especially near the end." - Woody Allen
>
> ----- Receiving the following content -----
> From: meekerdb
> Receiver: everything-list
> Time: 2013-01-11, 14:20:13
> Subject: Re: Subjective states can be somehow extracted from brains via a computer
>
> On 1/11/2013 2:12 AM, Telmo Menezes wrote:
> > On Fri, Jan 11, 2013 at 1:33 AM, meekerdb wrote:
> > > On 1/10/2013 4:23 PM, Telmo Menezes wrote:
> > > > > Do you think there can be something that is intelligent but not
> > > > > complex (and use whatever definitions of "intelligent" and
> > > > > "complex" you want)?
> > > >
> > > > A thermostat is much less complex than a human brain but
> > > > intelligent under my definition.
> > >
> > > But much less intelligent.
> >
> > That's your conclusion, not mine. According to my definition you can
> > only compare thermostats being good at being thermostats and Brents
> > being good at being Brents, because you can only compare intelligence
> > against the same set of goals. Otherwise you're just saying that
> > intelligence A is more complex than intelligence B. Human intelligence
> > requires a certain level of complexity, bacteria intelligence another.
> > That's all.
>
> So you've removed all meaning from intelligence.
> Rocks are smart at being rocks; we just have to recognize that their
> goal is to be rocks.
>
> Maybe we can stop dancing around the question by referring to
> human-level-intelligence and then rephrasing the question as, "Do you
> think human-like-intelligence requires human-like-complexity?"
>
> Brent
>
> --
> You received this message because you are subscribed to the Google
> Groups "Everything List" group.
> To post to this group, send email to everything-list@googlegroups.com.
> To unsubscribe from this group, send email to
> everything-list+unsubscr...@googlegroups.com.
> For more options, visit this group at
> http://groups.google.com/group/everything-list?hl=en.

http://iridia.ulb.ac.be/~marchal/
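Bruno's aside that "universality is cheap" has a concrete witness: Matthew Cook proved that Rule 110, an elementary cellular automaton whose entire transition table fits in a single byte, is Turing-universal. A minimal sketch of such an automaton (the function names `step` and `run` are illustrative, not from the thread):

```python
def step(cells, rule=110):
    """One synchronous update of a 1-D binary cellular automaton.

    Each cell's next state is looked up from the bits of the rule
    number, indexed by the 3-cell neighbourhood (left, centre, right).
    Boundaries wrap around (circular lattice).
    """
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=10, rule=110):
    """Start from a single live cell near the right edge and iterate,
    returning the full history (steps + 1 rows)."""
    row = [0] * width
    row[-2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    # Render the familiar left-growing Rule 110 triangle as text.
    for row in run():
        print("".join("#" if c else "." for c in row))
```

The whole "program" of the automaton is the integer 110; all the apparent structure comes from iterating an 8-entry lookup table, which is the sense in which universality requires very little, but not zero, complexity.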