>
> Obviously you didn't consider the potential a laptop has with its
> network connection, which in theory can give it all kinds of
> perception by connecting it to some input/output device.


Yes, that's true ... I was considering the laptop with only a power cable
as the "AI system in question."  Of course my point does not apply to a
laptop being used as the on-board control system for an android robot, or
a laptop connected to a network of sensors and actuators via the net,
etc.  Sorry I did not clarify my terms better!

Similarly, the human brain in isolation lacks much proprioception and
motor control, and probably could not achieve a high level of general
intelligence without the right peripherals (such as the rest of the human
body ;-)


> Even if we exclude the network, your conclusion is still problematic.
> Why can't a touchpad provide proprioceptive perception? I agree it
> usually doesn't, because of the way it is used, but that doesn't mean
> it cannot, under every possible usage. The same is true for the
> keyboard. The current limitation of the standard computer lies more in
> the way we use it than in the hardware itself.
>

I understand that a keyboard and touchpad do provide proprioceptive
input, but I think it is too feeble, and too insensitively responsive to
changes in the environment and in the relation between the laptop and the
environment, to serve as the foundation for a robust self-model or a
powerful general intelligence.



>
> > to form a physical self-image based on its perceptions ... hence a
> > standard laptop will not likely be driven by its experience to
> > develop a phenomenal self ... hence, I suspect, no generally
> > intelligent mind...
>
> Of course it won't have a visual concept of "self", but a system like
> NARS has the potential to grow into an intelligent operating system,
> with a notion of "self" based on what it can feel and do, as well as
> the causal relations among them --- "If there were a file in this
> folder, I should have felt it; it cannot be there, because I've deleted
> the contents".


My suggestion is that the file system lacks the complexity of structure
and dynamics needed to support the emergence of a robust self-model and a
powerful general intelligence...

Not in principle ... potentially a file system *could* display the needed
complexity, but I don't think any file systems on laptops now come close...

Whether the Internet as a whole contains the requisite complexity is a
subtler question.
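
To make that file-folder example concrete, here's a toy sketch in Python
(purely illustrative, and emphatically not how NARS actually represents
or infers anything): a system that keeps an explicit record of its own
actions and uses that record to explain what it perceives.

    class FileSelfModel:
        """Tracks what *I* have done, and predicts what *I* should
        perceive. The class and method names here are made up for
        illustration."""

        def __init__(self):
            self.my_deletions = set()  # paths this system believes it deleted

        def record_delete(self, path):
            self.my_deletions.add(path)

        def explain_perception(self, path, perceived_present):
            # "If there were a file in this folder, I should have felt
            #  it; it cannot be there, because I've deleted the contents."
            if perceived_present and path in self.my_deletions:
                return ("Conflict: I deleted " + path + ", so either my "
                        "memory of my own action is wrong or something "
                        "else recreated it.")
            if not perceived_present and path in self.my_deletions:
                return "Consistent: " + path + " is absent because I deleted it."
            return "No self-relevant explanation for " + path + "."

    model = FileSelfModel()
    model.record_delete("/tmp/scratch/notes.txt")
    print(model.explain_perception("/tmp/scratch/notes.txt",
                                   perceived_present=False))

The only point of the sketch is that "self" here means an explicit,
queryable record of the system's own actions, with nothing visual or
humanlike about it.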

>
>
> I know some people won't agree there is a "self" in such a system,
> because it doesn't look like them. Too bad human intelligence is the
> only known example of intelligence ...


I would call a "self" any internal, explicit model that a system creates
that allows it to predict its own behaviors in a sufficient variety of
contexts...  This need not have a visual aspect, nor any great similarity
to a human self.
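
As a minimal sketch of that definition (with made-up behaviors, purely
for illustration): a "self" in this sense is just an explicit internal
model whose predictions of the system's own behavior can be checked
across a variety of contexts.

    def system_behavior(context):
        # the system's actual policy (a hypothetical example:
        # delete temp files when free disk space is low)
        return "delete_temp" if context["disk_free"] < 0.1 else "idle"

    def self_model(context):
        # the system's explicit model of *itself*; imperfect on purpose,
        # to show the model and the behavior are distinct things
        return "delete_temp" if context["disk_free"] < 0.2 else "idle"

    # check the self-model's predictions across varied contexts
    contexts = [{"disk_free": f / 10} for f in range(10)]
    hits = sum(self_model(c) == system_behavior(c) for c in contexts)
    print("self-model predicted own behavior in",
          hits, "of", len(contexts), "contexts")

A system whose self-model scores well across a wide enough variety of
contexts would, on this definition, have a "self" -- however alien its
sensorium.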

-- Ben


