On 27 Feb 2012, at 23:15, Craig Weinberg wrote:
On Feb 27, 4:52 pm, meekerdb <meeke...@verizon.net> wrote:
On 2/27/2012 1:09 PM, Craig Weinberg wrote:
On Feb 27, 3:32 pm, meekerdb <meeke...@verizon.net> wrote:
On 2/27/2012 11:54 AM, Craig Weinberg wrote:
AIs can generate their own software. That is the point of AI.
They don't have to generate their own software, though; we have to tell
them to do that and specify exactly how we want them to do it.
Not exactly. AI learns from interactions which are not known to those who
write the AI program.
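[For concreteness, a minimal sketch in Python of what "learning from interactions not known to the programmer" can mean. The update rule and the toy data below are illustrative assumptions only, not anyone's actual system:]

# Illustrative sketch: the programmer writes only the update rule; the
# learned weights depend entirely on interaction data seen at run time,
# which the programmer never inspects.
def make_learner(lr=0.1):
    weights = [0.0, 0.0]

    def observe(x, target):
        # online least-mean-squares update from one interaction
        pred = weights[0] * x[0] + weights[1] * x[1]
        err = target - pred
        weights[0] += lr * err * x[0]
        weights[1] += lr * err * x[1]

    def predict(x):
        return weights[0] * x[0] + weights[1] * x[1]

    return observe, predict

# Hypothetical usage: the stream of (input, target) pairs comes from the
# environment, so the final behaviour is not specified in the source code.
observe, predict = make_learner()
for x, target in [((1.0, 0.0), 2.0), ((0.0, 1.0), -1.0), ((1.0, 1.0), 1.0)]:
    observe(x, target)
print(predict((1.0, 1.0)))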
...when we program them specifically to 'learn' in the exact ways which we
want them to.
They can learn by higher-level program modifications too, and those can
also be random.
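[Again only an illustrative sketch, this time of "higher-level program modifications that can be random": a hill-climber that proposes random changes to its own rule table and keeps those that score better. The scoring function is an arbitrary stand-in:]

import random

# Illustrative sketch: the "program" here is a small rule table; changes to
# it are proposed at random and kept only when they improve a score, so the
# final table is not written by the programmer.
def score(table):
    # arbitrary stand-in objective: prefer tables close to a hidden target
    target = [3, 1, 4, 1, 5]
    return -sum((a - b) ** 2 for a, b in zip(table, target))

table = [0, 0, 0, 0, 0]
for _ in range(1000):
    candidate = list(table)
    i = random.randrange(len(candidate))
    candidate[i] += random.choice([-1, 1])   # random higher-level modification
    if score(candidate) >= score(table):
        table = candidate                    # keep it only if it does better
print(table)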
So there is no evidence that their learning is qualitatively
different from yours.
There is no such thing as evidence when it comes to qualitative
phenomenology. You don't need evidence to infer that a clock doesn't
know what time it is.
A clock has no self-referential ability.
You reason like this: no animal can fly, because pigs cannot fly.
Bruno
http://iridia.ulb.ac.be/~marchal/