Hi Benjayk,
> Bruno Marchal wrote:
>> We just cannot do artificial intelligence in a provable manner. We
>> need chance, or luck. Even if we get some intelligent machine, we
>> will not know it for sure (perhaps just believe it correctly).
> But this is a quite weak statement, isn't it? It just prevents a
> mechanical way of making an AI, or making a provably friendly AI
> (like Eliezer Yudkowsky wants to do).
Yes, it is quite weak. It can even be made much weaker if we allow
machines to make enough mistakes for an indeterminate period of time.
In that case, some necessarily non-constructive proofs can be made
constructive. After all, evolution itself is plausibly mechanical.
> We can prove very little about what we do or "know" anyway. We can't
> prove the validity of science, for example.
You are right, but here the point is more subtle. Most initial
theoretical statements are not provable, but we can take them as new
axioms without becoming inconsistent. Most "theological" statements of
the machines/numbers, however, have the property that, despite being
true, they become false when added as an axiom.
It is a bit like a theory with five axioms. You cannot add a sixth
axiom saying that the theory has five axioms. Self-consistency and
consciousness behave similarly. Human science and theological science
are full of things of that kind, I mean truths which just cannot be
asserted, except very cautiously. In fact the modal logic G* minus G
axiomatizes them all (at the propositional level).
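A minimal sketch of that G/G* split, in provability-logic notation (my gloss, not part of the original message): read \(\Box p\) as "the machine proves \(p\)", so \(\neg\Box\bot\) expresses the machine's consistency. Then

\[
G^{*} \vdash \neg\Box\bot \qquad\text{yet}\qquad G \nvdash \neg\Box\bot,
\]

and, instantiating Löb's axiom \(\Box(\Box p \to p) \to \Box p\) with \(p = \bot\),

\[
G \vdash \Box\neg\Box\bot \to \Box\bot .
\]

So consistency is true of the sound machine (a theorem of G*) but not provable by it, and a machine that asserts its own consistency thereby becomes inconsistent.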
That is perhaps the source of this very deep 'truth': hell is paved
with good intentions.
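The five-axiom example can be made concrete with a toy sketch (mine, purely illustrative): represent a theory as a set of axiom strings, and watch a true statement about the theory's size falsify itself the moment it is adopted as an axiom.

```python
# Toy model (illustrative only): a "theory" is just a set of axiom strings.
theory = {"A1", "A2", "A3", "A4", "A5"}

# The statement is true of the theory as it stands...
claim = f"this theory has {len(theory)} axioms"
assert len(theory) == 5  # the claim holds right now

# ...but adopting the claim as a sixth axiom makes it false:
theory.add(claim)
assert len(theory) == 6
assert claim == "this theory has 5 axioms"  # still asserts 5 -- now false
```

The point survives the crudeness of the model: the statement was true, and nothing but the act of adding it changed its truth value.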
> It doesn't even mean that there is no developmental process that
> will allow us to create ever more powerful heuristics with which to
> find better AIs faster in a quite predictable way (not predictable
> what kind of AI we will build, just *that* we will build a powerful
> AI), right?
Yes, that is possible. Heuristics are typically not algorithmic.
Bruno
http://iridia.ulb.ac.be/~marchal/
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en.