Richard,

> So long as the general response to the complex systems problem is not "This
> could be a serious issue, let's put our heads together to investigate it",
> but "My gut feeling is that this is just not going to be a problem", or
> "Quit rocking the boat!", you can bet that nobody really wants to ask any
> questions about whether the approaches are correct, they just want to be
> left alone to get on with their approaches.

Both Ed Porter and I have given serious thought to the "complex systems
problem," as you call it, and have discussed it with you at length.  I have
also read the only formal paper you sent me dealing with it (albeit somewhat
indirectly), as well as your various online discourses on the topic.

Ed and I don't agree with you on the topic, but not for lack of thought or
attention.

Your argument FOR the existence of a "complex systems problem" with Novamente
or OpenCog is no more rigorous than our argument AGAINST it.

Similarly, I have no rigorous argument that Novamente and OpenCog won't fail
for lack of a soul.  I can't prove this formally -- and even if I could,
those who believe a soul is necessary for AI could always dispute the
mathematical assumptions of my proof.  And those who do claim a soul is
necessary have no rigorous arguments in their favor, except ones based
transparently on assumptions I reject...

And so it goes...

Ben


-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/