On Thu, Apr 11, 2013 at 11:53 AM, Ben Goertzel <[email protected]> wrote:
> I have a different theory of mind ... which leads to different theories
> about developing and testing *general intelligence* systems than you have.
>
> -- Ben G

I think this is a good point. Your testing methods have to follow the mechanisms of the theories you are applying. The idea that someone who has no clue about what you are working on, or who is pursuing a totally different approach than you, can dictate how your highly innovative program should be tested is nonsense. And of course we have to exclude the voices of the extremists when we describe how those tests went.

However, the inability to deal with the hard results of a test is a major liability in so many different ways that there is no good excuse for avoiding the process. So I have been trying to define my own tests, and even though I predicted that it might take me some months before I started testing, that delay is such a negative indicator that I realized I had to sharpen my theories and find the time to get to work on the program. And I am doing that. Maybe I would have sharpened my theories in that time anyway, but I was surely motivated by my acceptance of the negative results of the preliminary tests that I created.

Of course I will have to show that these ideas do work (in the AGI model that I have talked about) to prove that they are as sharp as I feel they are, but one thing is sure: if I can't get them to work, then I haven't figured it out.

Jim Bromer

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
