Well, just to finish: the plan is that I will start by defining and
using a simple language to express simple ideas. At first I will use
extensive annotation to help define how the program should use the
language. Since I will also be modifying the program as I go, the
programming and the language together will create a simulation of how
the program might work. Of course it will be simplistic, and it will
rest on the artificiality of using the annotations (as well as other
analysis of the language) to determine some of the actions the program
should take in trying to use the ideas expressed. I will not be
programming the response mechanisms to make it seem as though the
program knew things it had not learned; rather, I will be designing
the program to use the 'knowledge' it has previously derived through
its exchanges with me, the 'user'.
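To make that idea concrete, here is a minimal, hypothetical sketch in Python. The class name, the bracketed annotation syntax, and the example sentences are all invented for illustration; the point is only the principle described above: the program stores what the user has told it via annotated statements, and answers strictly from that derived 'knowledge' rather than pretending to know things it has not learned.

```python
import re

class CrudeSimulation:
    """A toy stand-in for the annotated-language idea (all details invented)."""

    def __init__(self):
        # (subject, relation) -> value, built up only through exchanges.
        self.knowledge = {}

    def tell(self, utterance):
        # An annotated statement like "a cat is an animal [isa: cat, animal]"
        # carries a bracketed annotation telling the program how to read it.
        m = re.search(r"\[(\w+):\s*(\w+),\s*(\w+)\]", utterance)
        if m:
            relation, subject, value = m.groups()
            self.knowledge[(subject, relation)] = value
            return "noted"
        return "no annotation found; I cannot use this yet"

    def ask(self, subject, relation):
        # Answer only from previously derived knowledge; never fabricate.
        if (subject, relation) in self.knowledge:
            return self.knowledge[(subject, relation)]
        return "I have not learned that"

sim = CrudeSimulation()
sim.tell("a cat is an animal [isa: cat, animal]")
print(sim.ask("cat", "isa"))   # answers from the earlier exchange
print(sim.ask("dog", "isa"))   # admits it has not learned this
```

As the annotations became simpler over time (per the plan above), the `tell` step would have to do more of the analysis itself.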

So at first it will be a crude simulation of an AGI program. I will be
programming it, using the language as well as the developing program,
to act as I wish it would act. As the program starts to learn some
things for itself, I will then design it to use that kind of learned
knowledge more effectively, and I will annotate the text in a way that
better simulates what the program might eventually be able to do,
based on my observations of how it reacts. Over time, as I learn what
is feasible, my annotations should look more like the sort of thing
the program could actually derive for itself. My hope is that I will
eventually find practical (or feasible) methods for the program to
use, so that my annotations become simpler and more casual. It would
be impossible to annotate every detail needed to understand a language
that resembled a natural language.

At first my annotations will not look like the kinds of relations a
computer program could actually detect, but eventually my simulations
should look more and more like the kinds of relations a feasible
program could detect, and the program will be able to figure out more
for itself.


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424