I am going to write a quick AI-Simulator. If it works, I should be able
to make it capable of demonstrating some emergence on its own, even
though I am starting with a system that allows me to encode input. It
will also try to interpret input. I am going to be developing my
theories about using highly integrable conceptual cross-categorization,
cross-generalization, and cross-abstraction to create more intelligent
AI. This is not going to be the same as old AI, although I will be
using text I/O. To do this quickly, I will be abandoning the
long-standing data and database management code that I planned to use
for an AGI project, so that I can cut the program design down and make
it as simple as possible. I will use a few of the classes I have
already developed for the data structures, but I will start with only
three fundamental data structures. (I will probably need a few more for
data management, but simplicity is a key design element.) I will not be
using the active file data access methods that I developed; instead, I
will just open the file, run the code in virtual memory, and save the
file by menu selection when I am done. (If the program works, I can add
updates-only file saving as needed, since I have already developed
those kinds of functions.) I also intend to use the
cross-categorization, generalization, and abstraction systems to
develop indexing for the system.
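As a purely hypothetical sketch of that last idea, cross-categorization links could double as the index: each concept records the categories it belongs to, and the reverse mapping from category to concepts is the lookup structure. Every name here (Concept, ConceptStore, and so on) is my own illustrative assumption, not Jim's actual design.

```python
class Concept:
    """A labelled concept with its cross-categorization links."""
    def __init__(self, label):
        self.label = label
        self.categories = set()  # categories this concept is linked into


class ConceptStore:
    """Holds all concepts in memory; the category map serves as the index."""
    def __init__(self):
        self.concepts = {}        # label -> Concept
        self.category_index = {}  # category -> set of concept labels

    def add(self, label, categories=()):
        concept = self.concepts.setdefault(label, Concept(label))
        for cat in categories:
            concept.categories.add(cat)
            # updating the reverse map is what makes categorization
            # double as indexing
            self.category_index.setdefault(cat, set()).add(label)
        return concept

    def lookup(self, category):
        """Retrieve all concepts filed under a category, via the index."""
        return sorted(self.category_index.get(category, ()))
```

With this shape, retrieving everything categorized as, say, "animal" is a single dictionary lookup rather than a scan of every concept, which is one way the categorization system itself could provide the indexing.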

So the program will allow me to program the AI (which makes it a
simulation) using a labelling (or annotation) of the input. I will not
be using a context-free compiler or interpreter or anything like that.
I expect that I will need to develop the encoding language (the
AI-Simulation language) as I go, but it is not going to be a highly
developed formal programming language; I want it to run into some
ambiguities. It will also need to be able to interpret statements
using the AI as it is developed. When it does run into interpretation
problems, I intend to use text to clarify any confusing
interpretations or to disambiguate as needed. (One of my AI principles
involves something I call generalization-levels, so clarification,
explanation, and disambiguation are relative not only to general
context levels but also to a variety of kinds of specific contexts.)
So, if it works at all, it should be unlike any other AI program that
I know about.
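To make the annotation-and-disambiguation idea concrete, here is one minimal sketch under my own assumptions: each input statement carries labels, and the interpreter returns every consistent reading instead of forcing one, so that ambiguity becomes a signal to ask for clarifying text. The data shapes and function names are illustrative, not the actual AI-Simulation language.

```python
def interpret(statement, annotations, senses):
    """Return the readings of a statement consistent with its annotations.

    senses maps a statement to candidate readings, each a dict with a
    'labels' set and a 'meaning'. The annotations are the labelling
    attached to the input.
    """
    candidates = senses.get(statement, [])
    matches = [s for s in candidates
               if annotations.issubset(s["labels"])]
    if len(matches) > 1:
        # Ambiguity is allowed to surface: rather than guessing,
        # report all readings so clarifying text can disambiguate.
        return ("ambiguous", [s["meaning"] for s in matches])
    if matches:
        return ("ok", matches[0]["meaning"])
    return ("unknown", None)
```

Adding one more annotation (extra clarifying text) narrows the candidate set, which is roughly the behavior described above: the language stays informal, and disambiguation happens through further labelled input rather than a formal grammar.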
Jim Bromer


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now