> From: Samantha Atkins [mailto:[EMAIL PROTECTED]
> 
> On Dec 28, 2007, at 5:34 AM, John G. Rose wrote:
> >
> > Well I shouldn't berate the poor dude... The subject of rationality is
> > pertinent though as the way that humans deal with unknown involves
> > irrationality, especially in relation to deistic belief establishment.
> > Before we had all the scientific instruments and methodologies
> > irrationality
> > played an important role. How many AGIs have engineered
> > irrationality as
> > functional dependencies? Scientists and computer geeks sometimes
> > overly
> > apply rationality in irrational ways. The importance of irrationality
> > perhaps is underplayed as before science, going from primordial
> > sludge to
> > the age of reason was quite a large percentage of man's time spent in
> > existence... and here we are.
> 
> Methinks there is no clear notion of "rationality" or "rational" in
> the above paragraph.  Thus I have no idea of what you are actually
> saying.    Rational is not synonymous with science.   What forms of
> irrationality do you think have a place in an AGI and why?   What does
> the percentage of time supposedly spent in some state have to do with
> the importance of such a state especially with respect to an AGI?
> 

What I am trying to zero in on, Samantha, is the methodology of reasoning
that humankind uses to deal with unknowns. Example: 10,000 years ago, the
sun - it's hot, it comes up every day, it gives life, and without it the
plants die. BUT you, being the avant-garde answer-finder of the local tribe
of semi-civilized folk, DON'T have much in the way of science or Boolean
logic to start figuring out what it really is. So various approaches are
used to identify it, assign it utility, and make that reasoning part of the
everyday operations of the people. The rationality that gets used is mixed
with irrationality. Why? Because we are not following clear-cut
probabilities here; other processes are involved, and if those fall within
your understanding of what "rational" is, please feel free to enlighten me.
Man is not a purely rational being, and if reasoning in an AGI is based on
just maximizing probabilities, that is not enough.

You could say, well, I want a pure intelligence that is 100% rational, and
man's intelligence deviates from pure. It probably does, but a pure
intelligence may not deem man's (a human's) existence a rational
expenditure of resources and may want to terminate him. This is obviously
bad. We biological blobs of useless, resource-consuming waste want a pure
intelligence to keep us around (but not like in the Matrix :)). So how do
we fit in rationally, or do we make exceptions? It is not pure probability
optimization. There is "irrationality," for lack of a better term; IOW,
this "irrationality" needs to be explored more and broken apart. The
irrationality is relative; it is a mask, a deception device, it has social
functions, etc., etc...

John

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=80187072-0b0307
