Marcus G. Daniels on 01/08/2008 02:18 PM:
> Glen E. P. Ropella wrote:
>> Well, correct me if I'm wrong, but GP currently requires a human to set
>> up the objective function.  And even in the cases where a system is
>> created so that the objective function is dynamically (and/or
>> implicitly) evolved, my suspicion is that the GP would soon find a
>> computational exploit that would result in either an infinite loop
>> (and/or deadlock), crash, or some sort of "exception".
>>   
> The objective function can be to an extent arbitrary and self-defined by 
> the agent, but there must be a large implicit emphasis on avoiding 
> death.  In a simulated world, a way to deal with exceptions is to trap 
> them, and then reflect that in the objective function.  Existing memory 
> management hardware, operating systems and programming languages have 
> good facilities for trapping exceptions.

Aha!  What you're implicitly referring to, here, is an assemblage
(though not a holarchy) of formal systems ... just like I suggested. [grin]

If inference within one of the formal systems (e.g. memory allocation)
reaches an impasse, the machine hops out of that system and into one
with a different semantic grounding (e.g. the OS or a hardware driver),
which takes over and "plugs the hole".  After the hole is plugged, it
hops back inside the prior formal system and continues on.
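
A toy sketch of what I mean (Python, and every name below is hypothetical,
not anybody's actual API): inference in an "inner" formal system (say,
memory allocation) hits an impasse, control hops out to a system with a
different semantic grounding (say, the OS layer), that system plugs the
hole, and then we hop back in.

  # purely illustrative sketch; none of these names are real APIs
  class Impasse(Exception):
      """Inference inside a formal system got stuck."""

  def inner_system(world):
      # the memory-allocation "formal system"
      if "free_block" not in world:
          raise Impasse("nothing to allocate")
      return "allocated block at %#x" % world["free_block"]

  def outer_system(world):
      # the OS / driver layer: same world, different grounding
      world["free_block"] = 0x1000   # plug the hole

  def run(world):
      try:
          return inner_system(world)
      except Impasse:
          outer_system(world)          # hop out
          return inner_system(world)   # hop back in and continue on

  print(run({}))   # -> "allocated block at 0x1000"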

_Or_ the latter formal system, through its inference, modifies the former
formal system (new axiom, new alphabet, whatever) such that the previous
exception can no longer obtain.  Of course, in that case, it's probably
true that the inference in the former system has to be re-run from the
start rather than picking up where it left off... but, hey, c'est la vie.
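
Same caveats for that second variant (hypothetical names again): the outer
system doesn't patch the world, it rewrites the inner system itself (adds
an "axiom"), and inference gets re-run from scratch over the enlarged
axiom set.

  # illustrative sketch only; the "axioms" are just predicates
  class Impasse(Exception):
      pass

  def derive(x, axioms):
      while True:
          try:
              # inference from the start, over the current axiom set
              for name, axiom in axioms:
                  if axiom(x):
                      return "derived %r via axiom %r" % (x, name)
              raise Impasse(x)
          except Impasse:
              # the outer system adds an axiom so the same exception
              # can no longer obtain on the next pass
              axioms.append(("catch_all", lambda n: True))

  print(derive(4, [("positive_odd", lambda n: n > 0 and n % 2 == 1)]))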

The reason I suggested a holarchy rather than just an ad hoc assemblage
of systems, however, is that it's unlikely we'd be able to design an
assemblage of formal systems that handles every exception.  (Sorry for
repeating myself...)  So, what's necessary is either on-the-fly
generation of new systems along with on-the-fly re-architecting of the
assemblage OR a holarchy where every sub-system, regardless of what
level it's at, is further composed of sub-sub-systems.
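
A rough sketch of that last alternative (once more, all of the names here
are made up for illustration): every holon is itself an assemblage of
sub-holons, recursively, so a problem that nothing at one level can absorb
percolates up to the enclosing holon instead of killing the machine.

  # hypothetical sketch of a holarchy as a recursive handler structure
  class Holon:
      def __init__(self, name, rule=None, parts=()):
          self.name, self.rule, self.parts = name, rule, list(parts)

      def handle(self, problem):
          for part in self.parts:              # try the sub-sub-systems first
              result = part.handle(problem)
              if result is not None:
                  return result
          if self.rule is not None and self.rule(problem):
              return "handled %r at %s" % (problem, self.name)
          return None                          # punt to the enclosing holon

  allocator = Holon("allocator", rule=lambda p: p == "out-of-memory")
  kernel    = Holon("kernel",    rule=lambda p: p == "bad-driver", parts=[allocator])
  robot     = Holon("robot",     rule=lambda p: True,              parts=[kernel])

  print(robot.handle("out-of-memory"))   # absorbed down at the allocator
  print(robot.handle("flat-tire"))       # percolates all the way up to the robot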

> runtime.  This is all in the context of a simulated environment, of 
> course.  In the robot example, the robots would just slump on the ground 
> or jump up and down or whatever until its energy supplies were exhausted. 

Well, this is another example of fragility to ambiguity and, to some
extent, is the heart of my cheap-shot criticism of RR's concept.  The
robot should fail gracefully (like living systems do).  A robot endowed
with the holarchy of formal systems would do everything in its power to
avoid slumping on the ground or doing something over and over with no
discernible effect.  I.e. it would _explore_ not only its own repertoire
(determined by the formal systems) but also the repertoire of its
environment.  Hence, a robot would find ways to harness things in its
environment to plug any holes (resolve any ambiguities) it couldn't
otherwise plug.
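
If it helps, here's the degenerate version of that in code (purely
illustrative, nothing below is meant as an actual control loop): before
slumping, the robot walks its own repertoire first, then whatever the
environment offers, and only then degrades.

  # hypothetical sketch; the "repertoires" are just lists of predicates
  def fail_gracefully(problem, own_repertoire, environment):
      for action in own_repertoire:        # explore what the formal systems provide
          if action(problem):
              return "plugged the hole internally"
      for resource in environment:         # then explore the environment's repertoire
          if resource(problem):
              return "plugged the hole by harnessing the environment"
      return "degrade gracefully; don't just slump"

  print(fail_gracefully("fried cpu",
                        own_repertoire=[lambda p: p == "low battery"],
                        environment=[lambda p: "cpu" in p]))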

E.g. a troubled robot may well find itself replacing its aging
transistor-based "computer" with, say, a bag of wet fat/meat it harvests
from that annoying human who lives in the apartment next door.

--
glen e. p. ropella, 971-219-3846, http://tempusdictum.com
Do not fear to be eccentric in opinion, for every opinion now accepted
was once eccentric. -- Bertrand Russell

