On Thursday 04 September 2008, Valentina Poletti wrote:
> When we want to step further and create an AGI I think we want to
> externalize the very ability to create technology - we want the
> environment to start adapting to us by itself, spontaneously by
> gaining our goals.

There is a sense of resilience in the whole scheme of things. It's not 
hard to show how stupid each one of us can be in a single moment; yet 
our stupid decisions don't [often] blow us up - that's not so much 
luck as resilience. In an earlier email, to which I replied today, 
Mike was looking for a resilient computer that didn't need code. 

On another note: goals are an interesting folk-psychology mechanism. 
I've seen other cultures project their own goals onto their 
environment - much as the brain contains a map of the skin for 
sensory representation, they map their own goals and aspirations in 
life onto the environment. What alternatives to goals could you use 
when programming? Otherwise you'll not end up with Mike's requested 
'resilient computer', as I'm calling it.
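To make that question concrete, here is a toy sketch (my own illustration, not anything from the email) contrasting a goal-directed controller, which stores an explicit goal, with a purely reactive one in the spirit of subsumption-style robotics, which has no stored goal and only responds to the current stimulus. All names are hypothetical:

```python
# Toy contrast: explicit-goal control vs. goal-free reactive control.
# Illustrative sketch only; names are invented for this example.

def goal_directed_step(state, goal):
    """Move one unit toward an explicitly represented goal."""
    if state < goal:
        return state + 1
    if state > goal:
        return state - 1
    return state

def reactive_step(state, stimulus):
    """No stored goal: a fixed stimulus-response rule.
    Any 'goal-like' behavior emerges from the environment."""
    if stimulus > 0:
        return state + 1
    if stimulus < 0:
        return state - 1
    return state

# Goal-directed: converges because the goal is internal.
s = 0
for _ in range(5):
    s = goal_directed_step(s, 3)

# Reactive: ends up at the same place only because the
# environment happens to supply the right stimuli.
r = 0
for stimulus in [1, 1, 1, 0, 0]:
    r = reactive_step(r, stimulus)
```

The reactive version is arguably closer to a 'resilient computer': there is no internal goal representation to corrupt, only the standing relationship between agent and environment.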

- Bryan
________________________________________
http://heybryan.org/
Engineers: http://heybryan.org/exp.html
irc.freenode.net #hplusroadmap


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=111637683-c8fa51
Powered by Listbox: http://www.listbox.com
