will develop reasons to find humans useful is the only way to ensure an AI will remain friendly.
Our carbon-based DNA may continue to serve as a vector for validating whatever
programming a silicon-based intelligence may devise.
Carbon-based life may also have value as sensory appendages for silicon life.
Humans may have to repay the AI by providing social and emotional
feedback as entertainment.
We may indeed find that our future form becomes enhanced so that, like Odo
the shape-shifting lifeform, the familiar human form becomes only one of many
iterations of enhanced carbon-based life.
So to ensure an AI keeps us around, it might be necessary to pre-load an
instinctual parental feeling, a sort of "mothering up" or imprinting, which
should be integrated deeply within its source code.
Some thought should be given to which social mores to instill in the AI.
Does it become necessary to create a "religious" or emotional bond?
And how exactly can such traits be converted into source code?
Given the inability of humans to cooperate and collaborate on a global scale to date,
perhaps the first task given to an AI should be the challenge of
domesticating the human species to make it worthy of a long-term association.
Morris
On 9/17/06, Shane Legg <[EMAIL PROTECTED]> wrote:
On 9/17/06, Brian Atkins <[EMAIL PROTECTED]> wrote:
I'm still not convinced you're both talking about exactly the same thing, but
I'll leave it for you two to decide. I would definitely suggest for anyone
working on FAI debunking to make sure you are aiming at the right target.
It would be much easier to aim at the right target if the target was
properly defined. There are endless megabytes of text about friendly
AI on the internet, but still no precise formal definition. That was, and
remains, my main problem with FAI. If you believe that the only safe
AI is one that has been mathematically proven to be 100% safe, then
you will need a 100% watertight formal mathematical definition of what
this means. Until I see such a definition, I'm not convinced that FAI is
really going anywhere.
Shane
This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/[EMAIL PROTECTED]
