Jiri Jelinek wrote:

OK, seriously, what's the best possible future for mankind you can imagine?
In other words, where do we want our cool AGIs to get us? I mean
ultimately: what is it, in the end, as far as you can see?

That's a very personal question, don't you think?

Even the parts I'm willing to answer have long answers. It doesn't involve my turning into a black box with no outputs, though. Nor ceasing to act, nor ceasing to plan, nor ceasing to steer my own future through my own understanding of it. Nor being kept as a pet. I'd sooner be transported into a randomly selected anime.

--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
