Ed,

>But I guess I am too much of a product of my upbringing and education to want only bliss. I like to create things and ideas.
I assume that's because it provides pleasure you are unable to get in other ways. But there are other ways, and if those were easier for you, you would prefer them over the ones you currently prefer.

>And besides, the notion of machines that could be trusted to run the world for us while we seek to surf the endless rush and do nothing to help support our own existence or that of the machines we would depend upon, strikes me as nothing more than wishful thinking.

A number of scenarios were labeled "wishful thinking" in the past, and science later got us there.

>The biggest truism about altruism is that it has never been the dominant motivation in any system that has ever had it, and there is no reason to believe that it could continue to be in machines for any historically long period of time. Survival of the fittest applies to machines as well as biological life forms.

a) Systems correctly designed to be altruistic are altruistic.
b) Systems correctly designed not to self-change in a particular way don't self-change in that way.
c) Both a) and b) hold true unless something [external] breaks the system.
d) *Many* independent and sophisticated safety mechanisms can be utilized to mitigate the risks related to c).

>If bliss without intelligence is the goal of the machines you imagine running the world, for the cost of supporting one human they could probably keep at least 100 mice in equal bliss, so if they were driven to maximize bliss why wouldn't they kill all the grooving humans and replace them with grooving mice? It would provide one hell of a lot more bliss bang for the resource buck.

As an extension of our intelligence, they will be required to stick with our value system.

Regards,
Jiri Jelinek

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=60898198-756d29