Is long term survival of intelligent life an AGI-hard problem?

What do you think?

IMHO that question is a very deep one ... there are a lot of
technology-relevance-timeframe biases to consider:

If technologies with existential-risk properties (nanobots, black hole
generators, etc.) are themselves AGI-hard problems, or if every civilization
inevitably arrives at AGI before arriving at said technologies, then AGI
might obsolete most of the existential risks stemming from those
technologies. AGI would make those technologies a possibility but at the
same time ensure a certain level of collective intelligence, rationality
and empathy (mindplex?) so that those technologies would not represent
existential risks. I.e. there would be no cultural "lag" between a change
in technology and a change in intelligence/rationality/knowledge/consciousness.

The existence of thermonuclear weapons, on the other hand, probably already
tells us that at least some of the more primitive existential-risk
technologies are not AGI-hard.

But what about overcoming fear, competition, asymmetries, ego and
capitalism? It seems that we are trapped within a very narrow range of
intelligence and rationality as a species, and given the cyclic nature of
our socioeconomic framework it also seems unlikely that we will obsolete
this paradigm anytime soon. Levels of nationalism, racism, unnecessary
systemic poverty and fear are increasing once again as capitalism (or rather
our monetary system) collapses yet again ... as if we do indeed fail to
learn from history and adapt, or fail to do so fast and thoroughly enough.

The question "Is overcoming fear, competition, ego and capitalism an
AGI-hard problem?" is of course just a "subset" of the original question.

If long-term survival actually is an AGI-hard problem, then the question
becomes: how soon does a technological civilization need to arrive at AGI
technology in order to stand a chance?

Any thoughts?

regards,

-- ch



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
