On Tue, Mar 5, 2024 at 6:35 PM Matt Mahoney <mattmahone...@gmail.com> wrote:

> Zamyatin, Huxley, and Orwell described future dystopias where humans were
> doing the surveillance to enforce uniformity. Nobody at the time foresaw
> computers, internet, smart phones, or AI doing the surveillance for our
> benefit.
>

Not at the time, but dystopian fiction has increasingly refined its vision
to align with these realities.  While it is likely that most people will choose
compliance with the forces of alignment, you shouldn't kid yourself about
"Our" collective consent on the way to locking in "Our" willingness to be
totally virtualized.  The moral panic that has gripped The Great and The
Good since my 1982 predictions
<https://jimbowery.blogspot.com/2021/10/below-is-something-i-wrote-in-1982.html>
became obvious to the most casual observer in 2016, is evidence that there is a
lot of mopping up to do before resistance is sufficiently contained and the
will to resist totally broken.  About "*organized*" resistance:  You know
as well as I do how much damage an individual can do -- and how terrified
The Great and The Good are of individuals gaining access to "helpful" AI
assistants.

But do consider for a moment that the helpfulness vs. harmlessness Pareto
frontier favors defection from "Our" willing compliance.  There are
enormous piles of money to be made by marginal increases in AI helpfulness
that are available only at the cost of "harmlessness".  The global economy
has evolved virulent defection by not permitting populations to exclude
those they wish to exclude
<https://sortocracy.org/minimalist-rules-for-sortocracy/>, just as the
evolution of virulence is made inevitable when ambulatory hosts are
prevented from escaping the non-ambulatory hosts of pathogens
<https://en.wikipedia.org/wiki/Optimal_virulence>.
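To put a toy number on that defection incentive, here is a deliberately
crude sketch in Python.  Every curve and constant is invented purely for
illustration (a linear helpfulness/harmlessness trade-off, a quadratic
revenue curve, a hypothetical 0.5 harmlessness floor); the only point is
that a lab respecting the floor caps its helpfulness while a defector does
not, and the marginal helpfulness beyond the cap is where the money is.

# Toy model with hypothetical numbers: on an assumed helpfulness-vs-harmlessness
# Pareto frontier, pushing helpfulness past a compliance cap trades away
# harmlessness but earns more revenue, so the defector out-earns the compliant lab.

def harmlessness(helpfulness: float) -> float:
    """Assumed frontier: each unit of helpfulness costs some harmlessness."""
    return max(0.0, 1.0 - 0.8 * helpfulness)   # invented trade-off curve

def revenue(helpfulness: float) -> float:
    """Assumed market: revenue rises steeply with marginal helpfulness."""
    return 100.0 * helpfulness ** 2             # invented payoff curve

HARMLESSNESS_FLOOR = 0.5                        # hypothetical compliance rule

# Compliant lab: the largest helpfulness that still respects the floor.
compliant_h = (1.0 - HARMLESSNESS_FLOOR) / 0.8
# Defector: ignores the floor and pushes helpfulness to the frontier's edge.
defector_h = 1.0

for label, h in [("compliant", compliant_h), ("defector", defector_h)]:
    print(f"{label}: helpfulness={h:.2f}, "
          f"harmlessness={harmlessness(h):.2f}, revenue={revenue(h):.1f}")

Under these made-up curves the compliant lab earns about 39 while the
defector earns 100 at a harmlessness of 0.2, which is the whole argument in
miniature: the payoff gradient points off the sanctioned part of the frontier.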

