Hi Phillip,

Unless you're willing to present a great deal of evidence in relation to ET
or time machines, it's probably best you not mention them, for, I hope,
obvious reasons.

> Isolate states exporting anarchy or not attempting to participate in
> globalized workforce. Begin measuring purchasing parity adjusted annual cost
> to provide a Guaranteed Annual Income (GAI) in various nations.


Anarchy technically means "no rulers," though by anarchy you may mean
"chaos." Socially speaking, you might consider it a lack of, or
unwillingness toward, collaboration and communication, rather than something
hinting at incomprehensibility. Bryan, it would seem we're kindred spirits
on this subject. ;)

A basic income would encourage people to work in more self-motivated ways,
rather than under coerced social pressure to "get a job" "because you have
to" "to pay the bills." Given the labor deskilling caused by productive
development overall, the unfolding of AGI will be another can of worms to
trump all previous cans filled with worms, no doubt. A GAI, or even more
cleverly rearranged as "GIA," would hint at the "socialist" underpinnings of
basic income proposals.

> 3) Brainstorming of industries required to maximize longevity, and to handle
> technologies and wield social systems essential for safely transitioning
> first to a medical/health, then to a leisure society.


By longevity do you mean the continuation of distributed resources? Industry
usually entails methods of profit-making, none of which will occur once
abundant resources sink markets that were once proprietary in nature. I
formed an organization to tackle your #3. It's certainly a topic likely to
see more discussion as labor economies tank. For now, we're just a bunch of
cukes involved in the so-called technological unemployment debate.

> 8) Mature medical ethics needed. Mature medical AI safeguards needed.
> Education in all medical AI-relevant sectors. Begin measuring AI medical R+D
> advances vs. human researcher medical R+D advances.


Why the emphasis on health? Running a fever are you? ;P

> Potentially powerful occupations and consumer goods will require increased
> surveillance. Brainstorming metrics to determine the most responsible
> handlers of a #13 technology (I suggest something like the CDI Index as a
> ranking).


I think it would be better to build safeguards into produced items that
prevent surveillance in the first place. CDI analysis will eventually
transition from monetary figures to resource-allocation statistics,
measuring in the realm of physics rather than finance, because money will
gradually lose relevance.

> To maintain security for some applications it may be necessary to engineer
> entire cities from scratch. Sensors should be designed to maximize human
> privacy rights. There is a heightened risk of WWIII from this period on
> until just after the technology is developed.


The progress made in the open source community strongly suggests that open
and freely available tools, if applied to all resource domains, would
prevent potential global warfare. Dinosaur economies fall back on the
warfare business when money doesn't circulate, and that lack of circulation
is often due to greed. Hoarding and scarcity incite war. Abundance fosters
peace. Of course, there isn't much money in either abundance or peace, so
let us hope the ride isn't too rocky before abundant society is the norm.
Perhaps abundant thinking will catch on and spread much as the internet did,
deemed by default a worthwhile universal endeavor. Resistance to abundance,
and adherence to measures that insist on scarcity-based economies, will only
cause more bloodshed as more productive technologies develop. Information
wants to be free.

> Safe AGI/AI software programs would be needed before desired humane
> applications should be used. Need mature sciences of psychology and
> psychiatry to assist the benevolent administration of this technology. Basic
> Human Rights, goods and services should be administered to all where
> tyrannical regimes don't possess military parity.


Yes. AGI could help distribute resources in a fairer, less disputable way
and reduce harm to a minimum. If AGI is introduced it will collapse scarce
economies, all of them over time, allocating resources and spatial
boundaries in an abundant or more preferred manner.

> I suggest at this point invoking a Utilitarian system like Mark Walker's
> "Angelic Hierarchy"


Preference Utilitarianism is a good one. Social hierarchy, whether angelic
or otherwise, poses problems. It fuels one-upmanship, creates classes,
dispenses rewards and punishments, and breeds environments that compete for
arbitrary reasons. There are of course many tastes. Maybe some folks will
prefer this angelic hierarchy business. Like anything, it's not for
everyone. AGI could help match different personalities and establish places
for a variety of political behaviors.

Nathan



-------------------------------------------
singularity
Archives: http://www.listbox.com/member/archive/11983/=now
RSS Feed: http://www.listbox.com/member/archive/rss/11983/
Powered by Listbox: http://www.listbox.com