If some subset of humanity builds a general artificial intelligence, and that
intelligence takes over, or leaves, I don't see why gut biomes or ISIS matter.
Nor do I see why wonkiness (w.r.t. Glen's last e-mail) must occur within a
(sub)population of cybernetic or genetically engineered super-intelligent
humans that separates itself from (or controls) a legacy human population --
whether for biological or sociological reasons.  Sure, it could occur.  Why
must it occur?  (Here I am assuming that `wonky' isn't just a word with a
purposely ambiguous meaning, but is meant to suggest some sort of systemic
pathology.)

-----Original Message-----
From: Friam [mailto:friam-boun...@redfish.com] On Behalf Of Carl
Sent: Friday, June 10, 2016 10:59 AM
To: friam@redfish.com
Subject: Re: [FRIAM] Fascinating article on how AI is driving change in SEO, 
categories of AI and the Law of Accelerating Returns

I was thinking of symbionts in terms of mitochondria, gut biomes, HERVs,
etc.  I'm also increasingly fond of 1G, so if I am to give that up, it
doesn't seem to me that some long-term fractional G is going to be worth it.

You are of course familiar with Golgafrincham?

On 6/10/16 9:23 AM, glen ☣ wrote:
> On 06/09/2016 08:26 PM, Carl wrote:
>> One might do well to remember that we are symbionts (a Good Thing), so, 
>> transcendence for whom or what?
> Excellent question!  It's pretty easy to trash faith in various contexts.  I 
> do my best to hunt it down and eradicate it in my own world view.  But one 
> article of faith I'm having a hard time killing is that if _we_ go anywhere 
> (including across some abstract singularity as well as to Mars), we'll _all_ 
> have to go, or at least some kernel of us with a chance of growing into a 
> robust ecosystem.
>
> One of the better senses of the concept of "machine" comes down (basically)
> to this: a machine is that which can be adequately sliced out of its
> environment.  Life cannot be so sliced out ... or at least I have yet to
> eliminate my faith in our systemic/social nature.  We are a film, a lumpy,
> gooey, sticky mess.
>
>> On 6/9/16 6:50 PM, Steven A Smith wrote:
>>> The question, I suppose, that I feel is in the air is whether we are
>>> accelerating toward an extinction event of our own making, and whether
>>> backing off on the accelerator will help reduce the chances of it being
>>> total, or whether, as with the source domain of the metaphor, backing off
>>> too fast will actually *cause* a spinout.  Or perhaps the best strategy
>>> is to punch on through?  Kurzweil is voting for "pedal to the metal"
>>> (achieve transhuman transcendence in time for him to, erh... transcend
>>> personally?), and I suppose I'm suggesting "back off on the pedal gently
>>> but with strong intent," with some vague loyalty and identity with
>>> "humans as we are"...
> You already know I agree with you.  But it helps to repeat it.  The "pedal
> to the metal" guys sound the same (to me) as climate change deniers.  There
> are two types: 1) people who believe the universe is open enough, extensible
> enough, adaptive enough to accommodate our "pedal to the metal" and settle
> into a (beneficial to us) stability afterwards, and 2) those who think we
> (or the coming Robot Overlords) will be smart enough to intentionally
> regulate stability.
>
> It's not fear that suggests an agile foot.  It's open-minded speculation
> across all the possibilities.  But the metaphor falls apart.  It's not that
> we're out-driving our headlights so much as that we're barely stable bubbles
> of chemicals.  And it only takes a slight change in, say, medium pH to burst
> all of us bubbles ... like wiping your finger on your face and sticking it
> into the head on your beer ... add a little skin oil and it all comes
> crashing down.
>
>>> so who am I to argue with the end of an individual life, culture or species?
> Hear, hear.  Besides, death is a process.  And it may well feel good:
>
>    http://www.nature.com/scitable/blog/brain-metrics/could_a_final_surge_in


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
============================================================
