Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
"Now we will build you an endlessly upward world ..." https://youtu.be/F7P2ViCRObs (written from the POV of an AI, if that's even possible). Speaking of robot overlords, after listening to this I'm starting to think that trade agreements are less about trade than about big data.

C

On 6/10/16 3:21 PM, Marcus Daniels wrote:
> s/white guys playing basketball/scientists without engineers around/
> http://www.thewrap.com/snoop-dogg-explains-the-hizzistory-of-bizzasketball-to-jimmy-kimmel-viewers-video/

FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
s/white guys playing basketball/scientists without engineers around/

http://www.thewrap.com/snoop-dogg-explains-the-hizzistory-of-bizzasketball-to-jimmy-kimmel-viewers-video/
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
On 06/10/2016 11:22 AM, Marcus Daniels wrote:
> Bah. I'll see your "You kids get off my lawn" and raise you a "Save it, just keep it off my wave" ... In particular, David Brooks can save it.

That's kinda how I feel when I go to museums. My postmodernist homunculi start thrashing around demanding to know why I'm looking at all this useless and meaningless stuff ... drives my nihilist homunculi crrraaazy.

--
☣ glen
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
`` I have no idea, which is why I called it "faith" and hand-waved toward the inadequate closures of our current machines. ''

Bah. I'll see your "You kids get off my lawn" and raise you a "Save it, just keep it off my wave" ... In particular, David Brooks can save it.

Marcus
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
Heh, I'd forgotten about Golgafrincham. It's funny because it's true!

The problem lies with the permeable and dynamic boundaries of all these things. And "symbiont" captures the fuzziness of the boundaries quite well. As we've argued till we're blue, _general_ intelligence may well be illusory. It's possible (if not likely) that the only general intelligence we can build will be just as symbiotic with the milieu as we are. Maybe the AI won't rely directly on gut microbes. Maybe it will rely on some other huge population of nanomachines that requires an entire earth to maintain ... perhaps the robot overlords will need promechanic pills to keep their gut nanomachines in healthy proportions. I have no idea, which is why I called it "faith" and hand-waved toward the inadequate closures of our current machines.

Yes, I used "wonky" in order to prevent my email text from ballooning out of control. But "pathology" has (almost) a worse type of ambiguity to it, because it implies an assumed state of health or normality that wonky doesn't imply. It's fine to adapt to wonky things if one is adaptable enough, like learning to ride a backwards brain bicycle: http://www.instructables.com/id/Reverse-steering-bike/. Pathology is almost universally considered bad.

On 06/10/2016 10:12 AM, Marcus Daniels wrote:
> If some subset of humanity builds a general artificial intelligence, and that intelligence takes over, or leaves, I don't see why gut biomes or ISIS matter. Nor do I see why wonkiness (w.r.t. Glen's last e-mail) must occur within a (sub)population of cybernetic or genetically engineered super-intelligent humans that separate themselves from (or control) a legacy human population -- either for biological or sociological reasons. Sure, it could occur. Why must it occur? (Here I am assuming that `wonky' isn't just a word with a purposely ambiguous meaning, but is meant to suggest some sort of systemic pathology.)

-----Original Message-----
From: Friam [mailto:friam-boun...@redfish.com] On Behalf Of Carl
Sent: Friday, June 10, 2016 10:59 AM

> I was thinking of symbiont in terms of mitochondria, gut biomes, HERVs, etc. I'm also rather increasingly fond of 1G, so if I am to give that up, it doesn't seem to me that some long-term fractional G is going to be worth it. You are of course familiar with Golgafrincham?

--
☣ glen
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
If some subset of humanity builds a general artificial intelligence, and that intelligence takes over, or leaves, I don't see why gut biomes or ISIS matter. Nor do I see why wonkiness (w.r.t. Glen's last e-mail) must occur within a (sub)population of cybernetic or genetically engineered super-intelligent humans that separate themselves from (or control) a legacy human population -- either for biological or sociological reasons. Sure, it could occur. Why must it occur? (Here I am assuming that `wonky' isn't just a word with a purposely ambiguous meaning, but is meant to suggest some sort of systemic pathology.)

-----Original Message-----
From: Friam [mailto:friam-boun...@redfish.com] On Behalf Of Carl
Sent: Friday, June 10, 2016 10:59 AM
To: friam@redfish.com
Subject: Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns

I was thinking of symbiont in terms of mitochondria, gut biomes, HERVs, etc. I'm also rather increasingly fond of 1G, so if I am to give that up, it doesn't seem to me that some long-term fractional G is going to be worth it. You are of course familiar with Golgafrincham?
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
I was thinking of symbiont in terms of mitochondria, gut biomes, HERVs, etc. I'm also rather increasingly fond of 1G, so if I am to give that up, it doesn't seem to me that some long-term fractional G is going to be worth it.

You are of course familiar with Golgafrincham?
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
On 06/10/2016 09:05 AM, Marcus Daniels wrote:
> They wouldn't do a Mars One (one-way) trip. They are thriving in this environment. Only `weird' people would do that. There are other options for people that are willing to take risks. But in the Elysium case, yes.

That's a good point. But it gets a bit muddied when considering other forms of "leaving", like installing more memory in your head, cognitive-enhancing drugs, designer babies, etc. "Organic" food is similar. I suspect the Trumps, Thiels, etc. _will_ do everything they can to leave the rest of us behind, because they see us as parasitic parts of the "we". Even if some of them (Gates, Musk, Branson) have a more generous bent, their attention is limited in the same way everyone else's is. They simply won't spend the time required to understand, say, the role an oxy-addicted Instagram addict plays in the "we".

My main point with the machine-vs-life severability concept was that, in any of these types of "leaving", if we don't take the whole system, then it will go wonky. A great example of "taking all of us when we go" is ISIS. Social media has (I think) transformed us quite a bit. And we brought ISIS right along with us on the transition. The alt-right and neo-reactionaries are the same. What would otherwise be an obvious (small) collection of morons without social media has become part of the existential threat (in part, provided with a recruitment pathway to/through Trump): http://thinkprogress.org/politics/2016/06/09/3786370/students-for-trump-psu/

Like it or not, those jerks are part of us and we will take them with us as we evolve.

--
☣ glen
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
`` I'll not only consider them. I'll be in the front of the line ... as long as they let lower middle class morons like me in the line at all. I suspect it'll be packed with Trumps, Musks, Thiels, and Bransons. ''

They wouldn't do a Mars One (one-way) trip. They are thriving in this environment. Only `weird' people would do that. There are other options for people that are willing to take risks. But in the Elysium case, yes.

Marcus
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
On 06/10/2016 08:41 AM, Marcus Daniels wrote:
> That "we" entered into the discussion is arbitrary (Steve started with that, I think), and further the statement is tautological.

Heh, no, it's not tautological. It relies on the ambiguity of the word (perhaps concept) "we". You're right that it's technically fallacious. But the fallacy isn't that it's tautological. Fallacy can be used to good effect in the same way paradox can.

> For example, I'm quite confident I don't need the Trump or ISIS people in my life at all. I am not a willing symbiont. If there were other ways to live / other forms to take / other planets or non-terrestrial locations to inhabit, and life were longer than it is, I would certainly consider them.

I think we'll be surprised. I'll not only consider them. I'll be in the front of the line ... as long as they let lower middle class morons like me in the line at all. I suspect it'll be packed with Trumps, Musks, Thiels, and Bransons.

--
☣ glen
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
``But one article of faith I'm having a hard time killing is that if _we_ go anywhere (including across some abstract singularity as well as to Mars), we'll _all_ have to go, or at least some kernel of us with a chance of growing into a robust ecosystem.''

That "we" entered into the discussion is arbitrary (Steve started with that, I think), and further the statement is tautological. For example, I'm quite confident I don't need the Trump or ISIS people in my life at all. I am not a willing symbiont. If there were other ways to live / other forms to take / other planets or non-terrestrial locations to inhabit, and life were longer than it is, I would certainly consider them.

Marcus
Re: [FRIAM] Fascinating article on how AI is driving change in SEO, categories of AI and the Law of Accelerating Returns
On 06/09/2016 08:26 PM, Carl wrote:
> One might do well to remember that we are symbionts (a Good Thing), so, transcendence for who or what?

Excellent question! It's pretty easy to trash faith in various contexts. I do my best to hunt it down and eradicate it in my own world view. But one article of faith I'm having a hard time killing is that if _we_ go anywhere (including across some abstract singularity as well as to Mars), we'll _all_ have to go, or at least some kernel of us with a chance of growing into a robust ecosystem.

One of the better senses of the concept of "machine" comes down (basically) to this: a machine is that which can be adequately sliced out of its environment. Life cannot be so sliced out ... or at least I have yet to eliminate my faith in our systemic/social nature. We are a film: a lumpy, gooey, sticky mess.

> On 6/9/16 6:50 PM, Steven A Smith wrote:
>> The question I suppose, that I feel is in the air, is whether we are accelerating toward an extinction event of our own making, and whether backing off on the accelerator will help reduce the chances of it being total, or if, as with the source domain of the metaphor, backing off too fast will actually *cause* a spinout? Or perhaps the best strategy is to punch on through? Kurzweil is voting for "pedal to the metal" (achieve transhuman transcendence in time for him to erh... transcend personally?) and I suppose I'm suggesting "back off on the pedal gently but with strong intent", with some vague loyalty and identity with "humans as we are"...

You already know I agree with you. But it helps to repeat it. The "pedal to the metal" guys sound the same (to me) as climate change deniers. There are two types: 1) people who believe the universe is open enough, extensible enough, adaptive enough, to accommodate our "pedal to the metal" and settle into a (beneficial to us) stability afterwards, and 2) those who think we (or the coming Robot Overlords) will be smart enough to intentionally regulate stability.

It's not fear that suggests an agile foot. It's open-minded speculation across all the possibilities. But the metaphor falls apart. It's not out-driving our headlights so much as barely stable bubbles of chemicals, which is what we are. And it only takes a slight change in, say, medium pH to burst all of us bubbles ... like wiping your finger on your face and sticking it into the head on your beer ... add a little skin oil and it all comes crashing down.

>> so who am I to argue with the end of an individual life, culture or species?

Hear, hear. Besides, death is a process. And it may well feel good:

http://www.nature.com/scitable/blog/brain-metrics/could_a_final_surge_in

--
☣ glen