On Thursday, October 24, 2013 3:08:26 PM UTC-4, JohnM wrote:
>
> Craig and Telmo:
> Is "anticipation" involved at all? Deep Blue anticipated hundreds of steps
> in advance (and evaluated a potential outcome before accepting or
> rejecting).
> What else is involved in "thinking"? I would like to know, because I have
> no idea.
> John Mikes
It's hard to talk about the particulars of pseudo-sentience, since all of our language is geared toward the assumption of sentience. We haven't had time to develop terms to discern between map and territory when the territory is a trompe l'oeil illusion.

When we think, we are rehearsing or pretending to some extent. It is an act of imagination that is anticipatory. The etymology of "anticipate" traces back to a sense of "taking into possession beforehand." Did Deep Blue take anything into possession, or did it merely exhaust its ritual of mindless reductions - compressing a four-dimensional object of game permutations into a one-dimensional path which matches its mindless criteria? What a computer does would be thinking if it could care what it was thinking about, but since it is built from the outside in, it is incapable of caring about the games that we designed it to play. It isn't playing a game at all; it is filtering one abstract pattern against another without reference to 'before' or 'after'. It's not anticipating from its point of view, it's just rendering a set of positions which satisfy a rule.

I think that what complicates the story is that the power of human thought is in its distance from the feelings and sensations that it has evolved from. Think of the evolution of human experience as an artistic movement, one which has oscillated between realism, impressionism, cubism, and now finally abstract minimalism. Without the whole history of art behind it, the stark forms of minimalism seem simple and mechanical... and they are, in the absence of an appreciation of the whole story of art. Thinking is an art that acts like a science. Computation is a science which we can use to frame art. The danger is that we have overlooked what has led up to thinking and now mistake the frame for the canvas.
Thanks,
Craig

> On Thu, Oct 24, 2013 at 1:02 PM, Craig Weinberg <whats...@gmail.com> wrote:
>
>> On Thursday, October 24, 2013 12:43:49 PM UTC-4, telmo_menezes wrote:
>>
>>> On Thu, Oct 24, 2013 at 6:39 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>> > http://www.theatlantic.com/magazine/archive/2013/11/the-man-who-would-teach-machines-to-think/309529/
>>> >
>>> > The Man Who Would Teach Machines to Think
>>> >
>>> > "...Take Deep Blue, the IBM supercomputer that bested the chess grandmaster
>>> > Garry Kasparov. Deep Blue won by brute force. For each legal move it could
>>> > make at a given point in the game, it would consider its opponent's
>>> > responses, its own responses to those responses, and so on for six or more
>>> > steps down the line. With a fast evaluation function, it would calculate a
>>> > score for each possible position, and then make the move that led to the
>>> > best score. What allowed Deep Blue to beat the world's best humans was raw
>>> > computational power. It could evaluate up to 330 million positions a
>>> > second, while Kasparov could evaluate only a few dozen before having to
>>> > make a decision.
>>> >
>>> > Hofstadter wanted to ask: Why conquer a task if there's no insight to be
>>> > had from the victory? "Okay," he says, "Deep Blue plays very good chess—so
>>> > what? Does that tell you something about how we play chess? No. Does it
>>> > tell you about how Kasparov envisions, understands a chessboard?" A brand
>>> > of AI that didn't try to answer such questions—however impressive it might
>>> > have been—was, in Hofstadter's mind, a diversion. He distanced himself
>>> > from the field almost as soon as he became a part of it.
>>> > "To me, as a fledgling AI person," he says, "it was self-evident that I
>>> > did not want to get involved in that trickery. It was obvious: I don't
>>> > want to be involved in passing off some fancy program's behavior for
>>> > intelligence when I know that it has nothing to do with intelligence. And
>>> > I don't know why more people aren't that way..."
>>>
>>> I was just reading this too. I agree.
>>>
>>> > This is precisely my argument against John Clark's position.
>>> >
>>> > Another quote I will be stealing:
>>> >
>>> > "Airplanes don't flap their wings; why should computers think?"
>>>
>>> I think the intended meaning is closer to: "airplanes don't fly by
>>> flapping their wings, why should computers be intelligent by
>>> thinking?".
>>
>> It depends whether you want 'thinking' to imply awareness or not. I think
>> the point is that we should not assume that computation is in any way
>> 'thinking' (or intelligence for that matter). I think that 'thinking' is
>> not passive enough to describe computation. It is to say that a net is
>> 'fishing'. Computation is many nets within nets, devoid of intention or
>> perspective. It does the opposite of thinking; it is a method for
>> petrifying the measurable residue or reflection of thought.
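The brute-force search the article attributes to Deep Blue - enumerate the legal moves, recurse several steps down the line, score the resulting positions with a fast evaluation function, and pick the move leading to the best score - is the classic minimax algorithm. A minimal sketch of that loop, using a toy pick-a-number game rather than chess (the game, the function names, and the evaluation function are all illustrative assumptions, not Deep Blue's actual code):

```python
def legal_moves(position):
    """Toy game: a position is an integer; a legal move adds 1, 2, or 3."""
    return [1, 2, 3]

def apply_move(position, move):
    """Applying a move produces the successor position."""
    return position + move

def evaluate(position):
    """Fast static evaluation of a leaf position: closer to 7 is better
    for the maximizing player (an arbitrary stand-in for a chess score)."""
    return -abs(position - 7)

def minimax(position, depth, maximizing):
    """Score a position by searching `depth` moves ahead, assuming the
    opponent always replies with the move that is worst for us."""
    if depth == 0:
        return evaluate(position)
    scores = [minimax(apply_move(position, m), depth - 1, not maximizing)
              for m in legal_moves(position)]
    return max(scores) if maximizing else min(scores)

def best_move(position, depth):
    """Pick the move whose subtree has the best guaranteed score."""
    return max(legal_moves(position),
               key=lambda m: minimax(apply_move(position, m), depth - 1, False))
```

Deep Blue's real search added refinements (alpha-beta pruning, specialized evaluation hardware) that this sketch omits; the point is only the shape of the procedure - what Craig describes above as filtering one abstract pattern against another until a rule is satisfied.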
--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.