On Aug 23, 5:58 pm, meekerdb <meeke...@verizon.net> wrote:
> On 8/23/2011 2:13 PM, Craig Weinberg wrote:
>
> > The basic difference is the ability to feel. Literally proving it
> > would require a brain implant that remotes to the device, but I would
> > be very impressed if a machine could convincingly answer personal
> > questions like 'what do you want', or 'what's bothering you', and if it
> > could continue to converse fluently about those answers and reveal a
> > coherent personality which was not preconfigured in the software.
>
> "Not preconfigured in software" sounds like an escape clause. Your use
> of speech was preconfigured in the software of your brain. All infants
> learn to speak the language they hear - and if they don't hear any, they
> make one up.
Right. Making one up = not preconfigured. If a machine can make up a coherent identity for itself, with its own point of view and without any templates to choose from, then I would be impressed. Note that infants who make up their own language don't wind up with a mix of French, Chinese, and Braille. Let a machine tell me what it wants or how it feels without a programmer telling it how it might answer.

Craig

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.