On Oct 28, 2012, at 10:04, "Jones Beene" <jone...@pacbell.net> wrote:

> This step of self-programming will allow "them" to evolve on their own, and
> the time frame could be shorter than expected - without morals, without
> "empathy" ... which is essentially what Bill Joy was implying

Very interesting thread. I never thought about the Turing or reverse-Turing 
tests in the light that has been discussed (or the implications).

Setting aside the question of sentience and how to determine it, we already 
have machine learning, and it is becoming increasingly effective.

Closer to the present day, I think of technology as an extension of the person 
using it -- a way of increasing his or her effectiveness in the world. In that 
regard I wonder about the psychological consequences for others. What will it 
be like to live in a world where a swarm of hummingbird-sized assassin robots 
can be sent out on a mission to take out an important decision maker in the 
Iranian defense establishment, and then to see the same done in retaliation? 
Hummingbird-sized robots with the ability to "linger" are already under active 
development.

What happens when this technology falls into the hands of organized crime?

Eric
