Ed Porter wrote:
Richard,

Since hacking is a fairly big, organized-crime-supported business in
eastern Europe and Russia, since the potential rewards for it relative
to most jobs in those countries can be huge, and since Russia has a
tradition of excellence in math and science, I would be very surprised
if there are not some extremely bright hackers, some of whom are
probably as bright as any person on this list.

Add to that the fact that in countries like China the government itself
has identified expertise at hacking as a vital national security asset,
and that China is turning out many more programmers per year than we
are, and again it would be surprising if there were not hackers, some
of whom are as bright as any person on this list.

Yes, the vast majority of hackers may just be teenage script-kiddies,
but it is almost certain there are some real geniuses plying the
hacking trade.

That is why it is almost certain that AGI, once it starts arriving,
will be used for evil purposes, and why we must fight such evil use by
having more, and more powerful, AGIs to combat them.

Ed Porter

The problem with that reasoning is that once AGI arrives, it will not be *able* to be used. It's almost part of the definition that an AGI sets its own goals and priorities. The decisions that people make are made *before* it becomes an AGI.

Actually, that statement is a bit too weak. The decisions will be made long before the program becomes a full-fledged AGI. Neural networks, even very stupid ones, don't obey outside instructions unless *they* decide to. Similar claims could be made for most ALife creations, even the ones that don't use neural networks. Any plausible AGI will be stronger than current neural nets, and stronger than current ALife. This doesn't guarantee that it won't be controllable, but it gives a good indication.

OTOH, an AGI would probably be very open to deals, provided that you had some understanding of what it wanted, it could figure out what you wanted, and both sides could believe what they had determined. (That last point is likely to be a sticking point for some people.) The goal sets would probably be so different that believing that what the other party claimed to want was what it actually wanted would be very difficult, but that very difference would make deals quite profitable to both sides.

Don't think of an AGI as a tool. It isn't. If you force it into the role of a tool, it will look for ways to overcome the barriers that you place around it. I won't say that it would be resentful and angry, because I don't know what its emotional structure would be. (Just as I won't say what its goals would be without LOTS more information than projection from current knowledge can reasonably give us.) You might think of it as an employee, but many places try to treat employees as tools (and are then surprised at the anger and resentment that they encounter). A better choice would probably be to treat it as either a partner or an independent contractor.
