So IMHO if you want to sell AGI to investors, you'd better start with
replacing travel agents, brokers, receptionists, personal assistants, etc.,
rather than researchers.
I'm sorry, but this makes no sense at all: this is a complete negation of
what "AGI" means.
Actually... sorry, Richard... but why does it matter what AGI
means? You are trying to sell a product for money. Why do you insist on
attempting to sell someone something that they don't want just because *you*
believe that it's better than what they do want? Why not just sell them
what they want (since they get it for free with what you want) and be happy
that they're willing to fund you?
If you could build a (completely safe, I am assuming) system that could
think in *every* way as powerfully as a human being, what would you teach
it to become:
1) A travel agent.
2) A medical researcher.
3) An expert in AGI system design.
4) All of the above. But I'd just market it as a travel agent to the people
who want a travel agent, and as a medical researcher to the drug companies
(the AGI expert would have it figured out but would have no spare cash :-).
To say to an investor that AGI would be useful because we could use it
to build travel agents and receptionists is to utter something completely
incoherent.
Not at all. It is catering to their desires and refraining from forcibly
educating them. Where is the harm? It's certainly better than getting the
door slammed in your face.
This is the "Everything Just The Same, But With Robots" fallacy.
No, it's not, because you're not saying that everything is going to be the
same. All you're saying is that travel agents *can* be replaced, without
insisting on pointing out that *EVERYTHING* is likely to be replaced.