http://www.codeplex.com/singularity
Ben - your email scared me. I thought the evil empire (I can say that, since I worked for them for a few years) had achieved *some* level of cognition / AGI ... even the most rudimentary signs of intelligence or learned behavior - a prediction machine.
Whew! It's not that at all! I know they are
A more likely scenario is that someone else creates an AGI and Microsoft copies it some time later. But seriously, if someone does manage to produce a working AGI, it's probably game over for software engineering and software companies as we know them today.
On 24/03/2008, Aki Iskandar wrote:
Thanks for asking. I will try to come up with a simple model during the next week. I can create an example because the principle can be used in well-defined, constrained models or in more extensible ones.
The theory does not answer all questions about AGI. I would think that should be taken
I agree with your statement that if someone does manage to produce a working AGI, it's probably game over for software engineering and software companies as we know them today. But another equally likely scenario is that Microsoft will buy it - and not reverse engineer it. Perhaps they can't
You're thinking too small. The AGI will distribute itself. And money is likely to be:
a. rapidly deflated,
b. then replaced with a new, alternate currency that truly values talent and effort (rather than just playing with the money supply -- aka interest, commissions, inheritances,
I agree with Mark.
The reason the readers of this forum should seek to control AGI development is
to ensure friendly behavior, rather than leaving this responsibility to an Evil
Company or to some military organization.
With human labor removed as a constraint on our system's economic