2008/10/14 Terren Suydam <[EMAIL PROTECTED]>:
>
> --- On Tue, 10/14/08, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>> An AI that is twice as smart as a
>> human can make no more progress than 2 humans.
>
> Spoken like someone who has never worked with engineers. A genius engineer
> can outproduce 20 ordinary engineers in the same timeframe.
>
> Do you really believe the relationship between intelligence and output is
> linear?
I'm going to use this post as a place to grind one of my axes; apologies, Terren.

The relationship between processing power and results is not necessarily linear, or even positively correlated. An increase in intelligence above a certain level requires increased processing power (or perhaps not? does anyone disagree?). When the cost of adding more computational power outweighs the money or energy you gain from adding it, there is not much point in adding it, unless you are in competition with other agents that could otherwise outsmart you.

Some of the traditional views of RSI neglect this and assume that increased intelligence is always a useful thing. It is not. There is a reason why much of the planet's biomass has stayed as bacteria: it does perfectly well like that. It survives. Too much processing power is a bad thing; it leaves less for self-preservation and affecting the world. Balancing them is a tricky proposition indeed.

Will Pearson

-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/