On 5/15/07, Shane Legg <[EMAIL PROTECTED]> wrote:
Hmmm. Ok, imagine that you have two optimization algorithms X and Y and they both solve some problem equally well. The difference is that Y uses twice as many resources as X to do it. As I understand your notion of intelligence, X would be considered more intelligent than Y. True?
This situation won't happen in a system designed according to my notion of intelligence, since such a system normally won't solve the problem using the same algorithm every time. Instead, each instance will be handled in a case-by-case manner, and the resource expense will be a variable, not a constant, over the occurrences of the problem. However, in general I do think that, other things being equal, the system that uses fewer resources is more intelligent.
Essentially, then, according to you, intelligence depends on how well a system can perform per unit of resources consumed?
I wouldn't put it that way, because resource cost is just one of many factors that determine the intelligence of a system. Other factors include the complexity of input/output signals, the depth of processing, etc. I'm not ready to suggest a complete list.
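The trade-off in the exchange above ("perform per unit of resources consumed") can be made concrete with a toy example. This is not from the thread itself; the function and numbers are hypothetical, just one possible way to score two optimizers that solve a problem equally well while one uses twice the resources of the other:

```python
# Toy sketch (hypothetical metric, not from the discussion): score an
# optimizer by solution quality divided by resources consumed.

def efficiency(solution_quality: float, resources_used: float) -> float:
    """One possible 'performance per unit of resources' measure."""
    return solution_quality / resources_used

# X and Y reach the same quality; Y uses twice the resources of X.
x_score = efficiency(solution_quality=1.0, resources_used=10.0)  # optimizer X
y_score = efficiency(solution_quality=1.0, resources_used=20.0)  # optimizer Y

assert x_score == 2 * y_score  # under this metric, X scores twice as high
```

Of course, as Pei notes below this is at most one factor among several, not a complete definition.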
> Beside input/output of the system, you assume the rewards to be
> maximized come from the environment in a numerical form, which is an
> assumption not widely accepted outside the reinforcement learning
> community. For example, NARS may interpret certain input as reward,
> and certain other input as punishment, but that depends on many
> factors in the system, and is not objective at all. For this kind of
> system (I'm sure NARS isn't the only one), how can your evaluation
> framework be applied?

NARS can...

- accept a number as input?
- be instructed to try to maximise this input?
- interact with its environment in order to try to do this?

I assume NARS is able to do all of these things.
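The checklist above amounts to the standard reinforcement-learning agent interface: accept a numeric reward, interact with an environment, and try to maximise the reward received. A minimal sketch of such an interface (all class and method names here are hypothetical, not part of NARS or any framework mentioned in the thread):

```python
# Minimal sketch of a reward-maximising agent interface (hypothetical
# names): the agent picks actions, receives a numeric reward for each,
# and keeps a running estimate of each action's average reward.

import random

class Agent:
    def __init__(self, actions):
        self.actions = list(actions)
        self.value = {a: 0.0 for a in self.actions}  # average reward seen
        self.count = {a: 0 for a in self.actions}

    def act(self):
        # Epsilon-greedy: occasionally explore, otherwise exploit the
        # action with the highest estimated reward.
        if random.random() < 0.1:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.value[a])

    def observe(self, action, reward):
        # Accept a number as input and update the running average
        # reward estimate for the action that produced it.
        self.count[action] += 1
        n = self.count[action]
        self.value[action] += (reward - self.value[action]) / n
```

The point of Pei's reply that follows is precisely that NARS could in principle be wrapped in such a loop, but is not designed around it: whether an input counts as "reward" is an internal, system-dependent interpretation rather than an objective signal from the environment.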
Though NARS has the potential to work in the environment you specified, it is not designed to maximize a reward measurement given by the environment.

Pei

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936