PS-- I am not denying that statistics is applied probability theory. :) When
I say they are different, what I mean is that saying "I'm going to use
probability theory" and "I'm going to use statistics" tend to indicate very
different approaches. Probability theory is a set of axioms, whereas
statistics is a set of methods. The probability theory camp tends to be
Bayesian, whereas the stats camp tends to be frequentist.
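
A toy illustration of the contrast (a made-up coin example, not anything from
the thread): estimate a coin's bias after seeing 7 heads in 10 flips. A
frequentist-style analysis hands back a point estimate; a Bayesian-style
analysis hands back a posterior distribution over the bias. A minimal sketch
in Python:

# Toy contrast between the two camps on the same data: 7 heads in 10 flips.
heads, flips = 7, 10

# Frequentist-flavored answer: a single point estimate (maximum likelihood).
mle = heads / flips                      # 0.7

# Bayesian-flavored answer: a posterior distribution over the bias.
# With a uniform Beta(1, 1) prior, the posterior is Beta(1 + heads, 1 + tails).
alpha = 1 + heads
beta = 1 + (flips - heads)
posterior_mean = alpha / (alpha + beta)  # 8 / 12 = 0.666...

print(mle, posterior_mean)

The numbers are beside the point; the difference is that one approach returns
a number while the other returns a distribution to keep reasoning with.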

Your complaint that probability theory doesn't try to figure out why it was
wrong in the 30% (or whatever) it misses is a common objection: probability
theory glosses over important detail, encourages lazy thinking, and so on.
However, this all depends on the space of hypotheses being examined.
Statistical methods will be prone to this objection because they are
essentially narrow-AI methods: they don't *try* to search in the space of
all hypotheses a human might consider. An AGI setup can and should have such
a large hypothesis space. Note that AIXI is typically formulated as using a
space of crisp (non-probabilistic) hypotheses, though probability theory is
used to reason about them. This means no theory it considers will gloss over
detail in this way: every theory completely explains the data. (I use AIXI
as a convenient example, not because I agree with it.)
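
To make the AIXI-style picture concrete, here is a minimal sketch (a toy in
Python, not AIXI itself; the hypotheses and prior weights are invented for the
example) of Bayesian updating over crisp, deterministic hypotheses: each
hypothesis either explains every observation exactly or gets ruled out, and
uncertainty lives only in the mixture weights over hypotheses.

# A toy sketch, not AIXI itself: Bayesian updating over crisp,
# deterministic hypotheses. Each hypothesis either fits the data exactly
# or is ruled out; probability only enters through the mixture weights.

# Illustrative hypotheses over binary sequences (made up for this example).
def all_zeros(t):
    return 0

def alternating(t):
    return t % 2

def all_ones(t):
    return 1

# Prior weights standing in for a simplicity prior (e.g. 2**-description_length).
prior = {all_zeros: 0.5, alternating: 0.25, all_ones: 0.25}

def update(hyps, observations):
    """Keep only hypotheses that explain every observation, then renormalize."""
    surviving = {h: w for h, w in hyps.items()
                 if all(h(t) == obs for t, obs in enumerate(observations))}
    total = sum(surviving.values())
    return {h: w / total for h, w in surviving.items()}

def predict_one(hyps, t):
    """Mixture probability that the bit at time t is 1."""
    return sum(w for h, w in hyps.items() if h(t) == 1)

posterior = update(prior, [0, 1, 0])   # rules out all_zeros and all_ones
print(predict_one(posterior, 3))       # alternating survives, so this prints 1.0

Nothing inside a surviving hypothesis is glossed over; the only thing
probability does here is weigh which complete explanation to bet on.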

--Abram

On Mon, Jul 12, 2010 at 2:42 PM, Abram Demski <abramdem...@gmail.com> wrote:

> David,
>
> I tend to think of probability theory and statistics as different things.
> I'd agree that statistics is not enough for AGI, but in contrast I think
> probability theory is a pretty good foundation. Bayesianism to me provides a
> sound way of integrating the elegance/utility tradeoff of explanation-based
> reasoning into the basic fabric of the uncertainty calculus. Others advocate
> different sorts of uncertainty than probabilities, but so far what I've seen
> indicates more a lack of ability to apply probability theory than a need for
> a new type of uncertainty. What other methods do you favor for dealing with
> these things?
>
> --Abram
>
>
> On Sun, Jul 11, 2010 at 12:30 PM, David Jones <davidher...@gmail.com> wrote:
>
>> Thanks Abram,
>>
>> I know that probability is one approach, but there are many problems with
>> using it in actual implementations. I know a lot of people will be angered
>> by that statement and will retort with all the successes they have had
>> using probability. The truth, though, is that these problems can be solved
>> in many ways, and every way has its pros and cons. I personally believe
>> that probability has unacceptable cons if used all by itself; it should
>> only be used when it is the best tool for the task.
>>
>> I do plan to use some probability within my approach, but only when it
>> makes sense to do so. I do not believe in completely statistical solutions
>> or in completely Bayesian machine learning alone.
>>
>> A good example of when I might use it: a particular hypothesis may predict
>> something with only 70% accuracy, yet still be better than any other
>> hypothesis we can come up with so far, so we may use it. But the remaining
>> 30% of unexplained errors should then be explained, if at all possible,
>> with the resources and algorithms available. This is where my method
>> differs from statistical methods: I want to build algorithms that resolve
>> that 30% and explain it. For many problems, there are rules and knowledge
>> that will solve them effectively. Probability should only be used when you
>> cannot find a more accurate solution.
>>
>> Basically, we should use probability when we don't know the factors
>> involved, can't find any rules to explain the phenomena, or don't have the
>> time and resources to figure them out. In those cases you must simply
>> guess at the most probable event, without any rules for deciding which
>> event is more applicable under the current circumstances.
>>
>> So, in summary, probability definitely has its place. I just think that
>> explanatory reasoning and other more accurate methods should be preferred
>> whenever possible.
>>
>> Regarding learning the knowledge being the bigger problem, I completely
>> agree. That is why I think it is so important to develop machine learning
>> that can learn by direct observation of the environment. Without that, it is
>> practically impossible to gather the knowledge required for AGI-type
>> applications. We can learn this knowledge by analyzing the world
>> automatically and generally through video.
>>
>> My step-by-step approach for learning and then applying the knowledge for
>> AGI is as follows:
>> 1) Understand and learn about the environment (through computer vision for
>> now, and other sensory perception in the future)
>> 2) Learn about your own actions and how they affect the environment
>> 3) Learn about language and how it is associated with or related to the
>> environment
>> 4) Learn goals from language (such as through dedicated inputs)
>> 5) Goal pursuit
>> 6) Other miscellaneous capabilities as needed
>>
>> Dave
>>
>> On Sat, Jul 10, 2010 at 8:40 PM, Abram Demski <abramdem...@gmail.com> wrote:
>>
>>> David,
>>>
>>> Sorry for the slow response.
>>>
>>> I agree completely about expectations vs predictions, though I wouldn't
>>> use that terminology to make the distinction (since the two terms are
>>> near-synonyms in English, and I'm not aware of any technical definitions
>>> that are common in the literature). This is why I think probability theory
>>> is necessary: to formalize this idea of expectations.
>>>
>>> I also agree that it's good to utilize previous knowledge. However, I
>>> think existing AI research has tackled this over and over; learning that
>>> knowledge is the bigger problem.
>>>
>>> --Abram
>>>
>
>
>
> --
> Abram Demski
> http://lo-tho.blogspot.com/
> http://groups.google.com/group/one-logic
>



-- 
Abram Demski
http://lo-tho.blogspot.com/
http://groups.google.com/group/one-logic


