On 8/28/06, William Pearson <[EMAIL PROTECTED]> wrote:
Possibly I am not explaining things clearly enough. One of my
motivations for developing AI, apart from the challenge, is to enable
me to get the information I need, when I need it.

As a lot of the "power" I have in this world is through what I buy, I
need to have this information available when I might buy something,
which may be when I am in social situations etc. I can be a much more
ethical consumer with the details I need given to me at the right
time. As such I am interested in wearable and ubiquitous computing.
Due to the constraints wearable computers place upon the designer, you
really want the correct information given to the user and nothing else
that may distract them unnecessarily.

Ah, so you see this on a wearable... okay... that makes a bit more sense, and also makes sense of what you said earlier about computing power, since wearables are much more constrained in that regard than desktops.

Knowing what the correct information is will entail knowing about the
user and the user's current environment — whether they rate energy
efficiency or CO2 emissions as a priority, for example. It will also
entail the Google-like system you are focused upon.

I should clarify: I think competing with Google in the search market is a losing proposition; that market is already wrapped up. I'd look for new markets that nobody is serving well today. I use Google only as an example of a software system that needs a lot of knowledge and computing power, and is therefore run on a central rather than local basis.

You have hinted at the normative value of AI; I'm curious what you
find it to be. Is it simply to speed up technological development so
that we can escape the gravity well?

Break the boundaries of space and time that currently apply to human life. Specifically:

1) Escape the gravity well. Or more precisely, we can already do that, but we can't live anywhere other than Earth, because the number of tasks that need to be carried out to keep a person alive for a year vastly exceeds the number of things one person can do in a year. Cracking that complexity barrier needs qualitative technological advances.

2) Stop, or at least slow down, the loss of fifty-plus million lives per year. That's a matter both of developing the hardware tools to work proficiently at the molecular level (i.e. some form of nanotechnology) and, again, the software tools to handle the complexity.
