On 3/26/07, Richard Loosemore <[EMAIL PROTECTED]> wrote:
Kaj,
My first thoughts are that my reasons for thinking it will happen are so
extremely local, whereas the ones you cite are global.
What I mean is that you list a number of very general reasons why it should happen, but
in each of your cases I see a conflict between the optimism of your
summary and the reality on the ground. This makes me very nervous.
Richard,
what you're saying does sound worrisome - and now that I think of it,
it does sound plausible. After all, scientists do need to spend a lot
of time and effort to master even one field of study - trying to gain
the deep sort of understanding that must be required to successfully
_apply_ complicated concepts from one field to another may very well
be beyond the capabilities of many. Currently I'm only a first-year
CogSci student, but I'm already seeing part of this in my own studies
- I'd like to pick *at least* three minors (psychology, computer
science and maths - philosophy and neurology on top of those would be
nice, too) plus maybe even do a double major at another university to
gain a comprehensive understanding of the field and issues involved -
and the system is pretty much built with the assumption that we'd pick
exactly two minors. Sigh.
Ah well. Still, the things I mentioned in my essay will probably
help out those few who are brave enough to do true multidisciplinary
work - even if they are very few in number...
Then again, it's probably only good if development towards true AI is
slow. With any luck, a slow take-off won't be as hazardous and
unpredictable as a hard one...
For example, you cite the interdisciplinary nature of this field. Well,
in fact, I am looking at it from the inside (having crossed disciplines,
and having been hanging out with people in the AI/CogSci field ever
since I graduated), and what I see is *lip-service* interdisciplinary work
which is always a shotgun marriage at best. I see little fragments of
ideas going across, but always with distortion and simplification, and
almost always in such a way that the idea is *appropriated* rather than
used. To be blunt about it, people take some phrase x-y-z from the
field across the fence, find an excuse to say that they are doing x-y-z
by incorporating some shadow of the real x-y-z, and then they get cool
points from all the folks in their own field, who don't really know what
x-y-z is, but are impressed by the sound of it.
I exaggerate slightly, but you get the general idea. To an outsider this
might sound like cynicism on my part -- people think, hey these are
scientists, right? They wouldn't be that crummy, surely? -- but the
horrible, horrible truth is that this really is the way that things
happen. You would not believe the extent to which science these days is
a matter of personal spin and marketing. And this is especially
pronounced in the case of "interdisciplinary" interactions.
In cognitive psychology, for example, interaction with AI folks used to
be a big thing 30 years ago. Then it gradually died out. Today, it
hardly happens at all, except with the rump of the old-school AI folks.
The same story can be applied separately to each of the fields you talk
about. Brain imaging in particular.
But now, on a more positive note, I think that none of what you say will
make any difference BUT some progress will come out of left field, and
that in the fullness of time it will turn out that the one thing that
made the singularity happen was a single set of discoveries of a
theoretical or practical nature.
I happen to believe that I am working in precisely that direction, but I
have to try to keep my self-confidence in check for fear of giving the
wrong impression. But it doesn't have to be me that gets it to work, it
could be you, or someone else that none of us has heard of yet.
Summary: I take everything that the big guns are doing with a pinch of
salt, but I have enormous faith in the creativity of the girls and boys
out there who are trying to think outside the box. They are the ones
who are going to make it happen.
Richard Loosemore
-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=11983