Samantha Atkins wrote:
Charles D Hixson wrote:
Stathis Papaioannou wrote:
Available computing power doesn't yet match that of the human brain,
but I see your point, software (in general) isn't getting better
nearly as quickly as hardware is getting better.
Well, not at the personally accessible level. I understand that
there are now several complexes that approximate the processing power
of a human brain...of course, they aren't being used for AI development.
Where? I have not heard of any such.
I believe that IBM has one such: the successor to Blue Gene. I think
this is the one that they intend to use to model protein folding in a
proteome. Also I believe that the DOD World Simulator has such a
computer complex. (That one I'm less certain of, but since it's
supposed to be doing crude modeling down to small statistical groups of
individuals...I don't see how it could have any less and still do what
it's supposed to do, i.e. predict popular trends by modeling individual
behavior.) Another possibly comparable complex is the weather
department's world weather model...but I somehow doubt that it
qualifies. It's only supposed to get down to the 100 square kilometer
level in its simulations (i.e., each 100 square kilometers is
represented by one simulator node). That's getting up there, but
probably not quite close enough. (I *said* that none of them were used
for AI :-) )
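For a rough sense of scale, a back-of-envelope count of nodes at that resolution (assuming one node per 100 square kilometers and the commonly cited ~510 million km² for Earth's surface area; both figures are assumptions, not from the original post):

```python
# Back-of-envelope node count for a weather model that assigns
# one simulator node per 100 square kilometers of Earth's surface.
# Both figures below are rough assumptions for illustration.
EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of Earth
KM2_PER_NODE = 100               # resolution claimed in the post

nodes = EARTH_SURFACE_KM2 // KM2_PER_NODE
print(nodes)  # 5100000 -- about five million nodes at this resolution
```

Five million nodes is a lot of parallelism, but each node is doing fairly simple fluid-dynamics bookkeeping, which is why a grid that coarse may still fall short of brain-scale computation.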
I'm also willing to admit that I may be quite off in the amount of
processing that a brain does. I tend to think of brains as being much
simpler than many people do. It's my suspicion that a tremendous amount
of what brains do is housekeeping that other implementations wouldn't
need rather than calculation that's needed for any implementation.
Evolution is a great strategy, but it has its problems, and one of the
problems is that if it's got a working system, it never does a
clean-room reimplementation. So we combine breathing and drinking in
the same tube, and several other basic engineering design defects. The
work-arounds are quite elegant, but that doesn't mean that they aren't
sub-optimal. In this case what I think happened was that neural-net
control systems got the first-mover advantage, and nothing else could
ever get good enough fast enough to surpass them. But they've got their
defects. Dying when oxygen is removed for even a short time, for
example. Another is LOTS of maintenance. (Admittedly this isn't all
inherent in neural nets, but merely in the particular implementation
that evolved...but that's what we're dealing with when we calculate how
much thinking a brain does.)
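The "how much thinking a brain does" question is usually attacked with a back-of-envelope multiplication. A sketch using commonly quoted (and much-debated) order-of-magnitude figures, none of which appear in the original post:

```python
# Crude upper-bound estimate of raw "synaptic operations" per second.
# All three figures are rough, commonly quoted assumptions, not facts;
# if much of this is housekeeping, as argued above, the number needed
# for an alternative implementation could be far lower.
NEURONS = 1e11             # ~100 billion neurons
SYNAPSES_PER_NEURON = 1e4  # ~10,000 synapses per neuron
MAX_FIRING_RATE_HZ = 100   # order-of-magnitude peak firing rate

ops_per_second = NEURONS * SYNAPSES_PER_NEURON * MAX_FIRING_RATE_HZ
print(f"{ops_per_second:.0e}")  # prints 1e+17
```

Estimates in the literature range over several orders of magnitude depending on what counts as an "operation," which is exactly why two people can disagree about whether today's largest complexes match a brain.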
OTOH, if things continue along the same curve (and Vista appears to be
pushing the trend), then it won't be too long. A question I wonder
about is "what are the power requirements of one of these?", but I don't
think that answers at that fine a level are predictable.
Surely you jest. Vista sucks so bad that Microsoft is having to eat
crow and make it easy to escape back to Windows XP. There is
nothing, not a damn thing, revolutionary at the software level in Vista.
- samantha
It's because it's so bad that Vista is pushing the hardware envelope.
You need twice as much hardware to get the machine to even be bootable.
More than that is required for usability. (I may have the precise
numbers wrong...I haven't tried using it and am going by lurid published
reports.)
-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&user_secret=7d7fb4d8