Sorry, the new edition of the book I mentioned (I read the old one) is
called "Principles of Neural Science".

With regard to computing power, I think it is very important.  The average
person doing research in AI (i.e. a PhD grad student) doesn't have access
to much more than a PC or perhaps a small cluster of PCs.  So it's all
very well that IBM can build supercomputers with vast amounts of power,
but most of us don't get access to such machines; we're many orders
of magnitude behind them.

The other thing is that the computing power an algorithm needs to solve
a problem is very different from what was needed to develop the algorithm
in the first place.  To develop a machine learning algorithm you might want
to test it on 10 different data sets, with various parameter settings
and a few different versions of the algorithm, and then run it many times
in each of these configurations in order to get accurate performance
statistics.  Then you look at the results, come up with some new ideas,
and repeat.  Thus, even if algorithm Y is a decent AGI when run on
hardware X, you probably want something like 100 times the power of X
in order to develop algorithm Y, as the rough arithmetic below shows.
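To make that arithmetic concrete, here is a back-of-envelope sketch in
Python.  The figure of 10 data sets comes from the paragraph above; the
other counts are purely illustrative assumptions, not numbers anyone
has measured.

    # Rough size of the experiment grid for one design-and-test cycle.
    # Only 'datasets' comes from the text; the rest are assumed values.
    datasets = 10   # different data sets to test on
    settings = 5    # parameter settings tried per data set (assumed)
    versions = 3    # variants of the algorithm (assumed)
    repeats  = 10   # runs per configuration for stable statistics (assumed)

    runs_per_cycle = datasets * settings * versions * repeats
    print(runs_per_cycle)   # 1500 runs for a single cycle

Even with these modest assumed numbers, one design cycle costs on the
order of a thousand individual runs, and development involves many such
cycles, which is roughly where a factor of 100 over the deployment
hardware comes from.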

Shane
