Marcus, what a nifty idea! (http://tinyurl.com/ys388b) Most of
computing does not need to be exact; a slight "error" generally is
not terrible, and for imaging, audio, and so on it simply is not
observable by a human.
And there are lots of solutions for making inaccuracy less
observable. A few weeks ago, Steve was using NetLogo, a projector,
and a camera to build a camera coordinate system that lets the
computer "know" where a particular event occurs in the world.
To do this, a calibration step projects a series of horizontal and
vertical stripe patterns while the camera collects the data. Steve
used Gray coding:
http://en.wikipedia.org/wiki/Gray_code
to arrange the patterns so that adjacent codes differ in only one
bit. A misread at a stripe boundary then shifts the decoded position
by only 1 (2^0), instead of the 2^n jump a flipped high-order bit
can cause in plain binary.
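For the curious, here is a minimal sketch of that trick in Python.
The bit width and the projector/camera capture step are my own
assumptions (Steve's setup surely differs); only the Gray
encode/decode and the one-bit-difference property are the point:

def binary_to_gray(n: int) -> int:
    """Encode an integer as its reflected-binary Gray code."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Decode a Gray code back to a plain binary integer."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Hypothetically, each projector column i is encoded as
# binary_to_gray(i), one stripe image is projected per bit, and the
# camera thresholds each pixel to recover the bits. The payoff:
# adjacent columns differ in exactly one bit, so a boundary misread
# shifts the decoded column by at most 1.
if __name__ == "__main__":
    bits = 10
    for i in range(2 ** bits - 1):
        diff = binary_to_gray(i) ^ binary_to_gray(i + 1)
        assert bin(diff).count("1") == 1  # exactly one bit changes
    assert all(gray_to_binary(binary_to_gray(i)) == i
               for i in range(2 ** bits))  # round trip is lossless
    print("adjacent Gray codes differ by exactly one bit")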
I really like this sort of thinking. Letting computers be a bit fuzzy
in areas where slight errors can be managed, ideally with adaptive
algorithms to bound the error, seems very reasonable, especially where
the system gains benefits in other areas such as lower power
consumption and better random number generation.
Sweet!
-- Owen
On Apr 23, 2009, at 10:02 PM, Marcus G. Daniels wrote:
Possibly of interest..
http://www.cs.rice.edu/~kvp1
http://www.businessweek.com/technology/content/feb2005/tc2005024_2426_tc024.htm
http://www.technologyreview.com/read_article.aspx?ch=specialsections&sc=emerging08&id=20246
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
============================================================