-Caveat Lector-

From The News & Observer,
http://www.news-observer.com/monday/business/Story/508038p-505358c.html
-
Published: Monday, June 18, 2001 3:43 a.m. EDT

Paul Gilster

Robots raise issue of what 'alive' is

If you're a science-fiction buff, you've heard of Isaac Asimov's Three Laws
of Robotics. They state: 1) A robot may not injure a human being, or,
through inaction, allow a human being to come to harm; 2) A robot must obey
the orders given it by human beings, except where such orders would conflict
with the First Law; and 3) A robot must protect its own existence, except
where such protection would conflict with the First or Second Law.

That's pretty clever stuff, and as a boy growing up on novels such as
Asimov's "The Caves of Steel" and Jack Williamson's "The Humanoids," I
always assumed that by 2000 or so, such laws would be needed. It seemed
obvious that technology would produce mechanical devices that could do the
heavy lifting, while we organic beings led a life of leisure. No more
drudgery for 21st-century man!

It never occurred to me that the first home robot would be a battery-powered
lawn mower. Robomower was designed by an Israeli company called Friendly
Robotics (http://www.friendlyrobotics.com). To use it, you lay a navigation
wire around the plot to be mowed and let the robot's sensors navigate as it
cuts and mulches your grass.
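
The article doesn't spell out Robomower's control logic, but boundary-wire
navigation of this sort is, at bottom, a sense-and-react loop. A minimal
sketch in Python (hypothetical names and thresholds, not Friendly Robotics'
actual code):

    # Hypothetical boundary-wire mowing loop; illustrative only.
    import random

    def read_wire_signal() -> float:
        """Stand-in for a sensor reading the buried perimeter wire (0.0 to 1.0)."""
        return random.random()

    def mow_step(wire_threshold: float = 0.8) -> str:
        """Choose one movement command from the current wire-sensor reading."""
        if read_wire_signal() > wire_threshold:
            # A strong signal means the mower has reached the wire: turn back inward.
            return "turn"
        return "forward"

    if __name__ == "__main__":
        print([mow_step() for _ in range(5)])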

Yet though Friendly Robotics has an energizing motto ("Mission: To Be the
Leader in Home Robotics"), it's clear that the $795 Robomower is a long way
from human-style awareness, much less independent action.

Much the same can be said about the industrial robots that spray parts, weld
joints and insert chips in manufacturing settings around the world. They can
do one thing only, and they have little flexibility in adapting to changing
circumstances.

But interesting things are happening on the robotics front. Consider a
company called Cybermotion (http://www.cybermotion.com). A glance at its Web
site reveals CyberGuard, a robotic security system that patrols warehouses,
factories or other industrial areas, up to 15 miles per night. CyberGuard's
sensors can identify smoke, industrial spills and the presence of intruders.

Down inside CyberGuard is a computer that uses dead reckoning and what the
company describes as "uncertainty modeling" and "fuzzy logic" to help it
learn and adapt. Approximately 100 instructions direct the robot and control
its subsystems, and the robot connects to a base station that monitors its
progress.
CyberGuard can be sent on pre-designed pathways or it can be set to operate
without human intervention, navigating around obstacles with ease.
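
Dead reckoning simply means estimating position by adding up known movements
from a starting point. A minimal sketch of the idea (illustrative only, not
Cybermotion's actual software):

    # Minimal dead-reckoning position estimate; illustrative only.
    import math
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float = 0.0        # metres east of the starting point
        y: float = 0.0        # metres north of the starting point
        heading: float = 0.0  # radians, 0 = facing east

    def update(pose: Pose, distance: float, turn: float) -> Pose:
        """Advance the pose by a measured heading change, then a wheel distance."""
        new_heading = pose.heading + turn
        return Pose(
            x=pose.x + distance * math.cos(new_heading),
            y=pose.y + distance * math.sin(new_heading),
            heading=new_heading,
        )

    if __name__ == "__main__":
        pose = Pose()
        # Drive 2 m east, turn 90 degrees left, drive 2 m again.
        for distance, turn in [(2.0, 0.0), (2.0, math.pi / 2)]:
            pose = update(pose, distance, turn)
        print(pose)  # roughly x=2, y=2, now facing north

Small errors in each step accumulate over a 15-mile patrol, which is
presumably where the "uncertainty modeling" comes in.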

CyberGuard is out there doing its stuff as we speak. In fact, a U.S. Army
performance assessment report in 1999 gave the security robot its highest
rating for quality and service. The Army has bought a number of them to keep
an eye on warehouses, and so have major pharmaceutical companies such as
GlaxoSmithKline, Bayer and Novartis.

Only in the most general sense can CyberGuard be described as "humanoid"; in
fact, it looks sort of like an overhead projector on wheels.

But ongoing work at MIT by the Humanoid Robotics Group is delving deeply
into the question of how to create a humanoid robot that not only operates
independently but also manages to have a social life of sorts with people.

One project is called Kismet, an experiment with facial expression and body
posture that would allow man/machine interactions to be more intuitive.
Kismet is basically a robotic head that uses vocalizations and facial
expressions driven by 21 motors to convey information. Kismet's face is
remarkably expressive, with "eyes" that swivel and "lips" that convey a wide
range of "feelings."

Another MIT project, called Cog, is an attempt to build a humanoid robot
that approximates the dynamics of the human body. Cog already has adaptive,
lifelike arms, and its hands are being developed now (no legs as yet, though
MIT has another group working on the specifics of leg articulation).
Ultimately, Cog is about blending hardware with artificial intelligence to
create a robot that learns not by instruction but by trial and error.
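
Trial-and-error learning, in the loosest sense, means trying variations and
keeping whatever earns better feedback. A toy illustration of that idea
(hypothetical reward function and parameters, unrelated to Cog's actual
software):

    # Toy trial-and-error learner (simple hill climbing); illustrative only.
    import random

    def try_action(angle: float) -> float:
        """Pretend reach attempt: reward is highest when the angle is near 0.3."""
        return -abs(angle - 0.3)

    def learn(trials: int = 200) -> float:
        best_angle = 0.0
        best_reward = try_action(best_angle)
        for _ in range(trials):
            candidate = best_angle + random.uniform(-0.1, 0.1)  # small random tweak
            reward = try_action(candidate)
            if reward > best_reward:  # keep whatever works better
                best_angle, best_reward = candidate, reward
        return best_angle

    if __name__ == "__main__":
        print(round(learn(), 2))  # converges near 0.3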

Run with a certain puckish humor by computer science professor Rodney A.
Brooks, the Cog project has its own answer to Asimov's laws.

As Brooks says on the project Web site: "The truth is that our lab focuses
on building robots that are as human as possible. Even if we were successful
in all of our goals (which is, in technical terms, 'not likely'), the robot
would have no 'super-human' abilities. It would be no more likely to take
over the world than, say, Pulitzer-Prize winning film critic Roger Ebert."

But in a more serious vein, Brooks puts his work in this light: "My burning
question is what is it that lets matter transcend itself to become living."
And though that has always been a philosophical question, it's now one that
enters the world of engineering in this fascinating work. You can read more
about MIT's Humanoid Robotics Group at
http://www.ai.mit.edu/projects/humanoid-robotics-group

And consider ongoing work to meld biology and robotics. The result is a
so-called "cyborg." At Northwestern University in Chicago, researchers have
merged the brain stem of a lamprey with a small robot, producing a
creature that uses biological inputs to drive robotic wheels, allowing it to
track a beam of light.

And Duke University's Miguel Nicolelis and a team of researchers have used
nerve signals from a monkey's brain to control a robotic arm. The work could
lead to machine assistance in controlling artificial arms or legs. In
real-world terms, such fusions may address physical conditions once thought
untreatable. In a broader philosophical realm, Brooks' question
about when matter becomes living is likely to heat up as this work
continues.

Paul A. Gilster can be reached at [EMAIL PROTECTED]
