With new chip technology, Hollywood digital effects are almost lifelike

By Steve Johnson
San Jose Mercury News

Jan 5, 2009

http://www.siliconvalley.com/portlet/article/html/fragments/print_article.jsp?articleId=11359212&siteId=573


With a big assist from Silicon Valley technology, a movie superstar like 
Angelina Jolie could keep starring as Lara Croft in "Tomb Raider" 
sequels — forever.

Aided by increasingly powerful microprocessors and incredibly 
sophisticated software, movie makers and video game developers are 
getting closer to achieving the holy grail of animation: creating 
computer-generated actors that are visually indistinguishable from real 
people. Consider it Hollywood's most special effect.

Experts say that could bring revolutionary changes for film lovers and 
game players. Stars could keep playing iconic roles even as they aged 
past the point of believability, like Jolie as Croft or Daniel Radcliffe 
as Harry Potter. At the same time, video games could look more realistic 
— in fact, more like movies themselves.

"Basically anything a person can dream up, we'll be able to create," 
said Mark Starkenburg, chief executive of Santa Monica-based Image 
Metrics, which recently made a remarkably lifelike computer-generated 
video of a soap-opera actress using the latest chips from Advanced Micro 
Devices of Sunnyvale.

The video isn't perfect, however, and experts say chips and animation 
software need to get much better before they can produce 
computer-generated actors that look identical to their human counterparts.

But "we may be getting to the tipping point," said Rick Bergman, general 
manager of AMD's graphics products group. "With what we're starting to 
deliver with our chips, the computing power is getting real close."

Over the past three decades, computer-produced graphics have created 
stunning visual effects in films such as "Alien," "Total Recall," 
"Jurassic Park," "Titanic" and "Lord of the Rings." Lately, increasing 
numbers — including all of the new Disney-Pixar collaborations, such as 
"Bolt" and the upcoming "Monsters vs. Aliens" — are shown in 3-D.

But while animators have been able to make astonishingly 
realistic-looking representations of buildings, trees and other objects, 
the complexity of the human face and the subtlety of its emotions have 
proven too difficult to replicate.

For now, producers have generally avoided even trying to make digital 
characters that look like actual people. And when they have, they have 
often blundered into what those in the industry call the "uncanny 
valley." That's where animated faces seem so devoid of normal human 
expressiveness that they appear zombielike, a problem critics say was 
especially apparent in the 2004 movie "The Polar Express," which 
starred a synthetic Tom Hanks.

To avoid that, movie producers sometimes get highly creative with camera 
angles and cropping, while video game makers often just show their 
characters from the back, said Scott Cronce, vice president of 
technology for Redwood City video game giant Electronic Arts.

But in creating its video of the television soap opera actress — Emily 
O'Brien, who has appeared in "The Young and the Restless" — Image 
Metrics found another solution. It used a device developed by the 
University of Southern California's Institute for Creative Technologies, 
which can digitally capture enormous amounts of visual detail about 
human actors, including their faces.

In a sign of how rapidly the technology is developing, Image Metrics 
created the video using AMD's latest graphics card, whose chips have 
twice the computing power of those available just a year ago, said AMD 
spokesman Dave Erskine.

As even more powerful chips are developed, some experts say, it will 
become possible to create full-length movies featuring large numbers of 
animated actors with even more lifelike characteristics.

Computer generation gives film producers enormous creative flexibility. 
Besides allowing them to change an actor's appearance — by digitally 
exaggerating their movements or facial expressions, for example — it 
also makes it easy to create elaborate animated environments, which can 
be a lot cheaper than having to fly an entire movie crew to exotic 
locations to shoot scenes.

Moreover, being able to create digital characters that are 
indistinguishable from real people would enable performers who have 
grown older, or have even died, to continue appearing in movies, said 
Jules Urbach, who has licensed USC's technology for use in his Burbank 
animation business, LightStage.

Urbach said an actor in his 30s — whom he declined to identify — 
recently asked him to capture the man's image with LightStage so the 
actor can star in future animated films without ever looking a day older 
than he does now.

If needed, Urbach added, the actor's words could be digitally generated 
years from now through computerized voice reconstruction, a technology 
that also is rapidly advancing.

While all this might enable actors to collect movie royalties well past 
their prime, some critics have decried giving animators so much creative 
control. Yet Mark Friedlander, national director of new media for the 
Screen Actors Guild, said his members shouldn't panic yet over what he 
called "blurring the line between the real and the synthetic."

"It certainly is something we're beginning to watch," he said. But he 
added, "I don't really see technology in any way replacing performers. I 
see it enhancing the possibility of storytelling."

HOW IT WAS DONE

In making animated movies of people, producers have typically used a 
method in which sensors are attached to the actor's body. The actor then 
performs a series of movements, which are recorded in a form that can be 
used to make computer-generated representations of the same motions.

But it"s hard to place a sensor on an actor"s eye or tongue. And on the 
parts of a face where a sensor can be applied, the devices have trouble 
picking up every flicker of sarcasm, anger or fear. So to moviegoers, 
the animated versions wind up looking so devoid of normal human emotion 
they appear ghoulish or demonic.

To get around that problem, Santa Monica-based Image Metrics used a 
spherical device created by a researcher at the University of Southern 
California"s Institute for Creative Technologies, which rapidly 
photographs a broad range of human movements in varying lighting 
conditions without sensors.

Using the method with soap-opera actress Emily O'Brien, Image Metrics 
was able to capture far more facial detail of O'Brien than would have 
been possible previously. It then used Advanced Micro Devices' latest 
computer chips to process the data into an animated video of her that 
looks remarkably realistic.

It can be viewed at 
www.amd.com/us-en/assets/content_type/DigitalMedia/46197A_13_EMILY_1.swf.
