Thanks for the details, Mathieu, very much appreciated!

-----------------------------------------------
Ahmidou Lyazidi
Director | TD | CG artist
http://vimeo.com/ahmidou/videos
http://www.cappuccino-films.com


2013/7/3 Greg Punchatz <g...@janimation.com>

> Sounds brilliant. I need to see the movie now.
>
>
>
> Sent from my iPhone
>
> On Jul 2, 2013, at 11:19 AM, Mathieu Leclaire <mlecl...@hybride.com>
> wrote:
>
> > Hi guys,
> >
> > I just wanted to share some information on the shots we did for White
> House Down.
> >
> > First off, there's an article in fxguide that explains a bit of what we
> did:
> >
> >
> http://www.fxguide.com/featured/action-beats-6-scenes-from-white-house-down/
> >
> >
> > And here are some more details about how we did it:
> >
> > We built upon our ICE-based City Generator that we created for Spy Kids
> 4. In SK4, all the buildings were basically a bunch of instances (windows,
> walls, doors, etc.) put together using Softimage ICE logic to build very
> generic buildings. ICE was also used to create the streetscape, populate
> the city with props (lamp posts, traffic lights, garbage cans, bus stops,
> etc.), and distribute static trees and car traffic. Everything was
> instanced, so memory consumption was very low and render times were minimal
> (20-30 minutes a frame in Mental Ray at the time). The city in Spy Kids 4
> was very generic and the cameras were very high up in the sky, so we didn't
> care as much about having a lot of detail and interaction at the ground
> level, and we didn't really need specific or recognizable buildings either.
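Instance-based assembly of that kind can be sketched roughly as follows; the component names and layout logic here are purely illustrative, not the actual ICE setup:

```python
# Sketch: a generic building is just a grid of references to a handful of
# master meshes (instances), so memory scales with the masters, not the copies.
MASTERS = {"window": 1, "wall": 1, "door": 1}  # one real mesh each

def generate_building(floors, windows_per_floor):
    """Lay out instance placements for one generic building facade."""
    instances = []
    for floor in range(floors):
        for col in range(windows_per_floor):
            # Ground floor gets a door in the first slot, windows elsewhere.
            kind = "door" if floor == 0 and col == 0 else "window"
            instances.append({"master": kind, "floor": floor, "col": col})
    return instances

building = generate_building(floors=10, windows_per_floor=8)
# 80 placements, but still only the few master meshes held in memory.
```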
> >
> > The challenge in White House Down was that it was set in Washington, and
> we needed to show very specific landmarks, so the city had to be a lot
> less generic. The action also happens very close to the ground, so we
> needed a lot more detail at the ground level, and there needed to be a lot
> of interaction with the helicopters passing by.
> >
> > So we modeled a lot more specific assets to add more variation (very
> specific buildings and recognizable landmarks, more props, more vegetation,
> more cars, etc.). We updated our building generator to allow more
> customization, and we updated our props and cars distribution systems. They
> were all still ICE-based instances, but we added a lot more controls to
> allow our users to easily manage such complex scenes. We had a system to
> automate the texturing of cars and props based on rules, so we could
> texture thousands of assets very quickly. Everything was also converted to
> Stand-Ins to keep our working scenes very light and leave the heavy lifting
> to the renderer.
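A rule-based texturing system of the kind described might look roughly like this; the rule format, patterns, and texture names are hypothetical, not Hybride's actual implementation:

```python
import fnmatch
import random

# Hypothetical rules: the first pattern that matches an asset's name
# decides which texture set the asset is randomly assigned from.
RULES = [
    ("car_*",     ["car_red.tga", "car_blue.tga", "car_silver.tga"]),
    ("lamp_*",    ["metal_green.tga", "metal_black.tga"]),
    ("garbage_*", ["metal_rusty.tga", "plastic_grey.tga"]),
    ("*",         ["default_grey.tga"]),  # fallback rule
]

def assign_texture(asset_name, seed=0):
    """Pick a texture for an asset, deterministic per name within a run."""
    rng = random.Random(hash((asset_name, seed)))
    for pattern, textures in RULES:
        if fnmatch.fnmatch(asset_name, pattern):
            return rng.choice(textures)

assets = ["car_001", "car_002", "lamp_017", "tree_003"]
textures = {a: assign_texture(a) for a in assets}
```

Seeding on the asset name keeps assignments stable across re-runs of the distribution, which matters when thousands of assets are re-generated per shot.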
> >
> > Which brings me to Arnold.
> >
> > We knew the trick to making these shots as realistic as possible would
> be to add as much detail as we possibly could. Arnold is very good at
> handling a lot of geometry and we were all very impressed by how much
> Arnold could chew (we were managing somewhere around 500-600 million
> polygons at a time), but it still wasn't going to be enough, so we built a
> deep image compositing pipeline for this project that allowed us to add
> much more detail to the shots.
> >
> > Every asset was built in both low and high resolution. We basically
> loaded whatever elements we were rendering in a layer in high resolution,
> while the rest of the scene assets were all low resolution and only
> visible through secondary rays (so they cast reflections, shadows, GI,
> etc.). We could then combine all the layers through deep compositing and
> extract whatever layer we desired without worrying about generating the
> proper hold-out mattes at render time (which would have been impossible to
> manage at that level of detail).
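The per-layer setup can be sketched roughly like this; the asset names and visibility flags are illustrative, not the actual pipeline API:

```python
# Sketch of the layer setup: in each render layer, the hero assets are
# loaded high-res and visible to camera (primary) rays, while every other
# asset is loaded low-res and visible only to secondary rays (reflections,
# shadows, GI), so it still contributes to lighting at a fraction of the
# memory cost.
def build_layer(hero_assets, all_assets):
    setup = {}
    for asset in all_assets:
        if asset in hero_assets:
            setup[asset] = {"resolution": "high", "primary_visibility": True}
        else:
            setup[asset] = {"resolution": "low", "primary_visibility": False}
    return setup

all_assets = ["white_house", "helicopter", "trees", "crowd"]
hero_layer = build_layer(["helicopter"], all_assets)
```

Each layer is rendered with deep output, so merging layers later recovers correct occlusion without per-layer hold-out mattes.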
> >
> > In one shot, we calculated that once all the layers were merged
> together using our deep image pipeline, it added up to just over 4.2
> billion polygons... though that number is not quite exact, since we always
> loaded all assets as low-res in memory except for the visible elements that
> were being rendered in high resolution. We had a lot of low-res geometry
> that was repeated in many layers, so the exact number is slightly lower
> than the 4.2 billion polygons reported, but still... we ended up managing a
> lot of data for that show.
> >
> > Render times were also very reasonable, varying from 20 minutes to 2-3
> hours per frame rendered at 3K. Once we added up all the layers in a shot,
> it came to somewhere between 10-12 hours per frame.
> >
> > We started out using Nuke to manipulate our deep images, but we ended up
> creating an in-house custom standalone application using Creation Platform
> from Fabric Engine to accelerate the deep image manipulations. What took
> hours to manage in Nuke could now be done in minutes, and we could also
> exploit our entire render farm to extract the desired layers when needed.
> >
> > Finally, the last layer of complexity came from the interaction between
> the helicopters and the environment. We simulated and baked rotor wash wind
> fields of air being pushed by those animated Black Hawks using Exocortex
> Slipstream. That wind field was then used to simulate dust, debris, tree
> deformations and crowd cloth simulations. Since the trees needed to be
> simulated, we created a custom ICE strand-based tree system to deform the
> branches and simulate the leaves' movement from that wind field. Since the
> trees were all strand-based, they were very light to manage and render.
> We had also created a custom ICE-based crowd system for the movie Jappeloup
> that was updated for this project. We bought a couple of Kinects to do
> in-house motion capture and built a list of animation cycles of agents
> reacting to helicopters flying over their heads. We then procedurally
> analyzed the movement of the helicopters per shot, randomly selected an
> animation cycle from the list, and timed the cycle so that the agents would
> react when the helicopters passed within a specified distance of them.
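The timing logic described (pick a random reaction cycle and start it as the helicopter comes within range) could be sketched like this; the distance threshold, cycle names, and lengths are made up for illustration:

```python
import math
import random

def first_flyover_frame(agent_pos, heli_path, max_dist=30.0):
    """Return the first frame at which the helicopter passes within
    max_dist of the agent, or None if it never gets that close."""
    for frame, heli_pos in enumerate(heli_path):
        if math.dist(agent_pos, heli_pos) <= max_dist:
            return frame
    return None

# Hypothetical library of mocapped reaction cycles (name, length in frames).
CYCLES = [("duck", 40), ("look_up", 25), ("run_away", 60)]

def schedule_reaction(agent_pos, heli_path, rng):
    """Pick a random cycle and offset it so the reaction lands mid-flyover."""
    frame = first_flyover_frame(agent_pos, heli_path)
    if frame is None:
        return None  # helicopter never gets close to this agent
    name, length = rng.choice(CYCLES)
    return {"cycle": name, "start": max(0, frame - length // 2)}

heli_path = [(x, 20.0, 0.0) for x in range(0, 200, 5)]  # straight low pass
plan = schedule_reaction((50.0, 0.0, 0.0), heli_path, random.Random(1))
```

Running this per agent against each shot's baked helicopter path would give every crowd member its own cycle and trigger time.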
> >
> >
> > These were definitely the most complex and technically challenging
> shots we have ever managed here at Hybride. It was achievable thanks to the
> evolution of a lot of in-house tools we had created over the years and,
> obviously, thanks to a lot of very talented and hard-working people over a
> very short span of 3-4 months.
> >
> >
> > Mathieu Leclaire
> > Head of R&D
> > Hybride Technologies, a Ubisoft division
> >
> >
>
>
