BlizzCon 2016 Overwatch Animated Shorts Panel Transcript

FX Team

BILL: Whether it is transforming a character out of a cloud of smoke, unearthing a robot buried in dirt and plants, or creating dragons from the tip of an arrow, the effects team gets to create some of the magical elements that help bring our characters to life.

When our job starts, we see something like this, and we help create the elements to turn it into something that looks like this. Because of the magical nature of the effects elements for the dragons, it was really important that we worked closely with the animators, as the movement and the speed of the dragons would have a big impact on what the effects would look like.

When we started development on the dragon effects, we started simple. One of our senior artists, Shuntarō Furukawa, helped us out here, and we did a test of attaching geometric curves to the body of the dragon and rendering those curves to look like blue smoke.

This is a first-pass, very rough animation of the dragon, just to give us an idea of how this technique would work. We started dialing in how much of the motion from the animation of the dragons would influence the shape of the curves.

You see two different versions here. We might choose something that’s kind of in the middle, and while we were happy with the results of this, it did have a few drawbacks. The smoke tends to travel along with the dragon, and we knew we needed something to live in the space around the dragon.

So taking what we had here, we use these curves to emit particles. We run them through a pretty lightweight simulation, and we get a little bit more complexity, a little bit more of that magical feel.
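To make that idea concrete, here is a minimal, self-contained sketch of "emit particles from a curve, then run a cheap pass over them" in plain Python/NumPy. It is not the studio's actual setup; the helix curve, emission count, and drag/lift values are invented for illustration.

```python
# Illustrative sketch only: spawn particles along a curve and move them with a
# very cheap integration step (no fluid solve). All parameters are made up.
import numpy as np

rng = np.random.default_rng(0)

def sample_curve(points, n):
    """Pick n evenly spaced samples along a polyline defined by `points`."""
    t = np.linspace(0.0, 1.0, n)
    idx = t * (len(points) - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, len(points) - 1)
    frac = (idx - lo)[:, None]
    return points[lo] * (1 - frac) + points[hi] * frac

# A toy stand-in for a "dragon body" curve: a helix.
s = np.linspace(0, 4 * np.pi, 200)
curve = np.stack([np.cos(s), np.sin(s), s * 0.2], axis=1)

# Emit particles near the curve with a small random kick.
emit_pos = sample_curve(curve, 2000)
pos = emit_pos + rng.normal(scale=0.02, size=emit_pos.shape)
vel = rng.normal(scale=0.1, size=pos.shape)

# Lightweight "simulation": drag plus a gentle upward drift, no collisions.
dt, drag, lift = 1.0 / 24.0, 0.92, np.array([0.0, 0.0, 0.05])
for _ in range(48):          # two seconds at 24 fps
    vel = vel * drag + lift * dt
    pos = pos + vel * dt

print(pos.shape)             # (2000, 3) particle positions to hand to rendering
```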

In addition to those two elements, we also create a kind of magical dust element, and it all gets combined to make up the dragon body effects.

In addition to the body effects, we were also responsible for creating the lightning, the volumetric portal cloud, and the internal volumetric passes for the dragon bodies.

All those elements were rendered out and handed down the pipe to the lighting and compositing team, and they put the shot together for us. One of the things about the effects department at Blizzard is that we really love to create awesome, jaw-dropping, complex effects, and one of the biggest challenges in doing these Overwatch episodes was: how do we maintain that high standard and that complexity, but do it in a shorter amount of time?

So for example, we knew that we had Reaper in several episodes, and his smoke effects can be pretty complex. Typically, when we create smoke, we do that using a 3D volume that is run through a simulation (oftentimes using fluid dynamics) to give us a realistic result. While that is a great method, it can be computationally very expensive, and it can take a long time to iterate.

We knew that we were not going to have the time to do that on all the shots, so we had to look for an alternative. Senior software engineer Chris Allen helped us out and developed a procedural technique that we call Volume Fill. What Volume Fill does is allow you to take a geometric surface, a curve, or particles and turn it into a volume, and with the controls dialed in appropriately it can start to look like smoke.

Let me show you a simple example of a straight, static curve that is fed into Volume Fill.

We start adjusting some of the noise parameters, we can create a couple of copies of that curve, and then we start to animate some of those parameters, and we get something that starts to look like smoke. This does not involve any time-consuming simulations; no simulations at all, in fact.
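As a rough illustration of what a curve-to-volume step like this might involve (this is not Blizzard's Volume Fill tool; the grid resolution, falloff radius, and noise are all invented stand-ins), here is a small Python/NumPy sketch that splats a curve into a density grid and breaks it up with animated noise, with no simulation anywhere:

```python
# Illustrative sketch only: turn a curve into a density volume, then make it
# wispy by multiplying in animated noise instead of running a simulation.
import numpy as np

RES = 64                                   # voxels per axis (made up)
grid = np.zeros((RES, RES, RES))

# A simple curve through the unit cube.
t = np.linspace(0.0, 1.0, 400)
curve = np.stack([t, 0.5 + 0.2 * np.sin(6 * t), 0.5 + 0.2 * np.cos(6 * t)], axis=1)

# Splat curve samples into the voxel grid with a soft Gaussian radius.
xyz = np.indices((RES, RES, RES)).transpose(1, 2, 3, 0) / (RES - 1)
for p in curve[::4]:
    d2 = np.sum((xyz - p) ** 2, axis=-1)
    grid += np.exp(-d2 / (2 * 0.03 ** 2))

def cheap_noise(coords, time):
    """Stand-in for real 3D noise (Perlin/curl): a few layered sines."""
    x, y, z = coords[..., 0], coords[..., 1], coords[..., 2]
    return (np.sin(9 * x + time) * np.sin(7 * y - 0.5 * time) *
            np.sin(11 * z + 2 * time))

# Animate the noise over frames; the shape breaks up without any solve.
frames = []
for frame in range(24):
    time = frame / 24.0
    density = grid * (0.6 + 0.4 * cheap_noise(xyz, time * 4.0))
    frames.append(np.clip(density, 0.0, None))

print(frames[0].shape)   # (64, 64, 64) density volume per frame
```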

So to take this a step further, we can add a little bit more complexity to it, and we can apply this to curves and geometry, and we can also animate those curves and geometry before they go into Volume Fill.

Let me show you another simple example. We have a sphere that is animated. We build geometry based on the motion path of that sphere, attach curves to it, and add some dynamics to the curves. All of that is fed into Volume Fill, and we get this nice, wispy smoke trail.
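Again purely as an illustration (the motion and numbers here are made up, and the real setup builds actual geometry and dynamics), the inputs for a step like that can start from something as simple as recording the sphere's position each frame and jittering a few copies of that path into trail curves:

```python
# Illustrative sketch only: derive trail curves from an animated sphere's
# motion path, which could then feed a volume-fill style step like the one
# sketched earlier.
import numpy as np

frames = 48
t = np.arange(frames) / 24.0                       # seconds, at 24 fps

# Stand-in for the animated sphere: a swooping path through space.
sphere_pos = np.stack([t, 0.3 * np.sin(3 * t), 0.1 * t * t], axis=1)

# The motion path itself becomes the trail; jitter a few copies so that
# downstream dynamics and noise have something to break up.
rng = np.random.default_rng(4)
trail_curves = [sphere_pos + rng.normal(scale=0.02, size=sphere_pos.shape)
                for _ in range(5)]

print(len(trail_curves), trail_curves[0].shape)    # 5 curves, (48, 3) points each
```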

Using this technique, I was able to apply it to Reaper's smoke transformation effects. Here is a test that we did very early on, before we had any animation to work with, and we thought it was pretty successful.

So taking this, we were able to apply it to the shots that had animation. As funny as it may sound, we just asked the animator to animate Reaper flying across the scene and let effects replace him with a smoke trail.

So taking this animation and then applying that Volume Fill technique, we get something that looks like this.

It took us a little bit of time to dial in the settings and get the smoke looking nice and wispy, but once the director approved this and was happy with the look, we rendered out the element, handed it off to lighting and compositing, and they put the shot together.

In the final shot the effect looks like this. We were very pleased with the results. However, like I said, we knew a procedural technique like this would not work on all shots.

If you guys saw the episode this morning, Infiltration, there is a shot where Reaper jumps down from a ledge and transforms into smoke. A shot like this has a lot of character interaction with the smoke, and the smoke interacts with the environment, so we had to go with a simulation technique. This is what we call a low-resolution simulation using some fluid dynamics.

We are able to get a very nice overall shape to the smoke and some decent internal motion, but what we are lacking is detail. To get detail, we could start turning up the settings on the simulation and go to a higher resolution; however, that can be very time-consuming.

So we took an alternative approach. Taking this simulation, we extract the velocity field out of it, which carries all that nice internal motion, and then we run a much faster, simpler simulation using just particles, and we use that velocity field to guide the motion of the particles.

This is not something that we invented; it is a technique called particle advection. In this example that you see here, that might be a couple thousand particles. To get more detail, we simply add more particles to the simulation, run additional simulations, and add them all together in the end.
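For readers who want to see the core of particle advection spelled out, here is a small Python/NumPy sketch. A made-up swirl field stands in for the velocity field you would pull out of a real fluid simulation; the particle count and step count are arbitrary. The idea is just: sample the grid velocity at each particle with trilinear interpolation, then step the particles forward.

```python
# Illustrative sketch only: advect particles through a coarse velocity field.
import numpy as np

RES = 32
dx = 1.0 / RES

# Stand-in for the low-res sim's velocity field: a swirl around the z axis.
centers = (np.indices((RES, RES, RES)).transpose(1, 2, 3, 0) + 0.5) * dx
rel = centers - 0.5
vel_field = np.stack([-rel[..., 1], rel[..., 0],
                      0.2 * np.ones_like(rel[..., 0])], axis=-1)

def sample_velocity(p):
    """Trilinear interpolation of the velocity grid at particle positions p (N, 3)."""
    g = np.clip(p / dx - 0.5, 0, RES - 2)        # positions in grid space
    i0 = np.floor(g).astype(int)
    f = g - i0                                   # fractional offset within the cell
    v = np.zeros_like(p)
    for corner in range(8):
        off = np.array([(corner >> k) & 1 for k in range(3)])
        w = np.prod(np.where(off, f, 1.0 - f), axis=1)[:, None]
        idx = i0 + off
        v += w * vel_field[idx[:, 0], idx[:, 1], idx[:, 2]]
    return v

rng = np.random.default_rng(1)
particles = rng.uniform(0.3, 0.7, size=(100_000, 3))   # adding more = more detail

dt = 1.0 / 24.0
for _ in range(24):                                    # one second of advection
    particles += sample_velocity(particles) * dt
    particles = np.clip(particles, 0.0, 1.0)

print(particles.shape)
```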

So in the final shot, we simulated and rendered over 40 million particles for the plume of smoke; and then we were able to take our Volume Fill technique, attach it to Reaper, and get the smoke coming at the camera.

So while we were able to streamline and simplify a lot of the effects in the episodes and make them go quicker, some effects are just complex, and they require complex solutions. In The Last Bastion, when Bastion stands up for the first time, he is covered in dirt, moss, and plants, and this is something we knew we had to start developing pretty early on. It was going to be very difficult.

Senior effects artist Atsushi Ikarashi led the development on this, using Houdini's position-based dynamics solver, also called the grain solver. What this test shows us is not only how the grains of dirt interact with each other, but also how they can interact with another simulated object in the scene, such as the plant roots.
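The panel does not go into the math, but the general shape of a position-based dynamics grain solve is fairly compact. Here is a deliberately tiny, brute-force Python/NumPy sketch of the predict/project/update loop; real grain solvers add spatial hashing, friction, substepping, and much better constraint handling, and none of the numbers here come from the panel.

```python
# Illustrative sketch only: a toy position-based dynamics (PBD) grain step.
import numpy as np

rng = np.random.default_rng(2)
N, radius, dt = 300, 0.03, 1.0 / 24.0
gravity = np.array([0.0, 0.0, -9.8])

pos = rng.uniform(0.0, 0.5, size=(N, 3)) + np.array([0.0, 0.0, 0.5])
vel = np.zeros_like(pos)

for step in range(48):
    # 1) Predict new positions from current velocity plus gravity.
    pred = pos + vel * dt + gravity * dt * dt

    # 2) Project constraints: push overlapping grains apart, keep above floor.
    for _ in range(4):                              # a few Jacobi-style iterations
        diff = pred[:, None, :] - pred[None, :, :]  # (N, N, 3) pairwise offsets
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        overlap = np.clip(2 * radius - dist, 0.0, None)
        np.fill_diagonal(overlap, 0.0)
        push = (diff / dist[..., None]) * (0.5 * overlap)[..., None]
        pred += push.sum(axis=1)
        pred[:, 2] = np.maximum(pred[:, 2], radius)  # floor at z = 0

    # 3) Update velocities from how far the positions actually moved.
    vel = (pred - pos) / dt
    pos = pred

print(pos.min(axis=0), pos.max(axis=0))
```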

Seeing how promising this test was, we could introduce Bastion. So we take a first-pass animation of Bastion standing up, cover him in some dirt particles, run it through the simulation, and see what happens.

Obviously, there are some problems, some instabilities in the simulation that we need to work out, but as we zero in on one area, we are able to stabilize things, and we start adding complexity. The brown particles represent the simulated dirt, the green particles represent the moss, and then we have the plants attached to that.

One of the most difficult challenges we had to solve here comes from simulating different materials in the same simulation. Because they interact with each other, we had to know which particles needed to be moss and which needed to be dirt, get the moss to sit on top of the dirt, and make sure that as the dirt crumbles and falls away, the moss stays attached and follows it. As you can see, there is a lot of complexity here.

Once we were happy with the simulation, the data from the dirt and the moss was written out, fed back into the effects pipeline, and used to simulate the plants on Bastion. All in all, we simulated over 320,000 grains of dirt, and after that was run through a clustering technique to clump the dirt together into natural-looking dirt clumps, we outputted over 40 million particles that were then handed off to the lighting team for rendering.
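The clustering step is not described in detail in the panel, so the following Python/NumPy sketch only shows one generic way a clump-then-upsample pass could work: group grains by coarse voxel, take clump centroids, and scatter extra render-only particles around each clump. Every number and rule here is invented for illustration, not taken from the short.

```python
# Illustrative sketch only: group simulated grains into clumps by coarse voxel,
# then scatter extra render-only particles around each clump centroid.
import numpy as np

rng = np.random.default_rng(3)
grains = rng.uniform(0.0, 1.0, size=(320_000, 3))     # stand-in for simulated dirt grains

cell = 0.05                                           # clump size in scene units
keys = np.floor(grains / cell).astype(int)
_, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)

# Centroid of each clump.
centroids = np.zeros((counts.size, 3))
np.add.at(centroids, inverse, grains)
centroids /= counts[:, None]

# Upsample: more render particles for bigger clumps (arbitrary rule).
per_clump = np.minimum(counts * 4, 200)
render_pts = np.concatenate([
    c + rng.normal(scale=cell * 0.5, size=(n, 3))
    for c, n in zip(centroids, per_clump)
])

print(grains.shape[0], "grains ->", counts.size, "clumps ->",
      render_pts.shape[0], "render particles")
```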

So here is a look at the dirt effects in the final shots from The Last Bastion.

KEVIN: So now we have low-overhead simulation techniques and procedural effects workflows, all of which help keep the effects team's iteration time down so they can iterate faster; and as Corey mentioned, we have fast, flexible rig solutions for the animation team and a robust animation library for them to leverage, so they can focus on what they do best. But we still have the question of how we render all of this. To talk about what it took under the hood to make that happen, I'm going to invite Schimon Cohen back up.

Next: Technology
