The new tech that made Mufasa possible
Behind MPC's "QuadCap" motion capture, premium previs and 2,000-frame renders on the film.

Barry Jenkins' Mufasa: The Lion King is a fully CG film. However, it is intended to look as if it could have been filmed for real in Africa (just like its 2019 predecessor, The Lion King). To do that, the filmmakers employed an array of virtual production techniques such as VR scouting, virtual cinematography and real-time rendering to plan out sets and action and realize them in a manner that replicated a live-action feel (aided further by the final naturalistic animation and photoreal rendering).

One of the new virtual production techniques used on Mufasa was a tool called QuadCap, that is, a quadruped motion capture system. It formed part of the motion capture shoots for the film that took place in Downtown Los Angeles. Here, a number of the characters such as Mufasa, Sarabi and Taka were represented by performers in motion capture suits (with the resulting capture aimed at informing the final animation and helping with staging).

Normally this would produce only bipedal motion-captured animation, but QuadCap aligned the performer's head and spine movements to the lion's head and neck, their legs to the lion's front legs, and simulated the lion's back legs and hips. "It was great because it offered so much flexibility for Barry Jenkins," observes Audrey Ferrara, who was MPC's visual effects supervisor on the film, working with production visual effects supervisor Adam Valdez, animation supervisor Daniel Fotheringham and virtual production supervisor Ryan Champney. "On the stage, there would also be DOP James Laxton with a virtual camera, and he and Barry could immediately say, 'No, it needs to be a little bit punchier in terms of the movement,' or, 'Hold on there for a minute so we can really come close to you.' It was art directable live in Unreal Engine."
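MPC has not published QuadCap's internals, but the mapping described above (the performer's head and spine driving the lion's head and neck, the performer's legs driving the front legs, simulated hindquarters) can be sketched in a few lines of Python. Everything here is an illustrative assumption rather than MPC's implementation: the joint names, the pose layout, and the simple first-order lag standing in for the hind-leg simulation.

```python
# Hypothetical sketch of a biped-to-quadruped retargeting pass.
# Joint names, data layout and the lag model are invented for illustration;
# MPC's QuadCap is proprietary.
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class BipedPose:
    head: Vec3
    spine: Vec3
    hips: Vec3
    left_foot: Vec3
    right_foot: Vec3

@dataclass
class QuadPose:
    head: Vec3         # driven directly by the performer's head
    neck: Vec3         # driven by the performer's spine
    front_left: Vec3   # the performer's legs become the lion's front legs
    front_right: Vec3
    hind_left: Vec3    # hindquarters are simulated, not captured
    hind_right: Vec3

def lerp(a: Vec3, b: Vec3, t: float) -> Vec3:
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def retarget(frames: list[BipedPose], hind_lag: float = 0.35) -> list[QuadPose]:
    """Map captured biped frames onto a quadruped pose, frame by frame."""
    out: list[QuadPose] = []
    hind_l = hind_r = frames[0].hips
    for f in frames:
        # Stand-in for the hind-leg simulation: each hind leg chases the
        # performer's hips with a first-order lag, slightly out of step
        # with the other so the gait doesn't read as rigid.
        hind_l = lerp(hind_l, f.hips, hind_lag)
        hind_r = lerp(hind_r, f.hips, hind_lag * 0.8)
        out.append(QuadPose(
            head=f.head, neck=f.spine,
            front_left=f.left_foot, front_right=f.right_foot,
            hind_left=hind_l, hind_right=hind_r,
        ))
    return out
```

The appeal of a scheme along these lines is exactly what Ferrara highlights: because the quadruped pose is derived live from the suit data, the director can ask for a punchier move and see the lion respond on the very next take.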
MPC was behind QuadCap and all of the virtual production and visual effects on Mufasa. Within Unreal Engine, a total of 12,680 on-stage takes were shot using the V-Cam and motion capture systems, while a total of 7,399 live motion capture and QuadCap performances were captured during the shoot. The VFX studio was involved the whole way, from early pre-production during a COVID lockdown period with the director, the cinematographer and production designer Mark Friedberg, through to the shoot in Los Angeles. At this early stage, concept art led to early set builds and then VR scouts to help flesh out the world. As sets and shots continued to be planned, so too did lighting. All of this occurred within an Unreal Engine sandbox crafted by MPC.

Premium previs

The ultimate goal of this prep work was previs at a high fidelity level. "Adam Valdez's goal on this one was, 'We need to have premium previs,'" relates Ferrara. "In fact, it was to not even think about it like previs, but more like going into the first pass of the movie. This meant our sets were way more detailed in terms of textures, and even the first pass of effects were generated, including water. If there was fire or rain, it would be there, all in Unreal."

"And then," continues Ferrara, "James Laxton would spend a lot of time setting up his light rigs in Unreal. With ray tracing enabled, it gave him and everyone a better idea of how the final shots would look. It can be so hard to project yourself into the final image. It takes so long to get there, usually, so we wanted to give them something that would be shaped closer to what they wanted to achieve sooner in the process."

MPC developed new tools to export the resulting Unreal Engine files into post-production, too. "We would save the animation," says Ferrara. "We would save the light rigs. We would save everything: the cameras, the environments. It meant we had our blocking of pretty much everything."
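Ferrara does not detail the file format, so purely as an illustration of the idea (one package per take, carrying cameras, light rigs, animation caches and the environment reference, so that post-production can restore the blocking later), here is a minimal sketch. The schema, field names and paths are all invented for this example; MPC's actual exporter and its Unreal integration are proprietary.

```python
# A minimal sketch of a per-take export manifest. Everything about the
# schema is assumed for illustration; it is not MPC's tool.
import json
import time
from pathlib import Path

def export_take(take_id: str, cameras: list[dict], light_rigs: list[dict],
                anim_caches: list[str], environment: str,
                out_dir: Path = Path("takes")) -> Path:
    """Write one take's cameras, lights, animation and environment
    reference into a single JSON manifest, so lighting and blocking can
    be restored later without re-shooting."""
    manifest = {
        "take": take_id,
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "cameras": cameras,               # e.g. per-frame transform + focal length
        "light_rigs": light_rigs,         # e.g. type, transform, intensity, color
        "animation_caches": anim_caches,  # paths to baked mocap/QuadCap data
        "environment": environment,       # which digital set the take was shot on
    }
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{take_id}.json"
    path.write_text(json.dumps(manifest, indent=2))
    return path

# Hypothetical usage:
# export_take("MUF_0420_tk03",
#             cameras=[{"name": "vcam", "focal_length_mm": 35.0}],
#             light_rigs=[{"name": "key", "intensity_lux": 12000}],
#             anim_caches=["caches/mufasa_quadcap.abc"],
#             environment="valley_set")
```

In practice such a manifest would point at baked caches (Alembic, USD or similar) rather than embed the data itself; the point is simply that everything needed to reconstruct a take travels together.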
Another huge benefit of this premium previs process, says Ferrara, was effectively having the entire film inside of Unreal Engine. "If we wanted to do re-shoots, or if we wanted to explore different approaches, you could go back in there. The lighting was already set up; you just export, and boom. We even pushed final animation being done in Maya back into Unreal, and when that was rendered in there it really helped with editorial. The previs was the Bible, the cornerstone of everything. As a VFX supervisor, I would constantly go back to the previs. Suddenly, you are not constrained by the pipe anymore. It's malleable, it's flexible, and that's great."

Going bigger on Mufasa

The world in which Mufasa travels in the film spans some 107 square miles, about the same size as Salt Lake City, Utah, and all of it had to be created by MPC. In total, 77 digital sets were created, 5,790 assets such as trees, plants and grass species were built, and another 118 photoreal creatures were made. By the end of the film, the VFX studio had completed 1,500 fully-CG shots using 25 petabytes of storage (rendering the film at final quality took 150 million hours).

The process began with Friedberg's team's concept art and builds, and was also informed by a team that visited several countries, including Namibia, Botswana and Tanzania. "That team came back with something like 10 terabytes of material," describes Ferrara. "Just tons of videos, photos and photogrammetry. The MPC team in post-production would process the photogrammetry of, say, columns in a canyon and share them with the art department. Then they would be able to use it in their layouts for the scouts. Just like the Unreal previs, we had those assets traveling between pre-production and post-production all the time."

In addition to building the many landscapes, MPC also had the challenge of the camera generally coming much closer to the characters than in the previous film. "The camera comes close and back and keeps going back and forth," says Ferrara. "So we needed those characters to hold up very close-up. There's way more detail in the model, way more hair in the groom. We had to rebuild the lookdev and the shaders of the fur from scratch in order to have this complexity."

Each lion in Mufasa featured over 30,000,000 hairs to achieve the realistic look of fur. Mufasa's mane on its own was made up of 16,995,454 hair curves; the lion has 600,000 hairs on his ears, 6.2 million on his legs, and 9 million covering the middle portion of his body. Some shots feature constantly moving cameras and run long, at times exceeding 2,000 frames (and they were also realized in stereo).

Then there were effects simulations, including of the environments themselves. "Simulation of the environments was crucial," states Ferrara. "We had to make sure that this world was dynamic and kinetic. Those characters are moving creatures moving through a moving world. So the grass, the trees, the simulation of the air, even of the pollen in the air. It was all about making sure that it didn't feel static and that we weren't just putting characters straight onto some kind of backdrop."

Water, of course, was a major effects simulation task, existing in several states: rivers, rain, mist, snow and clouds. Says Ferrara: "The water is a character in the story. It goes hand-in-hand with Mufasa's story, which is his fear of water, the water that separated him from his parents. It needed to be art directable because it needed to perform in order to match the performance of the characters."

The film's landmark flash flood sequence, in which Mufasa is washed away, was one in which MPC started the process in effects and then went back and forth between effects and animation. "We had to almost rehearse this process even before it was turned over," advises Ferrara. "It was like us getting ready to go into the arena for the fight and be prepared."

Snow, too, was a challenge for the MPC team. "My personal fear on this movie was the snow and how to make snow look good and realistic," admits Ferrara. "I had this one shot that was my personal nemesis throughout the movie, which was when Rafiki falls down and makes a snow angel. That one gave me nightmares. But it was also very satisfying when we cracked the code and made it work."

And crack the code, MPC did: that one snow angel shot required the simulation of over 620 million snow particles.
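A number like that is easier to appreciate with a little back-of-envelope arithmetic (my own, not a published MPC figure): even the leanest imaginable representation of 620 million particles runs to tens of gigabytes per frame.

```python
# Rough scale check (assumed numbers, not MPC's data): memory for
# 620 million snow particles if each carries only a float32 position
# and velocity, with no other attributes at all.
particles = 620_000_000
bytes_per_particle = (3 + 3) * 4            # xyz position + xyz velocity, float32
total_gb = particles * bytes_per_particle / 1e9
print(f"{total_gb:.1f} GB per frame")       # ~14.9 GB per frame
```

Production particles typically carry many more attributes than this, so the true footprint, cached across every frame of a shot and rendered in stereo, is considerably larger.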