BEFORESANDAFTERS.COM
Shooting on an LED volume, on film
Magnopus details its particular virtual production approaches on Fallout, which included capturing the LED wall scenes on 35mm film. An excerpt from the print mag.

The Prime Video post-apocalyptic series Fallout was shot on 35mm film with anamorphic lenses. While that's a format not at all unfamiliar to the show's executive producers Jonathan (Jonah) Nolan and Lisa Joy, who took the same approach on Westworld, it is a format not all that common for an episodic project that also relied heavily on shooting in an LED volume.

Getting there required close collaboration with Magnopus, a studio that had been part of previous Westworld R&D efforts and some filming (on film) in an LED volume for that series' fourth season.

"That season four of Westworld was where the evolution of this tech, integrated into storytelling, really began," advises AJ Sciutto, director of virtual production at Magnopus.

Sciutto oversaw the effort at Magnopus to deliver and collaborate on several virtual production services for Fallout, including the virtual art department, LED volume operations and in-camera VFX (ICVFX). Fallout's visual effects supervisor was Jay Worth and visual effects producer was Andrea Knoll. The show's virtual production supervisors were Kathryn Brillhart and Kalan Ray, who each oversaw four episodes of the series. Magnopus ran stage operations for the first four episodes, with All of it Now handling the second lot of four episodes.

More on how the film side of the production came into play below, but first the process began with the building of an LED volume in New York, where the series would be shooting. "At the time," says Sciutto, "there was not an LED volume in New York that could have accommodated a show this size."
"Spearheaded by production's Margot Lulick along with our partners at Manhattan Beach Studios and Fuse Technical Group, Magnopus CEO Ben Grossmann and the Magnopus team designed an LED volume in Long Island at Gold Coast Studios that was built to meet the specifications that Jonah wanted. He likes doing walk-and-talks, he likes being in longer shots, almost oners. He likes being able to be encompassed by immersive content. And so the design of the volume was very much a horseshoe shape. It wasn't cylindrical like you see in a lot of volumes now. It was a horseshoe to allow us a big, long, flat section to do a walk and talk. The final LED wall size was 75 feet wide, 21 feet tall, and almost 100 feet long."

The assets for the LED wall, which included virtual sets for the underground vaults and post-apocalyptic Los Angeles environments, were designed to run fully real-time in 3D using Epic Games' Unreal Engine.

"We used the latest and greatest versions of Unreal at the time," states Sciutto. "For the first couple episodes of the season, this was Unreal 4.27, and then we took a few months' hiatus between the first four episodes and the last four episodes, and at that point Unreal upgraded to 5.1 and there were some advantages in using 5.1. Lumen was one of them, the real-time global illumination system, which we found to be pretty essential for the needs of the set designs that we were working with. And so we upgraded engine versions to Unreal 5.1 about a week before we actually shot the scenes using it, which can be a hive-inducing moment to anyone who's worked in this industry before. Epic says we were probably the first large production to actually use 5.1 in practice, and it ended up working great for us."

Making it work for film

With the LED wall stage established and virtual art department builds underway, Magnopus still needed to solve any issues arising from the shooting of 35mm film on the volume. Sciutto notes that genlock was the most important factor.
"You have to be able to genlock the camera so that you're getting in sync with your refresh of the LEDs. We had worked with Keslow Camera back on Westworld to get sync boxes that are designed for the Arricam LT and Arricam ST to read a genlock signal and actually be able to phase lock the camera. That took a couple of months of just designing the leading to trailing edge of the genlock signal for the camera to read that and get that to be in phase."

"Once we did a couple of camera tests," continues Sciutto, "we felt like we were in a good state, but then we had to do some wedge tests because the actual latency flow between Unreal to the render nodes to the Brompton processors to the screen was slightly dynamic. We had to do some wedge tests to figure out what that latency offset was so we could then dial in the camera."

The next hurdle was color workflow. "Normally," says Sciutto, "you build in a color profile for your camera, but because the HD tap on the film camera is not truly an HD tap, you are winging it. Well, you're not actually winging it. There's a lot of science behind it in terms of what you're looking at in dailies and how you're redialing the wall and how you're redialing to a digital camera. You can't really trust what you're seeing out of the HD tap. So we had a Sony Venice that was sitting on sticks right next to the film camera. We had a LUT applied to the digital camera that mimicked our film camera so that we could do some live color grading to the overall image of the wall."

Sciutto adds that a further challenge was understanding the nature of different results from the film lab in terms of dailies. "Depending on which day of the week we got the dailies processed, they might change the chemical bath on Mondays, so by the end of the week it might skew a little bit more magenta or might skew more green."
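To give a sense of the arithmetic behind dialing in that latency offset: a measured pipeline delay (Unreal to render nodes to LED processors to screen) can be expressed as a phase offset of the camera's shutter relative to the genlock signal, wrapped to one frame period. The sketch below is purely illustrative, with assumed numbers; it is not Magnopus' or Keslow's actual tooling.

```python
# Hypothetical sketch of converting a wedge-test latency measurement
# into a genlock phase offset. All values are illustrative assumptions,
# not production figures.

FPS = 24.0                 # film camera frame rate
FRAME_MS = 1000.0 / FPS    # one frame period in milliseconds (~41.67 ms)

def phase_offset_degrees(latency_ms: float) -> float:
    """Wrap a measured pipeline latency to one frame period and express
    it as a phase offset in degrees (0-360) for the camera's sync box."""
    return (latency_ms % FRAME_MS) / FRAME_MS * 360.0

# e.g. a wedge test measuring roughly 12 ms of display latency:
offset = phase_offset_degrees(12.0)
```

In practice a wedge test sweeps this offset in steps while filming the wall, and the step with no visible scan line or brightness banding is kept.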
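The live-grade step Sciutto describes, a LUT on the Venice that mimics the film camera's response, amounts to remapping the digital camera's code values through a lookup table. A minimal, hypothetical 1D-LUT application follows, just to illustrate the idea; real pipelines use 3D LUTs (e.g. .cube files) in dedicated grading hardware, and the curve here is a made-up stand-in, not an actual film emulation.

```python
import numpy as np

# Hypothetical 1D LUT sketch: remap normalized pixel values through a
# film-emulation curve via linear interpolation. Illustrative only.

def apply_lut_1d(pixels: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """pixels: float array in [0, 1]; lut: sampled output values whose
    indices correspond to evenly spaced inputs in [0, 1]."""
    xs = np.linspace(0.0, 1.0, lut.size)
    return np.interp(pixels, xs, lut)

# Toy "film-like" transfer curve (a simple gamma lift, not a real stock):
lut = np.linspace(0.0, 1.0, 256) ** 0.8

frame = np.array([0.0, 0.25, 0.5, 1.0])  # a few sample pixel values
graded = apply_lut_1d(frame, lut)
```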
"We would use the digital camera footage to know we were always within a very comfortable range."

That dailies process, which saw rushes shipped from New York to Burbank for development and digitization, also impacted the pre-light on the LED wall, as Sciutto explains. "When you do a pre-light day on the film camera, you don't really know what you shot until a day and a half after. So we would do a series of pre-light shoots where we would shoot on a day, get the film developed, have a day to review and make any adjustments to our content before we did another pre-light day. That created a schedule for us that allowed set dec to get in there and do some adjustments to the scene, or our virtual art department to do lighting adjustments to the virtual content, to make sure it lined up with the physical content and be ready for any of the lighting and color settings we needed to be set up for on the actual shoot day."

Asked about the final look of the LED wall film footage, Sciutto mentions that, seeing the finished results in dailies, "there was definitely a softer fall-off between your foreground actors, your mid-ground set pieces and your background virtual content. That transition was blended a little bit smoother through the grittiness of the film [compared to digital] and that helped a lot. Also, you can capture a lot more detail and range in film that allows for more dynamic offsetting through color in a post process than you can with digital. If you go digital and then you push it or you crank it too much, it can get weird somewhat quickly. So shooting on film at least allowed us more range to dial in during the DI."

Read the full story in issue #21 of befores & afters magazine.

The post Shooting on an LED volume, on film appeared first on befores & afters.