
VIRTUAL PRODUCTION NOW AND GOING FORWARD
www.vfxvoice.com
By TREVOR HOGG

Preparing for a virtual production shoot of a Vertibird featured in Fallout. (Image courtesy of All of it Now)

Has virtual production revolutionized filmmaking, beginning with The Mandalorian in 2019 and accelerated by the COVID-19 pandemic a year later? The answer is no, but the methodology has become an accepted alternative to bluescreen and greenscreen. Even though technology continues to advance rapidly, some things have remained the same. "It's a mixed bag," states Matt Jacobs, VFX Supervisor. "What's on my mind now when talking to people is building brick-and-mortar facilities. There was a project constructing a backlot in France, and I asked, 'Did you set up an LED volume, because you've sunk a lot of money into this?' And they're like, 'No, because every time we do an LED volume, it seems that the ask is different for what the volume needs to do.' Everybody comes in and says, 'I need it for process shots for cars.' Or, 'I'm doing playback, and I need the volume to be this size and configuration.' The ability to pop up a volume, be flexible and build the volume out to case-specific specs seems to be the way to go these days."

Companies like Magicbox offer a tractor-trailer studio setup. "The pop-up trailer is an interesting thing, but you also have to look at that as a set configuration," Jacobs notes. "Yes, it's mobile, but it's what the tractor trailer looks like. Do you need a volume that is a semicircle? Do you need the ceiling, or is that lighting? How are you going to work a volume with a known configuration of width and height? Is it squared-off walls or a circular volume? Does it have ceiling panels that you need for reflections in a car? How are those ceiling panels configured? I was on a Netflix shoot, and we had this great volume at Cinecittà Studios outside of Rome. It was a cool setup and a big stage. The floor was a Lazy Susan, so it actually spun around."
"The ceiling was great, but because the tiles didn't line up perfectly, there were lines and seams across the car where there were no reflections," Jacobs adds. "We had to bring in walls to do fill reflection on the front of the car. We had to do a lot of work to reconfigure that stage and bring in certain elements. Thankfully, they were nimble and had a lot of great pieces and solutions for us to work with. But it goes back to the point that the stage was probably too big for certain things, and maybe it wasn't perfect for our car shoot."

LED walls are beneficial for rendering content for backgrounds but often fall short as a lighting instrument. (Image courtesy of Disney+ and Lucasfilm Ltd.)

Generally, people think that virtual production is synonymous with the LED volume. "I think virtual production is anytime that you're using real-time technologies in conjunction with normal production," remarks Ben Lumsden, Executive Producer at Dimension Studio. "The biggest single change is you can push a lot more through Unreal Engine. You've got a whole suite of tools specifically addressing LED volume methodologies. There's the Switchboard app and level snapshots that allow you to go back to a point in time when there was that particular load on the volume and understand exactly where everything was, which animation was where and what the lighting setup was. On Avatar, James Cameron would get so frustrated because everything was done using MotionBuilder. Cameron would return to post-production after being on set, and all the creative changes he made on the day got lost in translation through the pipeline." MegaLights from Unreal Engine 5.5 is a huge step forward. Lumsden says, "Beforehand, it was geometry, which was too expensive. But then Nanite came along with Unreal Engine 5, meaning geometry was no longer an issue."
"Our experiments with MegaLights so far suggest that lights will no longer be an issue," he adds.

Limitations still exist regarding how much you can put on the LED wall in terms of computational power. (Image courtesy of Dimension Studio, DNEG and Apple TV+)

Westworld Season 4 made use of virtual production technology to expand the scope of the world-building. (Image courtesy of Technicolor and HBO)

Limitations still exist regarding how much you can put on the LED wall in terms of computational power. "You don't want to drive too many MetaHumans, for instance, but you can put loads of volumetrically-captured people and make sure that their card is pointed back to the camera or their rendered view is relative to the position of the camera," Lumsden notes. "One thing that we did that was cool regarding R&D is marrying our performance-capture technology with the LED virtual production. We've been doing some tests where we can actually drive MetaHumans on the wall as digital extras being live-puppeteered on a mocap stage and interacting with the real talent; that's a new technology or workflow that we may well bring into production going forward." Sound remains problematic. "There is a real issue with capturing audio because you've got this big echo chamber. There are some fantastic new LED panels coming out all of the time. But the great new panels are always expensive. Over time, that will change, as with all of these things. There are also some new and interesting technologies of people doing projector-based methodologies, which are intriguing because the price point is more applicable to indie filmmakers."

The most significant single change is that Unreal Engine has a whole suite of tools specifically addressing LED volume methodologies. From Those About to Die. (Image courtesy of Dimension Studio)

Virtual production is anytime real-time technologies are used in conjunction with normal production, as in Here.
(Image courtesy of Dimension Studio, DNEG and TriStar Pictures)

Astra Production Group has forged a partnership with Magicbox, which has developed a mobile virtual production studio setup. (Image courtesy of Magicbox)

Interest rates have made productions more cost-conscious and less adventurous. "The early stories of the volume being a cost-saving mechanism put volume shoots at a disadvantage because producers came in expecting to see a 10x savings in cost or whatever number they had in mind, and it's dramatic but not that dramatic," observes Danny Firpo, CEO & Co-Founder of All of it Now. "Now, people are realizing what the volume does well, which is process shots for vehicles, or being able to create a lot of environments in a short amount of time, or being able to move the environment around talent." Hardware and software have greatly improved. "Cheap graphics cards are increasing in power at an expansive rate, which is helping to keep the dream of a real-time holodeck-style volume within arm's reach. The quality of real-time graphics is increasing exponentially, and the time it takes to create those real-time environments is decreasing due to the impressive tools that have come out on the software side. Nanite and some of the impressive tools that have come out from Unreal Engine 5.3 all the way up to 5.5 are creating a much better environment for artists to create the best version of what they can possibly create now. In addition, we're seeing a better understanding across the board among LED and camera providers and even lighting vendors of what types of equipment flourish in an LED volume environment, as opposed to trying to take live-show or film rental inventory and cramming it into the volume, which we saw in the volumes during the pandemic."

Technicolor Creative Studios partnered with NantStudios to construct a virtual production stage in Los Angeles.
(Image courtesy of Technicolor)

One particular department head remains central in being able to understand and communicate the capabilities of the LED volume to other members of the production team. "The visual effects supervisor is an ideal bridge because they already exist in this hybrid or mixed reality of 2D and 3D, real-time, physical and digital environments colliding to create the finished product," Firpo states. "That type of thinking is more challenging for somebody from a different department like Art, Camera or Lighting who is only used to dealing with one physical reality in a real-world space. What we have discovered is that specialists are emerging in those departments who have a real understanding of that and are willing to take an extra day and pre-light, or go through a virtual scout, and ultimately help explore those worlds more and use the same mentalities of what they would do in a physical scout." An effort has been made to make the virtual production process more intuitive for the various departments. Firpo notes, "We're moving all of the extraneous tools and features that we deal with and making a simplified UI. For example, giving a DP doing a virtual location scout an iPad, which is ubiquitous on set, with a sense of a rigged virtual camera, which feels like operating a physical one but is essentially a digital portal into that world. Getting that buy-off and sense of translation from the physical into the digital world and vice versa is where it's helped bridge that communication and culture gap."

Technicolor, in cooperation with the American Society of Cinematographers, conducts an in-camera visual effects demo. (Image courtesy of Technicolor)

Virtual production has not revolutionized filmmaking, but the methodology has become an accepted alternative to bluescreen and greenscreen. (Image courtesy of Technicolor)

LED walls are great for rendering content for backgrounds but often fall short as a lighting instrument.
"LED volumes have a limited brightness, and the light spreads out, so you can't create harsh shadows," notes Lukas Lepicovsky, Director of Virtual Production at Eyeline Studios. "They're also not full-spectrum light. LED walls are only RGB instead of RGBW Amber like you would get from an on-set light. You can maybe use the LED wall as fill light, but then you definitely want to be working with on-set lighting for the actual key light." Virtual production excels with short-turnaround projects such as commercials because all the decisions are made upfront. "If you're a massive visual effects project, then you're probably going to want to lean on it more for lighting capabilities, like projecting an explosion that lights up the actor's face in a nice way, but then leave yourself room in visual effects to augment the background with giant building destruction. This is what we ended up doing with Black Adam. We made the wall near final, or in some cases just a previs in the background that had good lighting, with explosions and lightning elements. We used it as a lighting instrument, knowing we would replace the background afterward. It depends on the production because, in those cases, you don't always know what your final asset looks like while you're shooting a large feature production. Because it's a real-time process, you have constraints of polygon budget and render time, so you can't just fill the world with all sorts of assets. You have to have strong planning when it comes to these things."

The quality of real-time graphics is increasing exponentially, and the time it takes to create those real-time environments is decreasing. (Image courtesy of All of it Now)

Those About to Die was shot on the LED volume stage at Cinecittà Studios in Rome. (Image courtesy of Dimension Studio, DNEG and Peacock)

Interest rates have made productions more cost-conscious and less adventurous. (Image courtesy of All of it Now)

Game engines have been a game-changer and are constantly improving.
"Where it can still stand to improve is the integration of some visual effects technology like USD and the ability to quickly share assets between departments and make layered, modifiable changes in the pipeline," Lepicovsky remarks. "Also, over time, we've seen this with visual effects: things started from a rasterization approach, and eventually everything turned into ray tracing. So, I'm excited to see that there are also ray-tracing possibilities in real-time coming forward, both from Epic Games and from Chaos Vantage, a new entrant in the virtual production market." It is still too early to judge the impact of machine learning on virtual production. Lepicovsky adds, "There are machine learning tools that generate backgrounds, but right now, if you want nice animation with all the leaves blowing and trees swaying, that is easier to do with actual game assets. Machine learning has been interesting for us in a new process called Gaussian Splatting, which is like a new version of photogrammetry based on a machine learning process. What is different from traditional photogrammetry is that you can have reflective and see-through surfaces and capture hair. Another interesting one involves a relighting process that allows you to capture actors in neutrally-lit lighting conditions, like volumetric capture, but then change the lighting afterwards using machine learning."

"The LED panel is excellent because it has an incredibly high output, so people like to use it for the lighting, and companies like ROE Visual are adding additional colors into the diode cluster to get better skin tones," remarks Jay Spriggs, Managing Partner at Astra Production Group. "But that's not going to replace a conventional lighting instrument. We know people who are researching projection in volumes because the cost to run that is much lower, and you also have additional benefits."
"For LEDs, the diodes light up and shoot light out, whereas, in a projection-oriented environment, they are reflective, so you have a different quality of light and mixing that comes from that," Spriggs continues. "The Light Field Lab stuff is fascinating. I don't want to even think about what the volume would cost for that!" The central question is: how do you help with what is happening in the frame? From there, you reverse-engineer that into which products are not just the best for what is going to happen but also the most money-efficient, so that productions have enough money to bring in their people. "The most cost-effective way is projecting plate photography, as there are so many more complications with real-time tracking," says Spriggs. "However, Unreal Engine is making major strides with a new grading workflow. That is going to be huge for making better pictures out of the game engine, because one of the biggest things has always been: how do you do a final polish pass on what is already a good lighting engine but is not perfect?"

The Mandalorian, along with the pandemic, has been credited with causing a boom in virtual production. (Image courtesy of Disney+ and Lucasfilm Ltd.)

Not everything gets treated the same way. "If Greig Fraser [Cinematographer] wants to get the highest-quality lighting effect for the best skin tone, but we're only doing a couple of tight shots, and he has a generous post budget, then we look at the background of the LED," Spriggs explains. "We build it with the highest-quality LED with the smallest pitch we can find. Don't worry about the final color that you see in the picture, because the post budget will kick all of that stuff out so they can post-render and grade. All we focus on is the skin tone. If someone is trying to shoot a car commercial, they're trying to get the closest to final pixel for the reflections."
"You build a volume around the car that they're looking at with the smallest pitch, so that you will not be able to see individual pixels on an LED wall with a ceiling," Spriggs says. "Shoot that and walk away. You wouldn't use that same configuration for the other one because the benefits wouldn't be there." Fundamentals should not be forgotten. Advises Spriggs, "If we focus too much on revolutionizing and democratizing or any such big-picture thoughts, we forget about what we have to do right in front of us, which is to make a damn pretty picture!"

The visual effects supervisor remains the bridge in understanding and communicating the capabilities of the LED volume to the other department heads. From Time Bandits. (Image courtesy of Dimension Studio, DNEG and Apple TV+)
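As a sidebar to Spriggs' point about choosing "the smallest pitch" so that individual pixels disappear: two general industry rules of thumb (neither is from this article, and neither is Astra's stated method) can be sketched for estimating how far a camera must sit from an LED wall before pixels blend together. One is the rough AV guideline that minimum distance in metres approximates pitch in millimetres; the other is the 1-arc-minute visual-acuity ("retina") distance.

```python
import math

def min_viewing_distance_m(pixel_pitch_mm: float) -> dict:
    """Estimate how far a viewer/camera should be from an LED wall
    before individual pixels are no longer distinguishable.

    Two common heuristics (assumptions, not production standards):
    - rough_m:  AV rule of thumb, metres ~= pixel pitch in mm.
    - retina_m: distance at which one pixel subtends 1 arc-minute,
      the classic 20/20 visual-acuity limit: d = p / tan(1/60 deg).
    """
    rough = pixel_pitch_mm
    # tan(1 arc-minute) ~= 0.000290888; result converted from mm to m.
    retina = pixel_pitch_mm / math.tan(math.radians(1 / 60)) / 1000
    return {"rough_m": rough, "retina_m": round(retina, 1)}
```

For a hypothetical 2.6mm-pitch panel, the rough rule suggests standing back about 2.6m, while the stricter acuity rule puts true pixel invisibility closer to 9m; real shoots also depend on lens, sensor and moiré behavior, which these heuristics ignore.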