THE RISE OF REAL-TIME VFX AND WHERE IT'S GOING
www.vfxvoice.com
By TREVOR HOGG

Real-time software programs are being developed by Chaos, such as the ray-tracing renderer Vantage and Arena, which does ray tracing for in-camera effects. (Images courtesy of Chaos)

Virtual production could not exist without real-time rendering, customarily associated with game engines such as Unreal Engine and Unity. Still, real-time technology is also impacting workflows and pipelines constructed to produce visual effects on a daily basis. As the tool is refined to become more cinematically proficient, new challenges and opportunities have emerged for visual effects artists and production teams.

"My first job in the visual effects industry was working on Star Wars: Episode I – The Phantom Menace, the first movie to do previs," recalls Kevin Baillie, Vice President and Head of Creative at Eyeline Studios. "Our real-time capabilities back then were quite limited, but now fast forward to where we have these images that can look near to final quality in real-time. Not just previs, but a virtual art department to build set designs, whether we're looking at them through a camera, VR goggles or any other means. These incredibly powerful tools allow a filmmaker to accelerate some of the physical process, start it digitally and iterate on it quickly before we get into the tedious, expensive physical phase. When I worked with Robert Zemeckis on Pinocchio, we previs'd the entire movie. As we were shooting it, we did real-time on-set composites of the scenes that involved live-action, laid down cameras for everything that was a fully virtual shot, then those cameras went into the visual effects post-production process. We made the movie three times using these real-time technologies, and that iteration helped Zemeckis narrow down exactly what he wanted."

The introduction of full ray tracing to the virtual production process removes the need for rasterized rendering. Source: Ray Tracing FTW.
(Image courtesy of Chaos)

Unreal Engine became the answer when pandemic restrictions meant that not everyone could go into the same vehicle together to scout locations for The Handmaid's Tale. "I would go out, scan the locations, rebuild them in Unreal Engine, and we would walk through in sessions," recalls Brendan Taylor, President & VFX Supervisor at Mavericks VFX. "I like to say that we are making a game called, 'Let's make a movie.' What's awesome about that is you can create all the rules for this world. The thing about a game is you need to be able to see it from all angles and be able to change things on the fly. When we're working in film, we're dealing with what's here and in the camera."

Virtual scouting led to some discoveries that Elisabeth Moss applied when directing her first episode of The Handmaid's Tale. Taylor explains, "What we were able to do was build the set on the bluescreen stage from the plans, sit with a monitor on a little handheld rig [in our screening room] and explore the space with Elisabeth. She tried things out with just me, Stuart Biddlecombe [Cinematographer] and Paul Wierzbicki [Unreal Engine Specialist]. Elisabeth said, 'There's something missing. We're so monochrome.' Paul responded, 'Sometimes these buildings have red lights on them.' He quickly put a flashing red light in the corner, and it changed the tone of the scene to give it this devilish look. It made this guy pushing women off of the roof even more menacing. We would have never known until we lived within this game we had created. For me, that was a real a-ha moment where it became collaborative again."

An ambition for real-time visual effects is to have the ability to visualize, explore and iterate quickly without closing the door on the visual effects team finishing it off to get the final image. Previs from The Witcher Season 3. (Image courtesy of Cinesite and Netflix)

Real-time is most useful at the concepting stage.
(Image courtesy of V Technologies)

Simplification is taking place when it comes to game engines and real-time. "We don't have enough people who know Unreal Engine to drive a virtual production because it's such a beast of a software that has been in development forever," observes Jason Starne, Owner, SHOTCALLER and Director of Virtual Production for AMS Pictures. "We need some simplified things, and that's what we are starting to see with what companies like Chaos are doing. They're building something that allows you to have a 3D world scene that is truly a real-time path tracer, and the path tracer gives the best quality you can get out of a rendered image. Real-time is an aspect of the pipeline. It's a tool just like virtual production is another toolset a studio would have." Misconceptions are an issue. "The con is that the marketing has made even our clients believe this is easy to do and can be achieved without a whole lot of work going into it. In real life, we have to put work into it and make or build things in a way where we can get speed out of it. It's not just going to be real-time because it's coming out of Unreal Engine. It could be, but it will look like crap. How do we get the quality versus the speed that we need?"

The mantra for V Technologies is "content at the speed of thought," which they believe will be the next evolution of communication. (Image courtesy of V Technologies)

Real-time allows digital artists to iterate way faster, which means more options for clients. Scene from Sweet Tooth. (Image courtesy of Zoic Studios and Netflix)

Real-time has shifted the involvement of Zoic Studios toward the front end of production, resulting in far less in the back end. Scene from The Sympathizer. (Image courtesy of Zoic Studios and HBO)

The Chaos Group is developing real-time software programs, such as the ray-tracing renderer Vantage and Arena, which does ray tracing for in-camera effects.
"For us, Arena is an extension of the camera that the DP already has, and as long as the DP can talk to the people who are running the stage, like to a grip or camera operator, then we're in good shape," remarks Christopher Nichols, Director of Chaos Labs at the Chaos Group. "We looked at what they needed to do to get the correct video on the LED walls. Essentially, we needed a system that synchronizes renders across multiple nodes and can track a camera so you can get the correct parallax. That's the fundamental thing we added to Vantage, enabling it to become an in-camera effects solution. By introducing full ray tracing to the process, which removes the need for rasterized rendering, you can make a better duplicate of the camera and don't need to optimize your data or geometry in the same way that you need to for video games. Almost everything that is done in post-production uses full ray tracing, either V-Ray or Arnold. That massively cuts down on how much time and energy is used to put the CG elements behind people because it's the same asset for everything. The virtual art department can focus on compositing the shot correctly or creating the right environment and not on, 'How do I remake this to work for a game engine?'"

More options have become available to be creative. "We're seeing concepts emerge now that would have been nearly impossible without the use of real-time tools to plan and execute, like digital twins, which are changing the game for creators, especially when budget and ambition are both high and there's no room for miscommunication," states Brian Solomon, Creative Technology Director at Framestore. "Another area advancing rapidly revolves around how we utilize characters. Real-time allows us to previs and utilize dynamic 3D characters earlier in feature film production, especially with character-driven live-action pictures. Similarly, there are now advantages coming from production-grade real-time variants of characters."
"These are benefiting larger brands and animated IP owners, as a host of new formats are emerging that allow these characters to interact with the world in ways they couldn't prior, and at turnaround speeds not hitherto possible," Solomon adds. "Real-time overall is broadening the horizon for characters."

The visual effects pipeline at Zoic Studios has always been modular. Scene from The Boys. (Image courtesy of Zoic Studios and Prime Video)

Real-time technology is positively transforming production pipelines. "In the traditional visual effects world, it is allowing for faster iterations, which enable additional exploration of creative options," notes Paul Salvini, Global Chief Technology Officer at DNEG. "These advances are most critical in areas like animation and creature and character effects [such as the simulation of muscle, skin, hair, fur and cloth]. In cases where the final output from real-time solutions needs further processing, seamlessly connecting real-time and non-real-time tools becomes critical. The role of artists doesn't fundamentally change, but the tools will allow a more interactive workflow with better feedback."

Real-time visual effects are also transforming more areas of production than ever before, from previs through final render, and audience members are getting to enjoy even more immersive and interactive experiences. Salvini remarks, "Some recent live and virtual concert experiences have done a great job of bringing together the best of the real and computer-generated worlds to deliver experiences never before possible for audiences, such as allowing a current artist's performance to be mapped visually onto their younger selves."

Technology is an ecosystem that is constantly evolving because of innovation. (Image courtesy of V Technologies)

Real-time visual effects are here to stay because it is the best way to get feedback from clients or collaborators. Composite from 9-1-1.
(Image courtesy of Zoic Studios and ABC)

Virtual production was a key component in expanding the practical sets for Barbie Land. (Image courtesy of Framestore)

More creative options have become available because of real-time visual effects. Screen capture from Agatha All Along. (Images courtesy of Framestore and Warner Bros.)

Storytelling and being able to present clients with the best possible imagery are the main technological goals for Sony Pictures Imageworks, which meant figuring out how to get close to real-time with their GPU renderer, Arnold. "The more the client is educated with real-time and sees what the studios are doing, the more they want you to push the envelope," states Gregory Ducatel, Executive Director, Software Development at Sony Pictures Imageworks. "The magic you get when you work with good creatives, clients and technology is that the creativity of those people jumps. It's crazy. Currently, if you go outside of Unreal Engine, the quality of the imagery drops, and then with lighting, it goes back up; that was not acceptable for us because artists lose the context of their work, and the creatives don't like that. This is why Spear [Sony Pictures Imageworks' version of the Arnold renderer] was brought to the table. How can we always have the highest quality possible at each given step but never go back to the previous one? The feature animation and visual effects applications are somewhat different; however, the principles remain the same. We always want better quality, more iterations. We don't want to wait for notes and for the artists to do something, then go back to notes. If you can do that in real-time, the artist can move forward, and it's exactly what you want," states Ducatel.

Real-time visual effects are here to stay. "People who don't see that real-time is where we all should go are stuck in the past," believes Julien Brami, VFX Supervisor & Creative Director at Zoic Studios.
"There is time for finishing and concepting; all of these take time, but when we need the interactivity and to get feedback, whether from clients or collaborators, real-time is the best tool. Real-time allows us to iterate way faster, and faster means more options. Then you can filter what is working. Instead of saying no to a client, now you have an opportunity to work with them. There are more iterations, but it's less painful to iterate."

The pipeline is evolving. Brami says, "The visual effects pipeline at Zoic Studios has always been modular. We try to make the pipeline procedural so it can be crafted per show and be more efficient. Real-time has shifted our involvement toward the front end of the production, and we have way less in the back end. With a traditional pipeline we would have a bluescreen or greenscreen and have to key everything; all of that would have been at the tail end, which is usually more stressful."

The more the client is educated with real-time and sees what the studios are doing, the more they want the envelope pushed. Scene from KPop Demon Hunters. (Image courtesy of Sony Pictures Animation and Sony Pictures Imageworks)

Real-time is allowing the utilization of dynamic 3D characters earlier in the process of feature film production, especially with character-driven live-action pictures. Scene from Paddington in Peru. (Image courtesy of Framestore and Columbia Pictures)

Three years ago, it was all about using game engines for real-time, but with the advances in generative AI, people are doing things even more instantly. (Image courtesy of V Technologies)

Technology is constantly advancing along with the growth of expectations. "Virtual production, machine learning and real-time rendering engines; all of these have been around for decades," observes Mariana Acuña Acosta, SVP Global Virtual Production and On-Set Services at Technicolor. "It's not like it just happened overnight. What has continued to advance is our computing power."
"I can't even comprehend how we're going to be able to maintain all of the machine learning and AI with these new generations of GPUs. What has pushed these advancements forward has been virtual production, cloud workflows, machine learning, AI and the game engines themselves. To avoid obsolete technology, hardware has to be constantly updated. It's costly for a studio to be constantly updating hardware. Maybe at some point, you get a project or want to create your own project and realize you don't have enough hardware to go and run with it. That's when the cloud comes in, as you can scale and have the best-spec machines. This is crucial because then the cloud service providers are the ones that have a lot of resources to go around when it comes to RAM and GPUs."

Rendering improves with each new release of Unreal Engine and Unity. "Advances in real-time rendering, such as virtualized geometry with Unreal Engine's Nanite, have significantly reduced the time required to optimize assets for real-time performance while enhancing their visual fidelity," observes Dan Chapman, Senior Product Manager, Real-Time & Virtual Production at Framestore. "Looking ahead, Gaussian Splatting is setting a new standard for photorealism in real-time applications. By moving away from traditional polygon-based 3D models and building on Neural Radiance Fields [point clouds that encode light information], Gaussian Splatting offers a more efficient and accurate approach to rendering complex, photorealistic scenes in real-time." Real-time visual effects have raised the expectations of audiences when it comes to immersive, interactive and personalized experiences.

A wrinkle in real-time visual effects is that the various render passes that the visual effects team will be utilizing can't be replicated as easily. Building the plane for Hijack.
(Image courtesy of Cinesite and Netflix)

Chapman remarks, "Technologies like augmented reality, virtual reality and projection mapping allow attractions to respond to guest movements and decisions in real-time, creating personalized storylines and environments that feel unique to each visitor. This shift is also taking place online, where audiences are actively participating in experiences in a way that they can shape and share with others. This is particularly evident in platforms like Fortnite and Roblox, where users engage in live events, socialize with friends and collaborate on creative projects."

Sometimes, real-time solutions slow down to a traditional visual effects renderer. "It can go in the wrong direction if you're pushing it too far," notes Richard Clarke, Head of Visualization & VFX Supervisor at Cinesite. "I'm curious if we can evolve this two-stage process where you can visualize, explore, iterate quickly, and have a good idea of what your end product is going to be, but still not close the door on allowing the visual effects team to finish it off or push it to the cloud for higher processing. What you get back is closer to a final version. One little wrinkle at the moment is the various render passes that the visual effects team will be utilizing can't be replicated as easily. The more AOVs [Arbitrary Output Variables] you're pushing out, the more you're going to slow down the real-time. Postvis is a real melding of real-time technology and visual effects pipeline workflows. The nice thing about postvis is it's not an end product. We've got a little trick where we make a beautiful scene in Arnold, bake all of the lights and textures, output shots in minutes direct from Maya and go straight into comp. They almost look final. That's pre-packaging things. Game engines pre-capture a lot of their lighting to make real-time. That's where you can save on a lot of processing."
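The baking trick Clarke describes, paying the lighting cost once offline so that playback is just a cheap lookup, can be sketched in a few lines. This is a heavily simplified illustration (a single Lambertian diffuse term over point lights), not Cinesite's Arnold-to-Maya setup; the function names and scene values are invented for the example.

```python
# Toy sketch of lightmap baking: precompute diffuse lighting per texel
# offline, then shade at "runtime" with only a lookup and a multiply.
# Assumes simple point lights and Lambertian surfaces for illustration.
import math

def bake_lightmap(texels, lights):
    """Expensive offline step: sum light contributions for each texel."""
    lightmap = []
    for pos, normal in texels:
        radiance = 0.0
        for light_pos, intensity in lights:
            to_light = tuple(l - p for l, p in zip(light_pos, pos))
            dist = math.sqrt(sum(c * c for c in to_light))
            direction = tuple(c / dist for c in to_light)
            # Lambert's cosine term (clamped) with inverse-square falloff
            ndotl = max(0.0, sum(n * d for n, d in zip(normal, direction)))
            radiance += intensity * ndotl / (dist * dist)
        lightmap.append(radiance)
    return lightmap

def shade(lightmap, texel_index, albedo):
    """Cheap real-time step: no light loop, just a baked-value lookup."""
    return albedo * lightmap[texel_index]

# One texel facing straight up, one light directly overhead at distance 2.
texels = [((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))]
lights = [((0.0, 0.0, 2.0), 8.0)]
baked = bake_lightmap(texels, lights)
print(shade(baked, 0, albedo=0.5))  # 0.5 * 8.0 * 1.0 / 2**2 = 1.0
```

The trade-off is the one Clarke implies: baked results are fast and look near-final, but they are frozen, so moving a light means re-baking rather than re-rendering interactively.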
"The more I use real-time technology, the more I think it's going to be a cornerstone of everything," Clarke adds. "Autodesk showed us a beta version of Unreal Engine in Maya. I got excited about that because we've been doing it the other way around. Having Unreal Engine in your viewport was like a hallelujah moment for me because most visual effects artists are Maya-centric at the moment."

As with nature, technology is an ecosystem. "What we're seeing right now at the top level is the merging of many new innovative technologies," states Tim Moore, CEO of V Technologies. "Three years ago, it was all about using game engines for real-time, and with the advances in generative AI, you now see people doing things even more instantly. The merging of those two is interesting; to be generative inside a 3D environment where you have all the perspectives and control. Real-time is most useful at the concepting stage. For people who have simple thoughts and want an extravagant output, AI is amazing because you can give it a little and the AI will fill in the rest. For people who have a specific vision and want it to come to life, AI becomes challenging because you have to figure out how to communicate to this thing in a way where it sees what you see in your head, and you have to use words to do that."

The future can be found in the mantra of V Technologies. Moore comments, "The vision for our company is content at the speed of thought, and to me that is the next evolution of communication. Encoding and decoding language into sounds and words is an inefficient way to communicate, whereas the ability to use visuals as a communication layer is the most universal language in the world. Everyone perceives the world in a visual way. That ability to make visuals at the speed of thought is the big evolution of storytelling we will see in the next 10 years."