Visual Effects Society
Professional Honorary Organization
Recent Updates
  • BUILDING A BETTER MAN WITH WĒTĀ FX
    www.vfxvoice.com
    By TREVOR HOGG
    Images courtesy of Wētā FX and Paramount Pictures.
    At first glance, having Robbie Williams depicted as a primate in Better Man comes across as a novelty rather than an inspired creative choice, but there was a method behind the madness of Michael Gracey, who was previously responsible for The Greatest Showman. The visual effects artist turned filmmaker wanted to show the British popstar as he sees himself rather than his public persona. Wētā FX was hired to produce a believable CG protagonist that causes the animal persona to quickly dissipate and allows audience members to get lost in his personal struggles with addiction, mental illness and fame.
    He had a scream. It was like, Let's really peel those lips back and get those canines out. The nose wrinkles. Let's make it feel primal. But it was never like a full departure. We never went quadrupedal or monkeyish behavior. It helps because you get swept up with the character and story. We don't want people to go, I'm looking at a monkey.
    Dave Clayton, Animation Supervisor, Wētā FX
    The costumes and fur for Robbie Williams were treated as if they were characters in their own right.
    When people hear the premise, they get a picture in their head about what this movie is going to be, and then they see the film and the film never matches that picture, notes Luke Millar, VFX Supervisor at Wētā FX. They think it will be comedic and caricatures, but there are a lot of components that layer in the character: the cinematography, the gritty British backdrop and the fact that Robbie is the only digital thing for the majority of the movie. Better Man has a lot of heart and emotion; that's what sweeps you up. The ape metaphor falls away in the first few shots or scenes. It was a fine line to get the proper onscreen performance. There is an uncanny valley that we were careful not to fall into. Robbie Williams is represented as an ape in the film, but, essentially, he is human in the way he interacts, wears clothes and the style of his hair. Compared to the previous films that we've done, this is probably one of the biggest differences, Millar notes.
    Three different versions of Robbie Williams had to be produced. He's a young lad, a teenager and a young man, remarks Dave Clayton, Animation Supervisor at Wētā FX. Within there, we had a little bit of give to fill in the blanks. We previsualized a lot of this movie to help not only get into all of the details of how we were going to shoot it but also to help describe the entire arc of the movie and how this character was going to evolve. It's a long journey to get the most out of a character like this; he's had a complex life.
    Ape characteristics were generally avoided. It's not to say that we didn't have monkey mouth shapes or do a little bit more or less here and there to help things, Clayton states. He had a scream. It was like, Let's really peel those lips back and get those canines out. The nose wrinkles. Let's make it feel primal. But it was never like a full departure. We never went quadrupedal or monkeyish behavior. It helps because you get swept up with the character and story.
    We don't want people to go, I'm looking at a monkey.
    A new technology was developed by Wētā FX that enabled their VFX pipeline to receive data files from concert stage lightboards, allowing them to accurately recreate the lighting digitally.
    As we do a take, our on-set editor, Patrick Correll, would literally take that take and cut it into the timeline, switch out the previs, and then make sure that we got the camera timing, beats and action lined up. Then we went again. For each setup we would do 30 to 40 takes to try to get the perfect take. The nice thing about this was after shooting we already knew that the things were going to work.
    Luke Millar, VFX Supervisor, Wētā FX
    All of the musical numbers were previsualized. As we do a take, our on-set editor, Patrick Correll, would literally take that take and cut it into the timeline, switch out the previs, and then make sure that we got the camera timing, beats and action lined up, Millar explains. Then we went again. For each setup we would do 30 to 40 takes to try to get the perfect take. The nice thing about this was after shooting we already knew that the things were going to work. The Rock DJ scene was 5,334 frames long and featured five costume changes for Robbie and 500 dancers. Michael didn't want any obvious wipe points, such as someone walking right in front of the camera, Millar remarks. We always tried to do it in a way that we could have some kind of continuity of movement going over the stitch, like a digital bus would drive through. Everything was handheld with a little bit of crane work. No motion control work. One of the biggest challenges was getting that single cohesive camera, and it was further complicated by the fact that the two interiors were shot in Melbourne about a year before we shot on Regent Street. We always had to dovetail into those interiors and then back out onto the street again.
    The wide shots of Albert Hall were filmed with a real audience in London while the orchestra pit was filmed at Docklands Studios in Melbourne with 200 extras. The footage from the two locations was combined to create an audience of 5,500.
    An internal battle literally and figuratively takes place. In Let Me Entertain You, Robbie is having an internal struggle where he's literally battling with himself, Clayton states. They were small versions of Robbie, but armies of them. It's using MASSIVE, but also our motion edit team to put together an army, and some simulation tools for the tight-quarters characters getting jostled around by each other. Another fully CG scene occurs underwater for Come Undone. Robbie ends up in a surreal moment under the water, and there are these suicidal teen girls who are upset about him leaving [boyband] Take That. We did some motion capture using a rope rig to get the performers suspended up in the air pretending to swim, Clayton remarks. We got some good movement there. Then we augmented that. We looked at some references of underwater sports like underwater hockey or rugby to see people struggling against each other.
    Everything was handheld with a little bit of crane work. No motion control work. One of the biggest challenges was getting that single cohesive camera, and it was further complicated by the fact that the two interiors were shot in Melbourne about a year before we shot on Regent Street. We always had to dovetail into those interiors and then back out onto the street again.
    Luke Millar, VFX Supervisor, Wētā FX
    Central to making the primate version of Robbie Williams a believable character was the motion capture performance of Jonno Davies.
    Atmospherics are in almost every scene. It makes things more complicated in terms of integrating Robbie because there's always these elements that are over the top of him, Millar observes. The dry ice was another level on top of that. The special effects team did a fantastic job of pumping tons of this stuff that they had, but it disappears so quickly. It was the same with the flame mortars in front of the stage. They set those up but we couldn't fire them all up because there was a crane that swept over the top of them. We were replacing half of them and patching in dry ice. However, when you have a real component onstage and we're matching up to it, then you've got a great visual goal.
    In Let Me Entertain You, Robbie is having an internal struggle where he's literally battling with himself. They were small versions of Robbie, but armies of them. It's using MASSIVE, but also our motion edit team, to put together an army and some simulation tools for the tight-quarters characters getting jostled around by each other.
    Dave Clayton, Animation Supervisor, Wētā FX
    VFX artist turned filmmaker Michael Gracey directs Raechelle Banno while shooting Better Man.
    At home, Robbie finds solace in his grandmother Betty's (Alison Steadman) support.
    Robbie Williams is represented as an ape in the film, but he is human in the way he interacts.
    The world-building offered something different compared to previous projects. I'm proud of the scene when Robbie is on the billboard right at the start with his mate, Millar remarks. Normally, we have to build beautiful vistas and epic landscapes, and we ended up building a ring road like a dual carriageway with hideous sodium vapor lighting. Heaps of litter were digitally added. Not only litter but whatever you would find on the side of a motorway in the U.K. People would chuck some mattresses, shopping trolleys and beer bottles. It's that kind of rich patina of crap that is quintessentially British! That sort of world-building is an area that we never get to experience in the visual effects world. It's always creating these heightened realities, not a gritty down-to-earth reality.
    Raechelle Banno as Nicole Appleton and Jonno Davies as Robbie Williams performing She's the One.
    Films like All That Jazz were an inspiration to director Michael Gracey because they didn't shy away from showing the darker moments and the truth that audiences respond to.
    At the Knebworth Festival in the U.K., a deteriorating Robbie Williams faces the nadir of his journey while performing Let Me Entertain You for 125,000 fans. The motion edit team put together an army of small Robbies and some simulation tools for the tight-quarters characters getting jostled around in the crowd.
    Wētā FX was responsible for 1,968 VFX shots. The thing that I find with Better Man, now that we've finished it, is you can watch the whole movie and go, Of course, this was always going to work, Millar states. But I try to picture what was in my head right at the start of this whole process before we had done anything and you didn't know. I had complete faith and confidence in Michael Gracey's vision in that we could realize it and make this world work. My first reaction when I heard of the film was, I don't get it, Clayton admits. But then I read the script and said, Yes. I would love to work on this because it sounds unique and different. I loved it.
    It has been such a special project to be a part of. I've always thought this was going to be a once-in-a-lifetime project, but maybe there are some filmmakers out there who want to innovate like this. I can't wait to see what everybody thinks of Better Man.
    Watch a behind-the-scenes VFX featurette with director Michael Gracey, Robbie Williams and Wētā FX on how Better Man came to life. Click here. https://www.instagram.com/reel/DDxOxGnPTvm/
  • VIEWING TOLKIEN THROUGH ANIME FOR THE LORD OF THE RINGS: THE WAR OF THE ROHIRRIM
    www.vfxvoice.com
    By TREVOR HOGG
    Images courtesy of Warner Bros. Pictures
    Many attempts were made to bring The Lord of the Rings to the big screen, with the most famous being John Boorman co-writing a script in 1970 that laid the foundation for his Arthurian classic Excalibur. Escape from development hell finally happened in 1978 when Ralph Bakshi did an adaptation that covered half of the trilogy, which combined roto and hand-drawn animation. The franchise is returning to its animation roots with New Line Cinema and Warner Bros. Animation bringing The Lord of the Rings: The War of the Rohirrim to theaters under the direction of acclaimed anime filmmaker Kenji Kamiyama, who is making his first visit to Middle-earth after spending time in New Port City and Los Angeles in 2032 for Ghost in the Shell: Stand Alone Complex and Blade Runner: Black Lotus. The project takes place 183 years before the original trilogy helmed by Peter Jackson and revolves around the childhood friends Héra and Wulf becoming sworn enemies after their fathers engage in hand-to-hand combat with each other, resulting in one of them being killed. Serving as a producer alongside Joseph Chou and Jason DeMarco is Philippa Boyens, who developed the story with Jeffrey Addiss and Will Matthews and has been a frequent writing partner of Peter Jackson and Fran Walsh ever since The Lord of the Rings: The Fellowship of the Ring.
    When the studio came to us and asked, What about anime? this story seemed to fit what we were looking to do, which is to tell a standalone story with fresh characters who we had not come across before and didn't involve any dark lords or rings of power. When they told us that Kenji Kamiyama was interested in directing it, that got me even more excited.
    Philippa Boyens, Producer
    Héra (Gaia Wise), Helm Hammerhand (Brian Cox), Haleth (Benjamin Wainwright) and Hama (Yazdan Qafouri) hold court in a scene that makes use of a live-action sensibility but in the style of anime.
    When it comes to horse riding, motion-capture footage was given to animators as reference so they could heighten the level of realism in their hand-drawn animation.
    From the beginning, the desire was to tell an animated tale that was a new take on Middle-earth while still having an air of familiarity. We did actually look at other forms of animation and other stories, states Producer Philippa Boyens while attending New York Comic Con to promote the feature with Kenji Kamiyama and Joseph Chou (who served as a translator for Kamiyama). When the studio came to us and asked, What about anime? this story seemed to fit what we were looking to do, which is to tell a standalone story with fresh characters who we had not come across before and didn't involve any dark lords or rings of power. When they told us that Kenji Kamiyama was interested in directing it, that got me even more excited. Shifting from television productions to a feature film did not radically change the animation process. A lot of anime directors set out to be a filmmaker, but in Japan the most freedom you get and the easiest path to directing is to go through the medium of anime, Kamiyama explains. I've done a few films, but my prominent works have been series where the times demanded it; we either want more DVDs or episodes on streaming. But the filmmaking process and techniques are what I take to every single one of them when crafting and telling the story. When taking on this project [which started as 90 minutes and became 2 hours and 10 minutes], it was a different muscle that I had to use.
    I was so used to the 30-minute format that I had to change my way of thinking to be able to transition into a two-hour format where you have a complete story told within that limited amount of time. Making a movie is like a 400-meter run while a series resembles a marathon. You have to pace yourself and hit your goal throughout that whole period. A movie is not a 100-meter dash, but you do need to move fast and get there by pacing yourself; it is quite an intense experience.
    It's mature storytelling and deserves a mature visualization. That's one thing I wanted to do by taking on this film. It presented me with the opportunity to create that kind of visual and film that perhaps breaks the mold for anime in how it's presented to audiences outside of Japan.
    Kenji Kamiyama, Director
    The intention was always to do an animated film.
    There is no longer a divide between 2D and 3D animation as the best of both techniques are being creatively fused together, in large part because of the growing popularity of adult animation. Animation targeted at or appealing to adults is not something new for Japanese anime, Kamiyama observes. If you have seen my work, it has a lot of mature themes and storytelling. But there is definitely a shift in the West where theatrical animation is not just for kids but can also be enjoyed by adults as well. However, when you say animation, there is still this perception that it is for kids. But, for me, I believe that animation has the potential to elevate certain types of storytelling as a visual medium that can appeal to a general audience. One of the motivations for taking this film was this was a chance to do that and have that platform where the audience will be able to come in and at least check out how this kind of storytelling can be done by animation. It's not only about how attractive the drawing is or how good the art is, but it's about how people are moving within the frames of the animation; that connects with the technique of how you would want to do this.
    Héra is modeled in large part on Éowyn, with Miranda Otto, who played her in the original trilogy, serving as the narrator.
    An unconscious agreement exists between animation filmmakers and audience members. Explains Kamiyama, The reason why I took on motion capture and CG animation [is because] even though animation is not as detailed as live-action, audiences come in understanding that it does exist and fill in certain gaps. By adding some element of reality, it heightens the experience so much that it actually makes them appreciate what is happening on the screen even more. That is something which can be done differently from live-action where there is already so much visual detail. It's about how people move, how they get on a horse or how people put things in pockets. The depiction of those little things needed to be done by human hands, but in order to get it done by human hands you need the assistance of CG animation. For me to visualize, set the timing, decide on the camera lens and how the angle should be, all of that was done with motion capture data in an Unreal Engine setting. Then I would go and do the acting for certain actors in specific situations in CG animation. That would be provided completed as a film to the animators to use as a guide to create the hand-drawn animation. Tolkien does not write children's fairy tales. It's mature storytelling and deserves a mature visualization. That's one thing I wanted to do by taking on this film.
    It presented me with the opportunity to create that kind of visual and film that perhaps breaks the mold for anime in how it's presented to audiences outside of Japan, Kamiyama remarks.
    You have this familiar landscape of that valley, which you have seen from the live-action films, but now [set against the backdrop of the Long Winter] it has this beautiful, frozen quality. Even when you get a bit of sunlight coming through, you still see the sheets of ice on the side of the cliff faces, and that frozen quality that Kamiyama brings to it visually makes it feel different. We know that silhouette and have seen that long ramp before, but this is different. The subtle changes are beautifully done.
    Philippa Boyens, Producer
    J.R.R. Tolkien does not write children's fairy tales, so a mature approach was taken towards the storytelling and the animation.
    Animation is not simply about drawing pretty pictures but also how characters move within the frame.
    The Lord of the Rings: The War of the Rohirrim takes place 183 years before the original trilogy helmed by Peter Jackson.
    Childhood friends Héra (Gaia Wise) and Wulf (Luke Pasqualino) become sworn enemies after their fathers engage in hand-to-hand combat with each other, resulting in one of them being killed.
    Assisting with concept design was Wētā Workshop.
    Serving as the narrative spine of the story is the father/daughter relationship between Helm Hammerhand and Héra.
    Even with the big battle sequences, it is the subtle details that immerse audiences in the world-building and characters.
    I believe that animation has the potential to elevate certain types of storytelling as a visual medium that can appeal to a general audience. One of the motivations for taking this film was this was a chance to do that and have that platform where the audience will be able to come in and at least check out how this kind of storytelling can be done by animation. It's not only about how attractive the drawing is or how good the art is, but it's about how people are moving within the frames of the animation; that connects with the technique of how you would want to do this.
    Kenji Kamiyama, Director
    Rather than have a well-known character as the protagonist, the decision was to explore the fate of the House of Helm Hammerhand and the ancient stronghold of the Hornburg, which figures prominently in The Lord of the Rings: The Two Towers as Helm's Deep, through the eyes of Hammerhand's daughter Héra. Héra is in the story, so she's not created from whole cloth, Boyens explains. I examined a lot of the characters in that story who didn't necessarily outlive it. When you look at the story and what sparks the conflict, it centered around her. Who might actually see through this conflict to its final ending? You begin to realize that it's this character, potentially. Héra is not without precedent within Professor J.R.R. Tolkien's world. Obviously, we are distinctively drawing upon Éowyn, but if you go back even further to the first Haleth, who was a woman, you could see that there is this tradition of quite strong women existing in the Rohirrim culture. Also, having an unnamed character does give you more freedom to be able to say, What happens to this character? Why is she so pivotal to this story? What is her relationship not only to her father but this antagonist? Once you start delving into that, the story came into focus fairly quickly.
    The thing that Kamiyama found, and which works beautifully on an emotional level, is there's a level of father/daughter story in this that is interesting and beautifully played by Brian Cox and Gaia Wise.
    The Oliphaunt causes battlefield mayhem, just like in the original trilogy.
    Helm's Deep is revisited, but in a different way. This is a much longer siege that we're dealing with in this film, set against the backdrop of the Long Winter, which is an event that affected all of Middle-earth, and a lot of people suffered through it, Boyens states. For the Rohirrim, it's one catastrophe on top of another. You have this familiar landscape of that valley, which you have seen from the live-action films, but now it has this beautiful, frozen quality. Even when you get a bit of sunlight coming through, you still see the sheets of ice on the side of the cliff faces, and that frozen quality that Kamiyama brings to it visually makes it feel different. We know that silhouette and have seen that long ramp before, but this is different. The subtle changes are beautifully done. Obviously, the gates are destroyed somewhere in this film because the gates are different than the ones you see in the live-action movies. Kamiyama pushed it even further with other settings, like Isengard, which we are so familiar with in connection with Saruman, who inhabited it. Now we're seeing it well before that, when the Orthanc is locked up, so what has risen around it are deserted guard towers, which have been taken over by another force of people. You get to see an Isengard that was once familiar, but he uses it in a much different way. Then you actually get to see even more details in stuff that we didn't get to see in the live-action films. We get to spend more time in the stables, weirdly. We get to see Edoras alight, which is quite shocking when you see that happen. We get to see the ruins of that city. Again, familiar, but then with this twist that Kamiyama brings to it, which is so beautifully done and visually stunning. The mood that he is able to capture in the film, you've got to see it in a big cinema. I'm telling you!
  • NEW TECHNOLOGIES MAKING THEIR MARK ON VFX AND ANIMATION
    www.vfxvoice.com
    By OLIVER WEBB
    There have been leaps in technological growth over the last few years in the VFX and animation industries. With new developments across the board in virtual production, real-time technologies and LED volumes, 2025 is looking bright, though the industry is still recovering from the effects of COVID-19 as well as the strikes. Further to the technological advancements, there are skills in demand and education available to those looking to delve into the industry. The skillset required for implementing new tools and techniques with a sharp eye on the future while navigating the present is pivotal to keeping up and staying ahead. Following is a variety of viewpoints from industry professionals crossing the street at the busy intersection of VFX and animation.
    Wētā FX doesn't use generative AI. Instead, for several years, they've been developing and utilizing machine learning, embedded into their artists' tools, such as Wētā's facial performance and image-based refinement systems. Wētā FX contributed effects to A Minecraft Movie. (Images courtesy of Warner Bros.)
    Richard Frances-Moore, Senior Head of Motion, Wētā FX
    At Wētā FX we've been innovating and using virtual production since the cave troll sequence in The Lord of the Rings: The Fellowship of the Ring. Virtual production as a visualization technique supporting live-action, animation and performance capture grew as a key part of our workflow on many projects, especially on the Avatar series. For a while, it seemed like virtual production was a no-brainer as it saved costs in post-production and provided a better on-set experience for directors and actors. Since then, practitioners and productions have learned more about the strengths and weaknesses of virtual production: high demand for upfront investment, complex scenarios and technical issues can lead to unplanned costs, causing some productions to become a little more cautious. However, with more industry understanding and standards, the scope for successful uses is now stabilizing, and ICVFX has become another piece of the filmmaker's toolkit that will no doubt continue to develop and improve. We are continuing to push on our overall virtual production system, working on the integration of our pre- and post-production workflows, as well as integrating Unreal Engine as an option so we can connect and collaborate with productions using that platform.
    [W]ith more industry understanding and standards [for virtual production], the scope for successful uses is now stabilizing, and ICVFX has become another piece of the filmmaker's toolkit that will no doubt continue to develop and improve.
    Richard Frances-Moore, Senior Head of Motion, Wētā FX
    SSVFX was nominated for an Emmy for its work on the FX series Shōgun. SSVFX completed hundreds of shots to bring CGI Osaka in the 1600s to life. (Images courtesy of SSVFX and FX Network)
    AI and machine learning are having a large impact on the industry at the moment. While we don't use generative AI, for several years we have been developing and utilizing machine learning, embedded into our artists' tools. Examples include our facial performance and image-based refinement systems, and demand for this technology is increasing. On set, AI allows us to provide real-time camera depth, improving the speed and quality of creating digital photogrammetry while reducing the footprint on set. Increased availability and the development of shared standards are also a big movement across the industry.
    With productions wanting the flexibility to share assets and work between facilities, this has led to the need for better platforms and systems like OpenUSD, MaterialX and even Unreal providing a form of shared format for some virtual production and previs workflows. We are embracing this wherever we can while maintaining flexibility and continuing to innovate where we still see there are opportunities for improving our work.
    Or-Bits is a 30-second TV commercial for a fictitious breakfast cereal. The AIE student-led production team was hired by a mock client to deliver the spot according to a detailed brief. Some AI-generated still images were used in the mock brief and appear in the commercial. All other work is 100% student-originated. (Images courtesy of AIE)
    Viktor Müller, CEO, Universal Production Partners (UPP)
    Is there any growth in terms of employment in the VFX industry? We are coming out of a disastrous time for our industry, and not just in VFX, but across the board. So many people lost their jobs last year. So many companies went under. It's a legitimate crisis. Despite that, we're seeing more VFX in film and TV than ever. In a sense, you'd think that would be good, but the films and shows you see now are overloaded with CGI elements, and the number of CGI artists working on big projects is so gigantic that they can't even fit them all in the end credits despite the long end titles we have now.
    The AIE VFX Class of 2023 won the 2024 Seattle Film Festival prize for Best Animated Short for their senior project, Spare Parts. (Images courtesy of AIE)
    I would say that virtual production or AI programming is probably the fastest-growing field. But in the end, they will take jobs from other disciplines. I think everyone knows that we're in what people are calling The Great Contraction. We had this boom-time of Peak TV and just tons of films being made for both theatrical and streaming, and now the bubble has burst and there aren't enough projects, even compared to previous years. The craziness after COVID brought new talent into the industry, which is great news but, at the same time, it has overloaded the market. Another downside is that this saturation hasn't necessarily improved the quality in the VFX industry but has rather increased average capacity. As we move forward into the new normal, companies will need to focus on efficiency and develop really good pipelines to stay successful.
    At UPP, we're always striving toward a more equitable and diverse workforce, and I'm very proud to report that we have gender parity at our studio. Having a diverse group of artists and technicians has always just seemed like smart business to me, but, in our industry, women in particular are seriously under-represented, especially in technical and leadership roles. We have numerous women running various departments, including the head of our film and TV department, and while we're happy to discuss every one of them, we also hope to get to a place where we don't need to talk about it at all because, frankly, it should just be nothing special. It should be the norm. But I think we're gradually moving in the right direction as an industry.
    Jonathan McFall, Head of CG and VFX Trainer, National Film and Television School (NFTS)
    VFX has been an evolving industry since the beginning, with pioneering and breakthrough technologies now making their way into summer blockbusters every year; however, a lot of the core departments and skills remain the same.
    A modeler makes the model, and an FX artist makes the FX, but things have become more sophisticated within each department with the introduction of USD into company pipelines, allowing for variants in the models, textures, lights, FX and more. Software like Unreal and Houdini is also playing a larger role in productions, and virtual production stages aren't just used for the biggest films anymore. Therefore, key skills like understanding USD workflows, ShotGrid, lighting, Houdini, virtual production, Unreal and, of course, comp will all help you to gain entry into the industry. These are highly in-demand skills.
    National Film and Television School (NFTS) VFX students in the U.K. working with a virtual production screen. (Image courtesy of NFTS)
    The industry may at one point have been more dominated by male figures, but these days it is becoming increasingly more diverse. The National Film and Television School strives to encourage and support people from all backgrounds to feel confident to pursue a career in visual effects, with outreach and over £1 million of funding available for U.K. applicants. There is such a wide range of skills that are needed across many roles, ranging from very creative roles in the art department and texturing to very technical roles in FX or pipeline. There are jobs for anyone who wants to work in VFX!
    Sir David Attenborough tests the technology at MARS Volume in the U.K. (Image courtesy of MARS Volume and Bild Studios)
    The National Film and Television School's (NFTS) Visual Effects (VFX) MA is renowned for the 360-degree practical training our students receive, learning creative problem-solving across the VFX pipeline. Our VFX Labs are cutting-edge, with dedicated workstations that replicate those used in the VFX industry. Each machine has current industry-standard software, including Maya, Unreal Engine, Nuke, Nuke Studio, Photoshop, Substance, Houdini, Resolve, RealityCapture and PTGui. The students have access to the studio 24 hours a day and are part of a small cohort who receive personal and group tuition, gain experience on set and an understanding of the entire production pipeline. The expansive site, tucked away in a small Buckinghamshire village, offers 18 dedicated shooting spaces for students, including a 7,000 sq. ft. main stage, a 4K television studio and a virtual production volume.
    Rowan Pitts, Co-Founder, MARS Volume and Bild Studios
    MARS is one of the U.K.'s longest-running virtual production brands, delivered by the experienced workflow specialists at Bild Studios. Since its launch in 2021, MARS Volume has become one of the bigger LED volumes in the U.K. (25.5 x 5m), with its facilities used by productions such as Netflix's 3 Body Problem, Disney's Culprits and Asif Kapadia's 2073, and the MARS team has been on set delivering virtual production for HBO's House of the Dragon and Apple TV+'s Masters of the Air.
    Participating and collaborating on open-source software has positioned DreamWorks to more easily maintain and share assets across TV series and features such as The Wild Robot. (Image courtesy of DreamWorks Animation and Universal Pictures)
    There's been a trend from bigger is better to recognizing that strategically-sized LED volumes are more cost-effective and work better overall for production. That's how our business has channeled over the last few years. We are around a 50/50 split between scripted drama and branded content.
    We are based at our permanent virtual production facility in West London, but the team is mobile and can take virtual production services on the road with them as pop-up solutions. MARS Volume is fully serviced, and we often get involved in the world-building of the digital environments for the shoots that come through. Our pop-up service involves building bespoke LED volumes for film shoots at other locations and providing a team to run and operate them as well.
    Virtual production certainly has an important place in the future of film production. As awareness grows around how a volume can be used, that unlocks producers' ability to understand the true value gained with virtual production shooting, and the return that they will receive from shooting on a volume. While it may appear more expensive initially than a traditional studio, if you factor in the savings that could be made, both in terms of time and efficiencies, and in terms of how much content can be shot within a single day, then virtual production can come into its own. We now see experienced producers shooting on MARS Volume who can generate a huge scene throughput by carefully pre-planning and scheduling, proving the efficiency gains to be made are real.
    Bill Ballew, Chief Technology Officer, DreamWorks Animation
    One of the trends we have seen in the industry over the last few years is an increase in the participation and collaboration on open-source software. With the widespread involvement in the Academy Software Foundation (ASWF) by both studios and third-party tool creators, the viability and usefulness of these projects have increased manyfold. Today, these serve as a basis for nearly every studio, utilizing standard data representations and sharing functionality that has become ubiquitous. From being an early adopter of the Linux operating system in the 1990s to open-sourcing our proprietary renderer, OpenMoonRay, in 2023, we have seen tremendous value in leveraging and contributing to these best-of-breed community solutions. This gives us two major areas of benefit: 1) It allows us to focus our engineering on efforts that differentiate our films, provide unique creative looks and empower our artists, and 2) The common industry building blocks and standards are positioning us to more easily maintain and share assets across our TV series and features.
    Ed Bruce, Senior Vice President, SSVFX
    It has always been the case that technology and creativity go hand in hand. Each year, new technology can help advance creative limitations or inspire the creativity to push further. Over the last few years, technical advances have been coming thick and fast. AI and machine learning are on a rapid growth trajectory, bringing many new tools into vendors' workflows and pipelines. It can be a challenge to navigate through to ensure a net benefit versus just adapting to change.
    Launched in 2021, MARS Volume is one of the more agile LED volumes in the U.K. The MARS team is mobile and can take virtual production services on the road. (Images courtesy of MARS and Bild Studios)
    Ultimately, our goal is about delivering beautiful storytelling imagery on time for the right price. If machine learning can play its part in that, then it is worth adapting to. At the moment, however, many of the tools can be a minefield to navigate regarding security, which is of key importance to us and our clients.
    Datasets must exist within the boundaries of a secure network, and any communication outside of that network comes with an incredible risk.
    At SSVFX, one area we specialize in is facial augmentation and de-aging. This is an area where we've seen AI advancements promising easier, faster and better outcomes. However, we've found that this is not the case compared with our proprietary tools and workflows, which we continue to adapt and build upon. We've found we still hit the quality, time and cost better than what the machine learning approach is currently delivering, plus there is no dataset or security concern and, more importantly, no infringement on actors' contractual concerns regarding AI. The ability to make nuanced changes to service the client's direction can be very challenging with the current AI tools. This isn't an issue within the traditional approach.
    Every year, new tools and technology will be created to support the VFX team. Some will be adapted and become a staple within our workflows; others will fall away. It's important for us at SSVFX to continue to stay informed and adaptive to these advancements.
    Joseph Bell, Principal, HDRI Consulting
    I think people were surprised by how slowly VFX work has started to pick up after last year's SAG-AFTRA and WGA strikes. Focus on the strikes masked an enormous drop in project commissioning by the major streamers compared with the year before, driven in part by pressure on the streamers to reach profitability. To give just one example, Disney+ spent around 40% less on commissioning content in 2023 than in 2022. And 40% fewer productions started principal photography in the U.S. in Q2 2024 than in Q2 2022. This dramatic decrease in the amount of film and TV commissioning, with the U.S. being especially hard hit, had a huge impact on employment in the visual effects industry worldwide. A lot of skilled VFX professionals have been out of work for many months, in some cases more than a year. With Disney's streaming services reaching profitability in mid-2024 and Paramount close behind, production should finally pick up. As an industry, this is a great time to reassess how we operate and what we can do to make companies and careers more resilient for the future.
    Countries with strong domestic film industries and those that are less reliant on work from U.S. productions to stay busy, such as Japan, Vietnam and to some degree France, have been less impacted than countries where a large majority of the spend on high-end VFX comes from the U.S. studios.
    Joseph Bell, Principal, HDRI Consulting
    UPP has become one of the most active and versatile visual effects and post-production houses in Central Europe, and has contributed VFX to Gran Turismo (above) as well as Barbie, Conclave, Five Days at Memorial and Blade Runner 2049, on which UPP CEO Viktor Müller served as Visual Effects Supervisor. (Image courtesy of Sony Pictures)
    It's difficult to highlight growth when so many VFX professionals around the world have spent large portions of 2023 and 2024 without work. Countries with strong domestic film industries and those that are less reliant on work from U.S. productions to stay busy, such as Japan, Vietnam and to some degree France, have been less impacted than countries where a large majority of the spend on high-end VFX comes from the U.S. studios. We are seeing growth in the number of roles for which knowledge of RT3D tools like Unreal Engine is becoming important.
    Knowledge of these same tools equips VFX companies and artists to work on VR/AR/XR, gaming and immersive projects. Among the various new technologies currently making their mark on the industry, real-time 3D is going to have the most far-reaching and enduring impact.
    Andy Romine, Senior VFX Instructor, AIE
    There continue to be perennial opportunities for junior artists in lighting and compositing roles, though skills like rigging, VFX and Creature FX are always in high demand. There's also been a recent surge in opportunities for traditional VFX artists who have experience working in real-time, e.g. Unreal Engine, for use in previs and virtual production pipelines.
    We model a real-world studio environment at AIE, so in addition to learning the technical skills required for a career in the VFX industry, students are taught to work in teams to real and simulated client briefs, manage asset creation and deadlines, and develop an online portfolio they can use to apply for a job. Students have access to state-of-the-art workstations and are instructed in industry-standard software including Maya, RenderMan, Houdini, Katana, Nuke and Substance Suite. Every week, our Lunchbox program brings in professionals to talk about their experience in the industry and provide our students with opportunities to network. We have regular contact with members of the Visual Effects Society, which AIE sponsors, who give Lunchbox talks, provide mentorship and detailed portfolio reviews.
    The ability to make nuanced changes to service the client's direction can be very challenging with the current AI tools. This isn't an issue within the traditional approach.
    Ed Bruce, Senior Vice President, SSVFX
    Because of Seattle's importance as a game development hub, and our parallel game art track, VFX students are exposed to real-time game engines like Unreal 5. Our students have gone on to work at a variety of VFX houses from boutiques to major studios. We've also had graduates migrate to the game industry with their crossover skills. Recent notable projects from grads include The Mandalorian, Deadpool & Wolverine, Doom Patrol and Destiny 2. The VFX Class of 2023 won the 2024 Seattle Film Festival prize for Best Animated Short with their senior project, Spare Parts.
  • LAURA PEDRO EXCELS AT THE MAGIC OF MAKING EFFECTS INVISIBLE
    www.vfxvoice.com
    By TREVOR HOGG
    Images courtesy of Laura Pedro.
    Laura Pedro, Visual Effects Supervisor
    Even though Society of the Snow thrust Laura Pedro into the international spotlight and garnered a nomination at the VES Awards as well as trophies from the European Film Awards and Goya Awards, she has been amassing an impressive portfolio that includes A Monster Calls, Way Down (aka The Vault) and Superlópez. Born and raised in Montgat, Spain, Pedro currently resides in Barcelona, where she is a visual effects supervisor for El Ranchito.
    I did not come from an artistic family, but I always liked photography. When I was little, my father gave me his camera, and I began trying to tell stories with it; that is when I figured out this is maybe something for me. When I was 16, our English teacher proposed that we make a project for the government that would give us money to learn English outside of Spain. My classmates and I decided to make a movie about the robbery of The Scream by Edvard Munch. We won the money to travel to Toronto and stayed there for a month to learn English and finish the movie. The intervening years have strengthened her self-confidence. The difference from then to now is I finally found my own voice.
    Photography teaches the fundamentals of making an image. I know a lot of things about visual effects, but in the end, it's all about light, Pedro notes. If you don't know anything about light, it's impossible to tell things with images. It's incredible how photography connects with everything. Originally, the plan was to study photography, not visual effects, at ESCAC (Escola Superior de Cinema i Audiovisuals de Catalunya), which is the alma mater of filmmaker J.A. Bayona. I had an accident during the first year of school; I lost all of the exams and couldn't specialize in photography. Because of that, I decided to go for my second selection, which was visual effects. The schooling was beneficial as it provided an overall understanding of the filmmaking process. When I was studying, every week we did a short film, and I was either producing or doing camera or directing. That caused me to work with different teams and learn various things for five years. When I finished school, it was easy for me to start as a visual effects supervisor and compositor, and know what the director wants and help them with visual effects.
    Robert Zemeckis has left a lasting cinematic impression. There are a lot of movies that I can't get out of my mind, like Death Becomes Her and Who Framed Roger Rabbit. In my career, I normally do invisible effects, but when I have the opportunity to do comedies with visual effects, it's fun for me to be part of that. Of course, you know that it's fake, but they tried to do it in the most realistic way. Innovation can come from unlikely sources. In Death Becomes Her, they developed how to do skin in CGI, then the same technology was used for Jurassic Park. For me, it's interesting and cool that you are developing a technology not doing Titanic, but a comedy. It's awesome to have the chance to create new technology.
    This has never happened in Spain because the industry is small, but we have the opportunity in the films we make to take the technology from the big productions and try to use it in our smaller ones, and teach the producers in Spain that they can trust these new technologies and ways of working with visual effects.
    Pedro on a set for The Vault (aka Way Down), which is part of the supporting chamber located under the Bank of Spain. (Photo: Jorge Fuembuena)
    It's important to have role models. Now that we have the schools, maybe in 10 years I will work with more supervisors who are women. It's not about gender. It's more about the age you start doing visual effects or become a visual effects supervisor, because I'm 34 and other supervisors I know are 40 or 50. We are not of the same age and have different ways of thinking.
    Laura Pedro, Visual Effects Supervisor
    Over the past decade, a better understanding of visual effects has taken root in the Spanish film industry, where digital artists are brought into pre-production to properly plan for complex shots rather than being seen simply as post-production fixers. Visual effects are in all of the productions, so it's easy to work continually, Pedro states. There are more visual effects companies that do television commercials trying to upgrade to narrative fiction. Television and streaming have become a driving force in Spain. Here, the film productions are small, so television and streaming productions allow us to continue working through the year. Maybe you do a film and at the same time two TV shows.
    Filmic visual effects have made their way to the small screen. The difference when you do a project like Game of Thrones or Ripley is that there's a lot of work in pre-production trying to find the perfect design with the cinematographer and production designer in the most cinematic way, Pedro remarks. Other projects work faster. One has to be willing to adapt. In the end, every project, director and producer is different. It's like a new adventure. When I begin working with a new client, I need to have a lot of patience, try to understand and be faster because I only have three months. Normally, I work with visual reference found on the Internet or pictures or videos taken with my camera. I have this capacity to find what is exactly in the mind of the filmmaker with the reference that I have imagined and later start working with our team doing the concept.
    Pedro in the Andes in Chile while shooting the scenes where Nando and Canessa attempt to find help in Society of the Snow. (Photo: Quim Vives)
    For Society of the Snow, the art department built two fuselages and the previs department at El Ranchito helped director J.A. Bayona design each shot of the crash. (Photo: Quim Vives)
    Pedro participated as Visual Effects Supervisor in the Spanish interpretation of the superhero genre, which resulted in Superlópez (2018).
    In 2013, Pedro made her debut as a visual effects supervisor on Barcelona nit d'estiu (Barcelona Summer Night) by director Dani de la Orden, and she would reunite with him a decade later for Casa en flames (A House on Fire), which is currently the most viewed movie in Catalonia. A major career turning point occurred when the emerging talent was recruited by a prominent Spanish visual effects company. I was doing a short film for a Spanish singer named David Bisbal, and El Ranchito called me to begin working with them on A Monster Calls, Pedro recalls.
    Félix Bergés [Founder and President, El Ranchito] is my mentor, and I learned from him that it's better to start with the real world and plates, and after that begin working with CGI, because that mixture works better for the human eye. Also, he gave me the power to say, No, when I'm on set.
    Pedro and Special Effects Supervisor Pau Costa with their Goya Award for Best Special Effects for Society of the Snow in 2024. (Photo: Papo Waisman)
    Visual Effects Supervisor Félix Bergés, Special Effects Supervisor Pau Costa and Pedro celebrate Society of the Snow winning the Goya Award for Best Special Effects in 2024. (Photo: Ana Belén Fernández)
    Visual Effects Supervisor Félix Bergés and Pedro on the set of A Monster Calls (2016), which was their first project together. (Photo: Quim Vives)
    Pedro after winning her first Goya Award in 2019 for Superlópez, which was shared with Special Effects Supervisor Lluís Rivera.
    Personal highlights include a children's fantasy tale, a comic book adaptation and the recreation of a historical event. It's not common in Spain to do a movie about a superhero like Superlópez, Pedro observes. We had to build a robot that was 10 to 12 meters tall. Before Superlópez I worked on A Monster Calls, where we needed to build a monster that was also 12 or 13 meters tall, so I knew how to film a movie about this difference of dimensions and create something really big. Society of the Snow is a movie that has entirely invisible visual effects. We achieved that by traveling to the Valley of Tears, doing all of the environments with real photography, managing all of this material and developing the tools to work with five vendors at the same time while maintaining consistency. It was a lot of work.
    Nowadays, the protégée has become a mentor. It's important to have role models, Pedro states. Now that we have the schools, maybe in 10 years I will work with more supervisors who are women. It's not about gender. It's more about the age you start doing visual effects or become a visual effects supervisor, because I'm 34 and other supervisors I know are 40 or 50. We are not of the same age and have different ways of thinking. Patience is a key trait. The most important thing is to be yourself and talk about things. I continue to learn by reading books and watching films. I try to remain connected with the technology and new tools, but it's completely impossible to know everything. Real-time and machine learning have introduced new ways of working. There is a good and bad way of using technology. We need to be calmer because we rely on each other in the end to do the things that we love, which in turn creates an emotional response from the audience.
  • WICKED IS A COLLAGE OF DIFFERENT LENSES AND TALENTS
    www.vfxvoice.com
    By TREVOR HOGG
    Images courtesy of Universal Studios.
    Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) get a private tour of the Royal Palace of Oz.
    In recent years, exploring the backstories of iconic villains has become more in vogue with the release of Maleficent, Joker and now Wicked, a Universal Pictures production that brings the Broadway musical adaptation of Wicked: The Life and Times of the Wicked Witch of the West by Gregory Maguire to the big screen. No stranger to musicals is filmmaker Jon M. Chu, who has been making them ever since he was a USC film school student, but this time around, the scale is a throwback to Hollywood classics such as The Wizard of Oz, with the added benefit of the visual effects industry, which didn't exist back then.
    There is this grandiose nature to Wicked, but from the beginning, we always wanted it to feel touchable and immersive, director Jon M. Chu explains. We wanted to break the matte painting of Oz that we have in our mind. What happens if you could live in it? What happens if you can touch the dirt and textures? Visual effects are extremely powerful to be able to do that. Of course, we worked hand in hand with building as well by planting nine million tulips and having a real train and Wizard's head, but we're all in it together.
    Jonathan Bailey as Prince Fiyero performs in front of the rotating university library set, which was constructed by the special effects team led by Paul Corbould.
    Massive sets were built. I firmly believe you've got to build as much set as you possibly physically can, or do as much for real as you possibly physically can, because the real photography on that set informs visual effects on how everything should look, states Production Designer Nathan Crowley. That is fundamental. You can't just put a bluescreen up because you're going to get enough of that anyway. You've got to try to balance it. The act of physical construction is extremely informative. Crowley says, The thing is, if you do a concept and don't build it, then you miss out on the art direction of making it. Doing concept art in 3D was imperative. We will build, paint and finish a 3D model and will deliver it rendered to Pablo Helman [Visual Effects Supervisor]. Pablo has to rebuild it because visual effects have to go into a lot more specific areas, but at least he knows what it should look like. We also go scout places, and even if we don't film that place, we'll say to Pablo and Framestore, which does a lot of the environments, That's what we need it to look like. We need to go to the south coast down to Poole and Bournemouth and get that set of cliffs, and that becomes Shiz. Emerald City is a hard one because you're going much higher [digitally]. I would try to build enough below 50 feet so he would have textures.
    Cinematographer Alice Brooks stands behind director Jon M. Chu as he discusses a shot she captured with custom-made lenses by Panavision.
    Cynthia Erivo decided to go with practical makeup for the green skin of Elphaba, which was then finessed digitally in post-production.
    Special Effects Supervisor Paul Corbould built The Emerald Express, which was designed as the personal motorized carriage for the Wizard of Oz.
    Unreal Engine influenced the shot design. Emerald City was the last set that was built and was behind schedule, states Cinematographer Alice Brooks.
    We had this idea for when Elphaba and Glinda get off of the train, and we start to push down the stairs, and it all becomes this one long Steadicam shot that ends on a crane that lifts up. We had been working on this for months but couldn't get into the set to start rehearsals because all of the construction cranes and painters were in there. What we did do was take the game controller, go into Unreal Engine and start designing the shot. When walking the set in Unreal Engine, we realized that this big stepping-onto-crane move didn't show off the city in any spectacular way; that being low was the way you saw the city in this amazing way. Then, we threw out our amazing Steadicam idea, which our A camera operator was bummed out about, and we created something new in Unreal Engine that was perfect.
    Glinda, portrayed by Ariana Grande, makes use of her bubble wand.
    This aerial shot of Munchkinland showcases the nine million tulips that were planted.
    Numerous production meetings were held to discuss how to deal with the green skin of the future Wicked Witch of the West, Elphaba, portrayed by Cynthia Erivo. We wanted to have all of the options on the table, then work with Cynthia herself to know what she needed as an actor, Chu explains. We did a lot of tests with a double to show Cynthia real makeup, semi-makeup where you only do the main areas, and completely non-green makeup because we knew that makeup every day for that long of a shoot could be grueling and would also take away time from actually shooting. Cynthia was like, I need the makeup. Of course, there is some cleanup that we needed to do because sometimes her hands were thinner on certain days than others. The green skin had to look believable and work in any lighting condition. David Stoneman, who is a chemist who makes products for our industry, took my green design, which was from products called Creamy Air and Illustrator, and the discontinued product that I had found, and put three drops of yellow neon into the base, explains Hair Designer/Makeup Designer/Prosthetics Designer Frances Hannon. It reflected off the dark skin tone and made it look like it was her skin, not like it was green painted on the surface, and more than that, it worked in every light.
    A lens flare, rainbow and the Yellow Brick Road are incorporated into an establishing shot of the Emerald City.
    The head of the Wizard of Oz was a massive animatronic puppet hung from the ceiling of the studio.
    Prosthetic makeup was required to show the characters of Boq (Ethan Slater) and Fiyero (Jonathan Bailey) being transformed into Tin Man and Scarecrow. One of my most important things was working with Mark Coulier [Prosthetic Makeup Designer] again, Hannon remarks. For Tin Man, we wanted to achieve something sympathetic because it should have never happened to Boq. In our story, Elphaba's spell goes wrong in Nessarose's [Marissa Bode] office, and everything metal in that room attaches to Boq; his breast plate would be the tray on the table, and his hands become the thimbles, salt and peppers. Then, the visual effects took over because all the joints were blue. With Scarecrow, Jon and Mark particularly wanted to keep Jonathan Bailey's face shape. We also kept his nice teeth and natural eye color for Scarecrow. I used contact lenses on Jonathan for Fiyero, so we had a nice change there.
Then, for his head element, I put masses of gold blonde through his look as Fiyero, which carried onto Scarecrow in a straw-colored wig; that kept Fiyero attractive because Elphaba and he fall in love.Most of the 2,500 visual effects shots were divided between ILM and Framestore, with other contributors being OPSIS, Lola VFX, Outpost VFX and BOT VFX. The CG creatures were difficult because they also talk, but they are mainly animals, Helman remarks. They dont walk on two legs. If its a goat that talks and is a teacher, its basically a goat if you look at it, then he talks. It was a fine line stylizing the talking so that it doesnt feel like a completely stylized character, but also finding the expression, the eyebrows, eyes and mouth, the phonemes, and how articulate those creatures are. We had an animal unit of about 10 people or so that would play animals on set, and we would shoot a take or a few takes with them. We had a transformation scene where the monkey transforms and gets wings, so we had the whole animal unit performing and being directed by Jon. Sometimes, the second unit would stay there to shoot plates. Besides the music, dancers, choreography and huge sets, then there were the animals.The mandate was to capture as much in-camera, which gave Nathan Crowley the freedom to construct massive sets.Magic was always treated in a grounded manner. Its not a cutesy, glowing, sparkling thing, Helman notes. There is nothing wrong with those kinds of things; its just that this version of Oz is not magical. You have to remember, when you go back to the original story, the Wizard of Oz is not really a wizard. Creative solutions had to be applied to achieve the desired effect. How do you make a book glow without making it look completely fantastical and cartoony? Helman explains, Maybe what you do is provide a language inside of the book with words that may become golden that wasnt golden in the beginning. So, you see a transition between a word that is on a black ink parchment to something golden that produces a glow and is completely grounded. Broomsticks are a form of aerial transportation. We worked with the stunt department to get the center of gravity correct and to be able to move the actors around. Cynthia Erivo wanted to do her own stunts, so she did. All of that wirework was closely planned. There are two things: Theres the center of gravity and what the body is doing in the air, and the lighting. If we get those two things right then were fine, Helman says.Water was seen as the key method of transportation to Shiz.Elphaba (Cynthia Erivo) begins to master the art of flying a broom.An establishing shot of Shiz University.Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) decide to take a more relaxed approach to flying by taking off in a hot air balloon.A major accomplishment was the practical realization of the Emerald City Express, which is the personal train of the Wizard of Oz. It was Nathan Crowleys vision, states Special Effects Supervisor Paul Corbould. We built the running gear and the track and motorized it. Its hydraulically driven. Construction clad it with all of the fiberglass panels. The motion was repeatable. The train could be programmed to drive to a particular spot, run down and stop at a position, and when told to start again, make its next move and run to the end of the track. You can take it back to the beginning and keep on doing that, remarks Special Effects Design Supervisor Jason Leinster. Equally impressive was the construction of the Wizards head. 
Jason reverse-engineered the scale model and changed the electric servos to hydraulic rams and a whole control system, Corbould explains. It progressed from that. The head was suspended from the ceiling of the stage. It was a civil engineering project to have something like that floating in the middle of space, Leinster notes. It had 22 axes and was puppeteered by one person. Most of it was done live and was often changing. Anti-gravity architecture serves as the basis for Kiamo Ko, which is a castle located on the peak of Knobblehead Pike. Other complicated rigs included the rotating library. Because it was first up in the schedule, we built one wheel and ladder, and the dancers with Chris Scott [Choreographer] rehearsed on that one, Corbould states. As we built another one, they were rehearsing on that with more dancers, and we built a third one. It took three months. An amazing prop was the fountains. The petals opened up, and they wanted water to come out, Corbould remarks. We've got these hydraulic motion bases, and in the middle is a slip ring that allows you to turn a rig round and round without winding the cable up. We had to take a slip ring off, which you normally run hydraulic oil through, and put that on the fountain. It ruined it because we were running water through it; that was quite a challenge. A bricklaying machine gets pulled by a bison. There was no bison to pull it, so the machine was self-driven, Leinster reveals. You could sit back and steer it. We had a roadway of foam bricks rolled up inside, and as the machine drove forward, it unrolled the Yellow Brick Road. Eventually, it drove off into the sunset, being pulled by the bison. You probably won't realize that is an effect. Madame Morrible (Michelle Yeoh) has the ability to control the weather, so there is a cloud motif to her hairstyle. To convey the impression of a floating castle, the concept of anti-gravity architecture was developed. Kiamo Ko isn't just a castle, Crowley observes. It's a defiant emblem of a bygone era, a testament to the forgotten magic that once pulsed through Oz. Its architecture, though ancient, utilizes lost principles of levitation, defying gravity yet remaining grounded in a sense of order and purpose. The key to Kiamo Ko's defiance lies not in defying gravity entirely but in manipulating it subtly. Imagine a series of inverted arches, their points reaching skyward. These arches wouldn't be perfect mirrors of one another; instead, they possess a slight asymmetry, a calculated tilt that interacts with the forgotten magic of the land, generating a gentle, constant lift. This subtle slant would also provide a visual cue, hinting at the castle's orientation even from a distance. By incorporating these design elements, Kiamo Ko transcends the trope of a generic floating castle. It becomes a character itself, a silent testament to a forgotten age and a beacon of hope for Elphaba and Fiyero's new beginning. Skies played a major role in setting the proper tone for scenes. Throughout the whole movie, there is this idea that the sun is always rising for Glinda (Ariana Grande) and setting for Elphaba (Cynthia Erivo). Jonathan Bailey plays the role of Fiyero, who goes on to be transformed into the iconic Scarecrow. Lenses were developed specifically for the production (which evolved into the new series of Ultra Panatar II) that were paired with the ARRI ALEXA 65 cameras. Jon told me that he wanted Wicked to be unlike anything anyone had ever seen before, and the photography needed to represent that, Brooks states.
I was on the movie so early I was able to design them with Dan Sasaki at Panavision in Woodland Hills. We called them the Unlimiteds, after Elphaba singing Unlimited in Wicked, because at the time they didn't have a name. Those lenses capture all of the pictures that Nathan, Jon and I put together for so many months, and they wrap the light beautifully on our actors. Usually, you're matching close-ups on the same lens, but on Elphaba, we shot her on a 65mm lens and Glinda on a 75mm lens, and we matched the size, but those two lenses did different things to their faces. Oz is a different place, and something is a little bit off everywhere. Our A and B 65mm lenses were not the same. It was a collage of lenses. Each one had such a different characteristic, and that made our movie feel different. Elphaba even has one line in the movie that goes, Some of us are just different. That's what we want our Oz to be. Musical numbers were as complicated to plan and execute as action sequences. Various animals are part of the faculty at Shiz University, with Peter Dinklage doing facial capture and the voice of Dr. Dillamond. Apple Vision Pro is an essential part of the editorial process. I am overseeing the edit in the Vision Pro, Chu explains. Instead of being trapped in a monitor on a desk, which isn't the most creative, I can be like I am in the room with Myron Kerstein [Editor], where I'm walking around or sitting on the couch. We can do visual effects approvals there too. I can bring it on and draw with my finger where certain areas need to be improved or whatnot. Hannon looks forward to seeing everything being brought together. For me, it's seeing those finishing touches. The sets were 60 feet high, and then we would have bluescreen. I do believe Paul Tazewell [Costume Designer] and myself, to the best of our abilities, gave Jon the spectacular, extraordinary and timeless look that he was after. Wicked spans two movies, with the first one centered around the song Defying Gravity and the second around For Good. It's in two parts, but we shot the whole movie in one lifetime! Helman laughs. I look at every project as a traumatic project where you develop these scars and learn from those scars, but you wear them proudly. Teamwork reigns supreme for Chu. Each department can make everything, but the reality is that we need to work together to make the thing that none of us can make alone. I feel lucky to be working with a team at the highest level, with the bar at the highest place for us to cross. It has been an amazing journey.
  • STREAMING AND VFX: CULTIVATING THE ABILITY TO ADAPT TO CONSTANT CHANGE
    www.vfxvoice.com
By CHRIS McGOWAN. Shōgun (Image courtesy of FX Network) Despite the lingering effects of 2023's writers' and actors' strikes, the streamers continue to disrupt the industry. Streaming has increased the demand for VFX work and accelerated the growth of all parts of the production and post-production industries, says Tom Williams, Managing Director of DNEG Episodic. Among the leading streamers, Netflix had 277.65 million paid subscribers worldwide as of the second quarter of 2024, according to Statista research, an increase of over eight million subscribers compared with the previous quarter, and Netflix's expenditures on content were expected to stabilize at roughly 17 billion U.S. dollars by 2024. Also, by 2024, the number of Amazon Prime members in the United States was projected to reach more than 180 million users. In Q2 2024, the number of Disney+ subscribers stood at around 153.6 million, according to Statista, while the combined number of subscribers to Warner Bros. Discovery's Max (formerly HBO) and Discovery+ services surpassed 103 million. Apple TV+, Hulu, Paramount+ and Peacock are among the others with significant viewers. Such subscriber numbers have bankrolled a lot of visual effects and animation. Streaming has been a game-changer for the VFX industry. It has significantly increased demand. With platforms constantly producing new content, visual effects studios have more opportunities than ever before, comments Valérie Clément, VFX Producer at Raynault VFX Visual Effects & Environments. The rise of streaming has also shifted the focus from traditional films to high-budget series, which has diversified the types of projects we work on at Raynault. Jennie Zeiher, President of Rising Sun Pictures (RSP), remarks, The advent of streaming had a huge impact that we're still feeling today, not only for global consumers, but studios, production companies, TV channels, post houses, VFX studios; the entire industry was impacted. [It was a major disruption in the industry] that changed how content was consumed. The Last of Us (Image courtesy of HBO) BUDGETS & MODELS Streaming changed the way the industry was divided up and took away market share from broadcast and theatrical, according to Zeiher. She explains, In 2017, RSP's work was still wholly theatrical. We predicted that over the course of that year, we would be progressively taking on more streaming projects and that the year following, our work would be distributed 50/50. This indeed played out, and it tells the story of how a disruptive change can affect a business model. Fast forward to today, the industry is more complex than ever, made more so by the fact that streaming opened up distribution to a global, multi-generational audience, which is more diverse than ever. Everyone is more budget-conscious at the moment, which is not a bad thing for VFX as it encourages more planning and the use of previs and postvis, which helps everyone deliver the best possible end product, Williams says. We are a technology-driven industry that is always moving forward, combined with incredible artists, so I think we will always see improvements in quality. Zeiher adds, I think studios are still trying to settle on their model. There are fewer big hits due to diversity in taste, and there are more risks around greenlighting productions at a higher price point. What made a hit five or 10 years ago isn't the same as it is today. There is more diverse product in the pipeline to attract more diverse audiences.
The streamers are producing high-end series, but they are more concentrated among a handful of studios. 3 Body Problem (Image courtesy of Netflix) House of the Dragon (Image courtesy of HBO) Foundation (Image courtesy of Apple TV+) The Lord of the Rings: The Rings of Power (Image courtesy of Prime Video) The Boys (Image courtesy of Prime Video. Photo: Jan Thijs) SHARING WORK Productions normally split work between multiple vendors, Zeiher notes. This work can be sensitive to timing and schedule changes. Therefore, VFX vendors need to have a plan on how they manage and mitigate any changes in schedule or type of work. Besides capability and the quality of the creative, this is the biggest singular challenge for VFX vendors and is the secret to a successful studio! Zeiher adds, Studios have always split work between multiple vendors, and only in limited scenarios kept whole shows with single vendors, and this continues to be the trend. The studios are splitting work among their trusted vendors who have the capability in terms of crew and pipeline to hit schedules and manage risks. The increase in work has meant that more shows than ever before are being shared between different VFX houses, so that will add to the cooperation. Being a relatively young industry, it doesn't take long to find a mutual connection or 10 when you meet someone else from VFX at an event, Williams says. Comments Wayne Stables, Wētā FX's VFX Supervisor on House of the Dragon Season 2, I'm not sure that I've seen a big change [in business and production models]. We bring the same level of creativity and quality to everything we do, be it for feature film or streaming, and use the same tools and processes. I approach it the same way as I would working on a film. I think episodic television has always pushed boundaries. I remember when Babylon 5 came out [and] being amazed at what they were doing, and then seeing that ripple through to other work such as Star Trek: Deep Space Nine. Fallout (Image courtesy of Prime Video) In Your Dreams. Coming in 2025. (Image courtesy of Netflix) The Wheel of Time (Image courtesy of Prime Video) HIGHER EPISODIC QUALITY Working with the VFX studios, the streamers have set the visual effects bar high by bringing feature film quality to episodic television. Game of Thrones comes to mind despite starting before the streaming boom. It revolutionized what viewers could expect from a series in terms of production value and storytelling. Later seasons had blockbuster-level budgets and cinematic visuals that rivaled anything you'd see in theaters, Clément says. Netflix has also made significant strides with shows like Stranger Things, which combines appealing aesthetics and compelling storytelling, and The Crown, known for its luxurious production design and attention to detail. Also, series like Westworld and Chernobyl both deliver sophisticated narratives with stunning visuals that feel more like feature films than traditional TV. These are just a few examples, of course. The range of projects that have made a significant impact in the streaming world is vast. Zeiher also points out the streaming titles The Rings of Power, Avatar: The Last Airbender, Shōgun, Monarch: Legacy of Monsters, Loki Season 2, Fallout [and] the Star Wars universe, with recent series such as Andor, Ahsoka and The Acolyte as having brought feature-film quality to episodic. Stables comments, As the techniques used on big visual effects films have become more common, we have seen more high-end work appear everywhere.
Looking at work in Game of Thrones and then, more recently, Foundation and through to shows like Shōgun. And, of course, I am proud of our recent work on House of the Dragon Season 2, Ripley and The Last of Us. EXPECTATIONS The expectation of quality never changes; showrunners, writers and directors can spend years getting their visions greenlit, and no one is looking to cut corners. We all want to do our best work, regardless of the end platform, Williams says. Regarding the delivery dates for series episodes, Stables comments, I haven't ever found the timeframes to be short. The shows tend to be very structured with the fact that you have to deliver for each episode, but that just brings about a practicality as to what is important. As with everything, the key is good planning and working with the studio to work out the best solution to problems. Clément says, While the compressed timelines can be challenging, the push for high-quality content from streaming platforms means that we are constantly striving to deliver top-notch visuals, even within tighter schedules. This is always exciting for our team. Sakamoto Days. Coming in 2025. (Image courtesy of Netflix) A Knight of the Seven Kingdoms: The Hedge Knight. Coming in 2025. (Image courtesy of HBO) CHANGES IN THE STREAMER/VFX RELATIONSHIP I think that showrunners and studios are seeing that it is now possible to create shows that perhaps in the past were not financially feasible. So, we are developing the same relationships [with the streamers] that we have had with the film studios, seeing what we can offer them to help tell their stories, Stables states. Relationships can be reciprocal, or they can be transactional, Zeiher observes. In VFX, we very much operate in a reciprocal relationship with the studios and their production teams; it's a partnership at every level. Our success is based on their success and theirs on ours. Knuckles (Image courtesy of Paramount+ and Nickelodeon Network) GLOBAL COOPERATION Streaming is enhancing global cooperation among VFX studios by creating a greater need for diverse talent and resources. Clément says, As streaming platforms produce more content, studios around the world are teaming up to manage the growing amount and complexity of VFX work. Advances in remote work technology and cloud tools make it easier for teams from different regions to collaborate smoothly and effectively. Zeiher explains, RSP's work on Knuckles is a great example of global, inter-company collaboration. Instead of using a single vendor, the work was split between several, mostly mid-size, vendors. The assets were built to a specification and shared using Universal Scene Description, allowing asset updates to be rolled out simultaneously across vendors and providing a consistent look across the characters. Paramount's approach to Knuckles was very smart and could be indicative for future workflows. The Witcher: Sirens of the Deep. Coming in 2025. (Image courtesy of Netflix) VFX is a tumultuous industry and, off the back of the WGA and SAG-AFTRA strikes, we've entered a time of consolidation, says Zeiher. Studios, often backed by private equity, are acquiring small to mid-size studios. This is helping them to distribute work globally across many jurisdictions. Dream Machine is an example of this new collaborative model with its recent acquisition of Important Looking Pirates and Cumulus VFX, joining Zero, Mavericks and Fin Design.
Likewise, RSP has its sister studios FuseFX, FOLKS and El Ranchito under its parent company Pitch Black; it's a new form of global collaboration: mid-size studios with different offerings across brands and locations that can collaborate under one banner. I think that the streaming distribution model was the first disruption, and that distribution continues to evolve, Zeiher comments. The production model may now be disrupted through the use of GAI. Combining the distribution evolution, audience consumer changes and using GAI in production, we're in for lots more changes in the year(s) to come. Clément states, As streaming platforms experiment with new content formats and distribution methods, VFX studios will adapt to different types of media and storytelling approaches.
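The cross-vendor asset sharing Zeiher describes on Knuckles rests on Universal Scene Description's referencing and layering model. The sketch below is a minimal, hypothetical illustration of that idea using the open-source pxr Python API; the file paths, prim names and placement values are invented for illustration and are not Paramount's or RSP's actual setup. The point is simply that a shot file points at a published asset by reference, so a new asset publish can be rolled out to every vendor while each vendor's shot-level opinions survive.

```python
# A minimal sketch (hypothetical paths and names): one vendor's shot file
# referencing a character asset published to a shared specification.
from pxr import Usd, UsdGeom, Gf

ASSET_URI = "publish/characters/knuckles/knuckles_v012.usd"   # published asset (assumed path)
SHOT_FILE = "shots/sq010_0040/sq010_0040_anim.usda"           # this vendor's shot layer

stage = Usd.Stage.CreateNew(SHOT_FILE)

# Bring the shared character in by reference; when the publisher rolls out
# v013, repointing this one reference updates the asset for this shot while
# the shot's local opinions below are preserved.
char = stage.DefinePrim("/World/chars/Knuckles", "Xform")
char.GetReferences().AddReference(ASSET_URI)

# Shot-specific work is authored as local opinions layered over the
# reference, for example placing the character in the scene.
UsdGeom.XformCommonAPI(char).SetTranslate(Gf.Vec3d(0.0, 0.0, 5.0))

stage.GetRootLayer().Save()
```

In practice the same composition arcs also let a supervising studio sublayer vendor deliveries into a master stage, which is what keeps the character looking consistent across facilities.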
  • AI/VFX ROUNDTABLE: REVOLUTIONIZING IMAGERY THE FUTURE OF AI AND NEWER TECH IN VFX
    www.vfxvoice.com
    By JIM McCULLAUGHHere features a de-aged Tom Hanks and Robin Wright. Their transformations were accomplished using a new generative AI-driven tool called Metaphysic Live. (Image courtesy of Metaphysic and TriStar Pictures/Sony)The VFX industry is still in the formative stage of a revolutionary transformation, driven by rapid advancements in artificial intelligence (AI) and its tech cousins VR, Virtual Production, AR, Immersive and others. As we begin 2025, AI promises to redefine both the creative and technical workflows within this dynamic field. To explore the potential impacts and necessary preparations, a roundtable of leading experts from diverse corners of the global VFX industry brings insights from their experiences and visions for the future, addressing the critical questions.Q. VFX VOICE: How do you foresee AI transforming the creative and technical workflows in the visual effects industry by 2025, and what steps should professionals in the industry take today to prepare for these changes? Are we entering AI and Film 3.0, the phase where filmmakers are figuring out workflows that put together a string of specialized AI tools to serially generate an actual project? Still, lots of fear (era 1.0) and cautious experimentation (era 2.0), but most forward-looking are figuring out actual production processes.With the help of Metaphysic AI, Eminems music video Houdini created a version of Eminem from 20 years ago. Metaphysic offers tools that allow artists to create and manage digital versions of themselves that can be manipulated. (Images courtesy of Metaphysic and Interscope Records)Blue Beetle marked the first feature film where Digital Domain used its proprietary ML Cloth tool, which captures how Blue Beetles rubber-like suit stretches and forms folds and wrinkles in response to Blue Beetles movements. (Image courtesy of Digital Domain and Warner Bros. Pictures)A. Ed Ulbrich, Chief Content Officer & President of Production, MetaphysicBy 2025, AI will profoundly reshape the visual effects industry, enabling creators to achieve what was once deemed impossible. AI-powered tools are unlocking new levels of creativity, allowing artists to produce highly complex imagery and effects that were previously out of reach. These innovations are not only pushing the boundaries of visual storytelling but also drastically cutting costs by automating labor-intensive tasks and streamlining workflows.Moreover, AI will accelerate production and post-production schedules, transforming the entire filmmaking process. With AI handling time-consuming tasks, teams can focus more on the creative elements, leading to faster, more dynamic productions. To stay ahead, professionals should embrace AI, continuously learning and adapting to rapid advancements, ensuring they are prepared to harness these tools to their fullest potential. AI-powered filmmaking tools are like jet fuel for creativity.Fuzzy Door Techs ViewScreen in action from the Ted TV series. ViewScreen Studio is a visualization tool that enables real-time simulcam of visual effects while ViewScreen Scout is an app for iPhone. ViewScreen Studio visualizes and animates a complete scene, including digital assets, in real-time and for multiple cameras simultaneously. (Image courtesy of Fuzzy Door Tech)Harrison Ford transforms into Red Hulk for Captain America: Brave New World. (Image courtesy of Marvel Studios)A. 
Lala Gavgavian, Global President & COO, Digital Domain AI tools are already making strides in automating rotoscoping, keying and motion capture cleanup, which are traditionally labor-intensive and time-consuming tasks. In 2025, these tools will be more sophisticated, making post-production processes quicker and more accurate. The time saved here can be redirected to refining the quality of the visual effects and pushing the boundaries of whats possible in storytelling. AI has the possibility of being added to the artists palette, allowing expansion to experiment with different styles in a rapid prototyping way. By harnessing the power of AI, VFX professionals can unlock new levels of creativity and efficiency, leading to more immersive and personalized storytelling experiences.We are indeed moving into what could be considered the AI and Film 3.0 era. This phase is characterized by transitioning from fear (1.0) and cautious experimentation (2.0) to practical application.Filmmakers and VFX professionals are now figuring out workflows integrating specialized AI tools to create full-fledged projects. These tools can handle everything from pre-visualization and script breakdowns to real-time rendering and post-production enhancements. However, this transition is not without its challenges. There will be concerns about job displacement and the ethical implications of AI-generated content. To address these issues, the industry must adopt a balanced approach where AI augments human creativity rather than replacing it. Transparent discussions about the role of AI and its ethical implications should be held, ensuring that the technology is used responsibly.A. Brandon Fayette, Co-Founder & Chief Product Officer, Fuzzy Door TechBy 2025, AI is poised to significantly transform both creative and technical workflows in the visual effects industry. AIs impact is already evident in the entertainment sector, and it is set to become the standard for automating repetitive tasks such as shot creation and rendering. This automation is not limited to VFX; we cansee AIs efficiency in code generation, optimization, testing and de-noising audio, images and video. Technical workflows will become more flow-driven, utilizing AI to dynamically adapt and drive the desired creative results. This means AI will assist increating templates for workflows and provide contextual cues that help automate and enhance various stages of the creative process.AI is advancing rapidly, with new tools and techniques emerging almost daily. To stay ahead of these changes, VFX professionals should remain aware of new trends in AI and generative content.Continuous learning and adaptation will be crucial. However, the industry needs to establish standards and guidelines to ensure AI complements rather than compromises the artistic process. Our focus with the ViewScreen family of ProVis tools is on usingAI to support and enhance human creativity, not replace it. By improving processes across production workflows, AI can make jobs easier while respecting and preserving the craft and expertise of entertainment professionals.With GPU-accelerated NVIDIA-Certified Systems combined with NVIDIA RTX Virtual Workstation (vWS) software, professionals can do their work with advanced graphics capabilities from anywhere, able to tackle workloads ranging from interactive rendering to graphics-rich design and visualization applications or game development. (Image courtesy of NVIDIA)Examples of joint deformations before and after AI training shapes. 
(Image courtesy of SideFX)A. Nick Hayes, ZEISS Director of Cinema Sales, U.S. & CanadaThis past year, we have already seen fingerprints left by AI in both the technical and creative sides of the film industry.Companies like Strada are building AI-enabled production and post-production toolsets to complete tasks widely considered mundane or that nobody wants to do. In turn, this new technology will allow VFX artists and post-production supervisors more freedom to focus on the finer details and create out of this world visuals never seen before. I see this resulting in a higher grade of content, more imagination and even better storytelling.Recently, Cinema Synthetica held an AI-generated film contest. The competition founders argued that the use of generative AI empowers filmmakers to bring their stories to life at a much lower cost and faster than traditional filmmaking methods. Now, creatives can use software tools from companies like Adobe and OpenAI to create content from their minds eye by simply describing their vision in just a few sentences. In a way, the use of AI can be inspiring, especially for filmmakers with lower budgetsand less experience. In fact, in the next 12-24 months, we will see a surge of highly entertaining, imaginative content created by humans, assisted by AI.Character poses created in Houdini and used for AI training of joints. (Image courtesy of SideFX)Final result of posed character after AI training of joints, created and rendered in Houdini by artist Bogdan Lazar. (Image courtesy of SideFX)To stay ahead, professionals should embrace AI, continuously learning and adapting to rapid advancements, ensuring they are prepared toharness these tools to their fullest potential. AI-powered filmmaking tools are like jet fuelfor creativity.Ed Ulbrich, Chief Content Officer & President of Production, MetaphysicThere will be concerns about job displacement and the ethical implications of AI-generated content. To address these issues, the industrymust adopt a balanced approach where AI augments human creativity rather than replacing it. Transparent discussions about the role of AI and its ethical implications should be held, ensuring that the technology is used responsibly.Lala Gavgavian, Global President &COO, Digital DomainA. Neishaw Ali, Founder, President, Executive Producer, Spin VFXAI is set to transform the VFX industry by automating repetitive tasks, enhancing creativity and enabling real-time rendering. By staying up-to-date with AI tools, collaborating across disciplines, experimenting with new technologies and focusing on creative skills, professionals can effectively prepare for and leverage these advancements to enhance their workflows and deliver more innovative and compelling visual effects.We have been working with AI for many years in VFX and only now is it made available at a consumer level and poised to significantly transform both creative and technical workflows in the visual effects industry in several key areas such as: Concept Development Allows for visual ideation among the director, creative teamand VFX to solidify a vision in hours rather than weeks. 
It enables real-time alignment of the creative vision through text-to-image generation, a process not unlike Google image searches but far more targeted and effective.Automation of Repetitive Tasks Automation of repetitive and non-creative tasks such as rotoscoping and tracking will significantly reduce the time and effort required for these laborious processes thus allowing our artists to concentrate more on the creative aspects of the scene, which is both energizing and inspiring for them.Face Replacement AI is revolutionizing face replacement by enhancing accuracy and realism, increasing speed and efficiency, and improving accessibility and cost-effectiveness, allowing for high-quality face replacement for a wide range of applications. Proper authorization and clearance are necessary to ensure we do no harm to any likeness or person.Real-time rendering Though not only AI-driven, real-time rendering is most certainly changing the VFX workflow. As the quality of final renders becomes more photorealistic and AI-enabled technologies like denoising and upresing allow formore complex scenes to be scalable in software like Unreal Engine, the design and iteration process will accelerate. Changes can be instantly viewed and assessed by everyone.Steps for Professionals to Prepare: I believe one of the biggest challenges for some VFX artists and professionals is understanding that embracing AI does not mean sacrificing anything. Instead, it allows you to work smarter and more effectively, dedicating more time to creative tasks rather than monotonous, repetitive ones.A. Antoine Moulineau, CEO & Creative Director, Light Visual EffectsAI feels like the beginning of CGI 30 years ago when a new software or tool was out every week. There are a lot of different techs available, and its very hard to focus on one thing or invest in specific workflows. At LIGHT, we are focusing on better training artists with Nukes Copycat and new tools such as comfyUI. Up-res or frame interpolation are already huge time-savers in producing high-res renders or textures. AI like Midjourney or FLUX has already disrupted massively concept art and art direction; they play now a major part in the workflow. 2025 will be about animated concepts and possibly postvis if the tools mature enough to have the control required. Animating concepts with tools such as Runway 3.A major blocker for final use remains controlling the AI and the lack of consistency of the tools. As said earlier, there is so much happening now, that it is hard to keep up or rely on the tools to be able to integrate in a pipeline.I dont know if it will be for 2025, but I can see AI disrupting the CGI pipelines in the very short term; generative AI could replace traditional rendering in many scenarios and reduce the need for texturing or high-resolution modeling in the near future, specifically for wide environments. Lip-sync is also a process where AI is really shining and will disrupt traditional workflows in 2025.We will start seeing directors preparing an AI version of their films with an edit of animated concepts with music during the pitching/concept phase, especially for advertising. This is such a helpful process to understand and communicate their vision. Its kind of a Moodboard 3.0, and I can certainly imagine this process becoming the norm very quickly. For very short-form social content, it will probably replace entirely traditional workflows. 
That being said, I think long-form remains an art form where actors and performance remain central, and I dont see AI taking over anytime soon. It is hard for me to see the point of that. We need real people to identify with so we can connect to the content. Art is about the vision; it captures society and the world as it is in the time it is made. In other words, AI remains a gigantic database of the past, but we still need the human creation process to create new art. A good example is, AI wouldnt be able to generatea cartoon version of a character if someone hadnt invented cartoon previously. It will accelerate processes for sure but not replace them.A. Christian Nielsen, Creative Director, The MillPredicting the future is challenging, especially given AIs rapid advancement. However, I anticipate an increasing integration of AI tools into the VFX pipeline. Were already seeing this to some degree with AI-powered rotoscoping and paint tools, which address some of the most common repetitive tasks in VFX.Additionally, inpainting and outpainting techniques are emerging as powerful tools for removing elements from shots and creating set extensions. ComfyUI has already become an integral part of many AI pipelines, and I foresee its integration expanding across most VFX studios.I strongly recommend that everyone in the VFX industry familiarize themselves with AI to better understand its capabilities and implications. The integration of AI into VFX is both inevitable and unstoppable.AI is advancing rapidly, with new tools and techniques emerging almost daily. To stay ahead of these changes, VFX professionals should remain aware of new trends in AI and generative content. Continuous learning and adaptation will be crucial. However, the industry needs to establish standards and guidelines to ensure AI complements ratherthan compromises the artistic process.Brandon Fayette, Co-Founder & Chief Product Officer, Fuzzy Door TechIn a way, the use of AI can be inspiring, especially for filmmakers with lower budgets and less experience. In fact, in the next 12-24 months, we will see a surge of highly entertaining, imaginative content created by humans, assisted by AI.Nick Hayes, ZEISS Director of Cinema Sales,U.S. & Canada[O]ne of the biggest challenges for some VFX artists and professionals is understandingthat embracing AI does not mean sacrificing anything. Instead, it allows you to work smarter and more effectively, dedicating more time to creative tasks rather than monotonous, repetitive ones.Neishaw Ali, Founder, President, Executive Producer, Spin VFXI can see AI disrupting the CGI pipelines in the very short term; generative AI could replace traditional rendering in many scenarios andreduce the need for texturing or high-resolution modeling in the near future, specifically for wide environments. Lip-sync is also a process where AI is really shining and will disrupttraditional workflows in 2025.Antoine Moulineau, CEO & Creative Director, Light Visual EffectsTheres still progress to be made before full text-to-video tools like Runwayml Gen-3 or Sora can be used to create complete AI commercials or movies. The main challenge is the lack of precise control with AI. If a director dislikes a specific element in a shot or wants to make changes, theres currently no way to control that. As a result, AI tools are generally not very director-friendly. At present, these tools work best for ideation and conceptdevelopment, like how we use Midjourney or Stable Diffusion for still concepts. 
Initially, AI could be used for creating stock elements, but Im confident that OpenAI and others are working on giving users more control.Over the past 12 months, weve used AI for several commercials and experiences, learning as we go. This technology is so newin the VFX industry that theres little experience to draw from, which can lead to some long workdays.A. Mark Finch, Chief Technology Officer, ViconThe industry is going through considerable change as audience preferences and consumer habits have evolved significantly in recent years. More people are staying in than going out, tentpole IPs are reporting decreased excitement and financial returns, and weve seen a period of continuous layoffs. As a result, theres a lot of caution and anticipation as to whats next.In a transitional period like this, people are looking at the industry around them with a degree of trepidation, but I think theres also a significant amount of opportunity waiting to be exploited. Consumer hunger for new worlds and stories powered by VFX and new technologies is there, along with plenty of companies wanting to meet that demand.For the immediate future, I predict were going to see a spike in experimentation as people search for the most effective ways of utilizing these technologies to serve an audience whose appetite knows no bounds. Vicon is fueling that experimentation with our work in ML/AI, for example, which is the foundation of our markerless technology. Our markerless solution is lowering the barriers to entry to motion capture, paving the way for new non-technical experts to leverage motion capture in their industries.An example weve come to recognize is giving animators direct access to motion capture who historically would have only had access to it through mocap professionals on the performance capture stage, which is expensive and in high demand. This unfettered access reduces the creativity iteration loop, which ultimately leads to a faster final product that is representative of their creative dream.Theres a lot of excitement and noise surrounding the rapid growth of AI and ML-powered tech. Its impossible to look anywhere without seeing tools that encourage new workflows or provide enhancements to existing ones. A consequence of this is that you can fall into the mindset of, This is the way everything is going to be done, so I need to know about it all. When technology is moving so fast, you risk spreading yourself thin across a wealth of tools that are still finding their feet and may themselves be redundant, replaced or improved beyond recognition in the future.The best preparation comes from understanding the problem before the solution, in other words, identifying the obstacle you need to overcome first. You get this by focusing on people speaking to them about their challenges, researching those that exist across their industry in general, and gaining an understanding of why a certain tool, workflow or enhancement might exist.A. Paul Salvini, Global CTO, DNEGAI, especially machine learning, is poised to significantly impact the visual effects industry, transforming both creative and technical workflows. At DNEG, we are investing in the development of new AI-enabled tools and workflows to empower artists and enhance the creative process. 
For us, storytelling remains paramount so our use of AI is directed towards activities that provide better feedback for artists and deeper creative control.In terms of artist-facing tools, some of the areas likely to see early adoption of AI and ML techniques throughout 2025 include: Improving rendering performance (providing faster artist feedback); automating repetitive tasks; procedurally creating content; generating variations; and processing, manipulating and generating 2D images.AI techniques and tools are being increasingly used to generate ideas, explore creative alternatives and build early stand-ins for various locations, characters and props. As with all new tools, professionals can prepare by learning the basics of AI, and seeing how these tools are already being explored, developed and deployed in existing industry-standard packages.Some AI and ML tools work invisibly, while others require direct user involvement. An abundance of publicly available and user-friendly websites has emerged, allowing artists and the general public to experiment with various ML models to better understand their current capabilities and limitations.These new tools, while impressive, further emphasize the importance of human creativity, communication and collaboration. Our collective job of developing and bringing great stories to life remains unchanged. However, as our tools improve, we can dedicate more time to creative endeavors and less on mundane tasks. This is truly a better way to create better content.A. Christopher Nichols, Director, Chaos LabsMachine learning has been transforming the industry for years, so its nothing new to VFX artists. Especially when it comes to digital humans, rotoscoping, fluid sims and analyzing data/camera tracking information. AI will continue to take on a bigger piece of the workflow and replace a lot of traditional VFX techniques in time. The industry will just continue to adapt.Creating high-level content is going to become much more accessible, though. Soon, independent filmmakers will create shots that would have been the sole domain of high-end VFX houses. This will free the latter to experiment with more ambitious work. Currently, Chaos is trying to help artists get to LED screens faster via Project Arena and NVIDIA AI technology; youll likely see AI solutions become commonplace in the years ahead. Youll also probably see fewer artists per project and more projects in general, too, as AI makes things more affordable. So instead of 10 movies a year with 1,000 VFX artists on each movie, itll be more like 1,000 films with 100 names per project.The elephant in the room is generative AI. However, the big movie studios are reluctant to use it due to copyright issues. Right now, the matter of where the data is coming from is being worked out through the court system, and those decisions will influence what happens next. That said, I dont think an artist will bereplaced by a prompt engineer anytime soon. The best work you see coming out of the generative AI world is being done by artists who add it to their toolsets. You still must know what to feed these tools and artists know that better than anyone.I strongly recommend that everyone in the VFX industry familiarize themselves with AI to better understand its capabilities andimplications. 
The integration of AI into VFX is both inevitable and unstoppable.Christian Nielsen, Creative Director, The MillA consequence of [the rapid growth of AI] is that you can fall into the mindset of, This is the way everything is going to be done, so Ineed to know about it all. When technology is moving so fast, you risk spreading yourself thin across a wealth of tools that are stillfinding their feet and may themselves be redundant, replaced or improved beyond recognition in the future.Mark Finch, Chief Technology Officer, ViconOur collective job of developing and bringing great stories to life remains unchanged.However, as our tools improve, we can dedicate more time to creative endeavors and less on mundane tasks. This is truly a better way to create better content.Paul Salvini, Global CTO, DNEG[Y]oull likely see AI solutions become commonplace in the years ahead. Youll also probably see fewer artists per project andmore projects in general, too, as AI makes things more affordable. So instead of10 movies a year with 1,000 VFX artists on each movie, itll be more like 1,000 films with 100 names per project.Christopher Nichols, Director, Chaos LabsA. Greg Anderson, COO, Scanline VFX and Eyeline StudiosIn 2025, AI tools and technology are poised to significantly transform how visual effects are created, from automating the most mundaneof tasks to expanding the possibilities of the most complex visual effects sequences. Several compositing packages already incorporate AI-based features that greatly improve rotoscoping, tracking, cleanup speed and quality. These features will continue to improve in 2025, allowing artists to spend more time on the final quality of shot production. The ongoing and fast-moving development of generative AI tools and features will change the process, efficiency and quality of everything from digital environments to effects and character animation.From a technical and production workflow standpoint, AI will continue to optimize render processes, allowing for more iterations and leading to more convincing imagery that is faster and cost-effective. New tools will assist VFX teams in organizing, managing and accessing vast libraries of digital assets, making it easier for artiststo find and reuse elements across different projects. Data-driven insights will also allow AI tools to predict which assets might be needed based on project requirements.Overall, AI technology is poised to revolutionize the VFX industry next year and beyond, as weve only yet to scratch the surface of what will be possible. In preparation, anyone working in the VFX industry should lean heavily toward curiosity, continuous learning and skill development. Time spent experimenting with AI tools and technologies in current workflows will heighten the understanding of AIs capabilities and limitations. Additionally, while AI can enhance many technical aspects, creativity remains a human domain. Artists should focus on developing artistic vision, storytelling skills and creative problem-solving abilities.A. David Lebensfeld, President and VFX Supervisor, Ingenuity Studios and Ghost VFXIn 2025, we will see a continuation of idea genesis happening by leveraging generative AI tools. We will also find that our clients use generative AI tools to communicate their ideas by leveraging easy-to-use tools they have never had before. 
The sacrifice being controllability, but the benefit is ease of communication.Most of our studio clients have a real sensitivity to how AI is being used on their projects, and they want it to be additive to the projects versus a threat to the ecosystem. In the short term, generative AI will be used more as a tool for communication than it is for execution.Well continue to see AI-based tools in our existing software packages, giving both in-house and vendor tool developers and software developers room to expand their offerings. While AI advancements will continue to improve existing toolsets, they wont replace team members at scale, especially in the high-end part of the market.Looking ahead, I think the best professionals in our industry are already dialed in to developing toolsets and new technologies. Its always been the case that you have to be agile and stay aware of continual software and hardware developments. VFX is theintersection of technology and art; you must know and constantly improve both to stay competitive. Also on a professional level, I dont think well see meaningful changes in 2025 to how VFX final pixels get made at the studio side, for a multitude of reasons, two being a lack of granular control and sour optics.How people are talking about AI can often feel like a marketing trick. Everyone is using the same basic technology layer, and that always gets better as all boats rise. Like anything else, the people who know and leverage advanced technology the best and the most creatively will continue to win.A. Mathieu Raynault, Founder, Raynault VFXWhen I first thought about how AI might affect the visual effects industry, I felt both skeptical and anxious. But since I started in computer graphics in 1996, I havent seen anything with this much potential for exciting transformation.At Raynault VFX, AI is set to significantly boost our efficiency by automating routine tasks and letting our team focus more on the creative parts of our projects. Were a small team of 55 and creativity is at the heart of what we do. Weve started using AI to increase our productivity without sacrificing our artistic integrity. With seven full-time developers, were heavily invested in research and development, including AI, to improve our workflows.Looking ahead, I see AI enhancing our current tools, helping us keep control over the creative process and refine our work with client feedback. This blend of AI and human creativity is crucial because filmmakers will still rely on creative teams to bring their visions to life. Although theres some worry about AIs ability to create entire films or TV shows on its own, I think these tools wont replace human-driven filmmaking anytime soon.AI will certainly transform our workflows and could lead to shifts in employment within our industry. VFX artists will become more productive, able to deliver more work in less time, which might lead to a reduction in job numbers compared to pre-strike highs. For VFX professionals, integrating AI into their workflows is essential, yet its crucial to preserve and enhance our existing skills. In the field of concept art, for example, AI can assist in drafting initial designs, but the intricate process of refining these concepts to align with a directors vision will still require human expertise. Artists who can both direct AI and iterate while creating concept art themselves will be invaluable.In summary, Im quite optimistic. 
As we move toward 2025, adopting AI requires us to change our skills and approaches to stay competitive and innovative. As a business owner in the VFX industry, its incredibly motivating!AI technology is poised to revolutionize the VFX industry next year and beyond, as weve only yet to scratch the surface of what will be possible. In preparation, anyone working inthe VFX industry should lean heavily toward curiosity, continuous learning and skilldevelopment.Greg Anderson, COO, Scanline VFX and Eyeline StudiosHow people are talking about AI can oftenfeel like a marketing trick. Everyone is using the same basic technology layer, and that always gets better as all boats rise. Like anything else, the people who know and leverage advanced technology the best and the most creatively will continue to win.David Lebensfeld, President andVFX Supervisor, Ingenuity Studios and Ghost VFXWhen I first thought about how AI might affect the visual effects industry, I felt both skeptical and anxious. But since I startedin computer graphics in 1996, I havent seen anything with this much potential for exciting transformation.Mathieu Raynault, Founder,Raynault VFXI know some people look at AI and its use as being somehow catastrophic for our business but, at the end of the day, I think itll be just another tool in our arsenal and, used wisely, a great one. The faster artists and companies embrace it and learn to use it in their workflows, the better, and were already seeing that adaptation now.Viktor Mller, CEO, Universal Production Partners (UPP)A. Viktor Mller, CEO, Universal Production Partners (UPP)To some extent, AI has already begun to transform the industry.We see demonstrations of its growing capabilities almost on a weekly basis, and there seems to be a lot of fear around that.Honestly, Im not worried about it at all. I could sense it coming long before it started turning up in the media, which is why UPP has been quietly building out our VP and AI departments for the last six years.I know some people look at AI and its use as being somehow catastrophic for our business but, at the end of the day, I think itllbe just another tool in our arsenal and, used wisely, a great one. The faster artists and companies embrace it and learn to use it in their workflows, the better, and were already seeing that adaptation now.A. Kim Davidson, President & CEO, SideFXOver the past year, we have seen several advancements in AI in the visual effects industry and we expect this to continue in 2025. So far, the advancements have been more evolutionary than revolutionary. AI is not replacing creatives or the production pipeline butis greatly speeding up many of the more mundane tasks while not fully eliminating them yet. Tracking and rotoscoping are key examples of tasks that have been improved and sped up. We predict that 2025 will see more AI-based tools being used throughout the pipeline, with improved AI implementations andsome brand-new tools. These AI-enhanced workflows will include design concept, asset (model and texture) creation, motion stabilization, improved character animation and deformation (e.g. clothing, hair, skin), matching real-world lights, style transferring, temporal denoising and compositing.Of course, there will be improvements (and more releases) of prompt-based generative video applications. But for a variety of reasons we dont see this as the best workflow for creative professionals, certainly not the be-all and end-all for art-directed content creators. 
We believe in providing artists with AI/ML-enhanced toolsets to bring their creative visions to life more quickly and efficiently, allowing for more iterationsthat should lead to higher quality. We are at an exciting stage in the confluence of powerful hardware andAI-enhanced software where creative talent will be more important than ever and able to harness creative platforms to tell stories in truly extraordinary new ways.A. Dade Orgeron, Vice President of Innovation, Shutterstock2025 is here, but with generative AI technology moving so quickly, I think we can expect to see AI continueto transform the visual effects industry, particularly through advancements in generative video and 3D tools. As AI models continue to improve, we can expect notable enhancements in temporal consistency and reduced distortion, along with compositing tools to help seamlessly integrate AI-generated content into live-action footage or easily remove/replace unwanted people or objects. In the next wave of generative video models, complex mechanical devices and other intricate details will be represented with unprecedented precision, and advanced dynamics and fluid simulations will start to become achievable with generative video rather than traditional, time-consuming simulation engines. Will it be perfect? Maybe not in the next six months, but perhaps within the next year.To prepare for these advancements, VFX professionals should invest in upskilling themselves in AI and machine learning technologies. Understanding the capabilities, and particularly the limitations of AI-driven tools, will be essential. They should experiment with generative image and video technologies as well as 3D tools that leverage AI to streamline their workflowsand enhance their creative skills. Thats something at Shutterstock that we are actively enabling through partnerships with NVIDIA and Databricks. For instance, weve developed our own GenAI models to accelerate authentic creative output, all with ethically sourced data. Early adoption and a shift towards embracing new technologies and methodologies will enable artists and technicians to remain competitive and innovative in these rapidly evolving times.A. Gary Mundell, CEO, Tippett StudioThe big question is: What will AI mean to us in 2025? As we move through the Gartner Hype Cycle, AI seems to be transitioning from the Trough of Disillusionment into the Slope of Enlightenment, much like the early days of the .com era. AI is poised to bring a suite of tools that handle obvious tasks roto, match move, res up, FX but thats just the tip of the iceberg. Anything described by a massive database can use AI. If youcan articulate your prompts, and theres a database to train the answers, youre set. Forget influencers soon, prompters will drive production with AI-generated insights.By 2025, AI will fundamentally change VFX production. Imagine a system capable of generating an entire schedule and budget through prompts. AI could create a VFX schedule for a 1,200-shot project, complete with budgets, storyboards, 3D layouts and animatic blocking, all tailored to a directors style and the level of complexity. However, where todays AI falls short is in the temporal dimension it struggles with believable, complex animation. Current engines tend to produce flowy, slow visuals lacking continuity, and while many tools claim to address this, it will take time before AI excels at high-quality animation.At Tippett Studios, we leverage AI for previsualization, conceptualization and project management. 
Using TACTIC Resource, we integrate AI into planning and resource management, handling vast production data to predict outcomes and streamline workflows. As we move into 2025 and beyond, AI's data management capabilities will be key to future productivity and financial success, even as we await more advanced animation tools. As AI continues through the Peak of Inflated Expectations and towards the Plateau of Productivity, its role in VFX production will become increasingly significant.

We are at an exciting stage in the confluence of powerful hardware and AI-enhanced software where creative talent will be more important than ever and able to harness creative platforms to tell stories in truly extraordinary new ways.
Kim Davidson, President & CEO, SideFX

Early adoption and a shift towards embracing new technologies and methodologies will enable artists and technicians to remain competitive and innovative in these rapidly evolving times.
Dade Orgeron, Vice President of Innovation, Shutterstock

[W]here today's AI falls short is in the temporal dimension: it struggles with believable, complex animation. Current engines tend to produce flowy, slow visuals lacking continuity, and while many tools claim to address this, it will take time before AI excels at high-quality animation.
Gary Mundell, CEO, Tippett Studio
  • PAUL LAMBERT CROSSES THE ARTISTIC AND TECHNICAL DIVIDE
    www.vfxvoice.com
    By TREVOR HOGGImages courtesy of Paul Lambert, except where noted.Paul Lambert, Visual Effects Supervisor. A proud accomplishment for Lambert was creating the IBK Keyer, which is still used today in Nuke to deal with bluescreen and greenscreen plates.Nowadays, Paul Lambert is at the forefront of Hollywood productions as a visual effects supervisor, with memorable visual accomplishments being the dystopian Los Angeles cityscapes and the lead hologram character from Blade Runner 2049, the transition to the Moons surface in IMAX in First Man and the realism of the worlds of the Dune franchise. Ironically, the ability to combine art and technology, which has been the key to his success, originally made him an anomaly in the British education system. Forced to choose between the two, he initially decided to earn a degree in Aeronautical Engineering at the University of London. Upon graduating, Lambert realized that engineering was not his calling, so he took a job as a courier in London and studied sculpture as a part-time art school student. Frequently, deliveries for Salon Productions led to visits to Shepperton and Pinewood Studios, and eventually saw him hired by the company that provided editing equipment to the film industry.At Salon, I learned how to put together and fix Steinbecks, KEMs and Moviolas, Lambert recalls. I even had to go over to Moscow to fix a Steinbeck being used by [Editor] Terry Rawlings for The Saint. It was during this time that Lambert became aware of the digital transition in the film industry. Avid and Lightworks non-linear editing systems were starting to disrupt the industry. It was this digital transition that made me more aware of something called visual effects. The discovery was worth exploring further. SGI had a big old building in Soho Square and were running week-long courses under the name of Silicon Studios, where you could play with Monet, Flint [the baby version of Flame] and Houdini. I left Salon and did this course, which was amazing. A six-month odyssey of looking for employment came to an end when a runner at Cinesite went on a two-week vacation. They kept me because I was so enthusiastic and hungry for knowledge. It was at a time when you could jump ontothe graphics workstations, whether it be the Flames or Infernos or Cineon machines, at night in your own time. I taught myself. I was so hungry and focused. I had finally found what I wanted to do. It was a good balance of creativity and technical know-how. When I started at Cinesite, they had two Flames, and by the time I left I was the head of that department and we had seven.A portion of a seawall was constructed for Blade Runner 2049, with the action shot in a water tank in Hungary. (Image courtesy Warner Bros. Pictures)Why would you put up with these crazy deadlines or having to move around the world if you didnt truly love it? If you truly love something, youre going to come up with creative ways of doing things and participate in some of these beautiful movies.Paul Lambert, Visual Effects SupervisorFascinated by a proprietary compositing software developed by Digital Domain, Lambert had a job interview with the visual effects company founded by James Cameron, Stan Winston and Scott Ross. I added substantial pieces of technology to Nuke because by that time I had figured out the ins and outs of compositing, Lambert reveals. It was an obsession of mine of how an image comes together. Digital Domain was on the verge of commercializing Nuke but didnt have a keyer. 
I spent six months playing around with this idea of keying, came back to them and showed them this algorithm. It was the IBK keyer, and that's still in Nuke. Simplicity drove the programming process. What I can't stand as a compositor is when there is a node and it's got 50,000 sliders in there. Nobody knows what those sliders do! It's trial and error. What I tried to develop is something simple but a process where, if you can combine these things in a particular way, you can work with bluescreens and greenscreens, which are uneven, and it gets you to a good place quickly. The irony is, now I tend to try not to rely on bluescreens or greenscreens!

Director/writer/producer Denis Villeneuve, left, and Lambert on the set of Dune: Part Two. (Image courtesy of Warner Bros. Pictures. Photo: Niko Tavernise)

Lambert celebrates winning an Oscar for Dune: Part One with his wife, Mags Sarnowska.

Over 90 minutes of footage had to be created for the LED screens used for First Man. (Image courtesy of Universal Pictures)

A major benefit of using the LED screens for the space and aerial scenes in First Man was the ability to capture reflections on the visor and in the eyes of Ryan Gosling, which are extremely difficult to achieve in post-production. (Image courtesy of Universal Pictures)

After 12 years at Digital Domain, Lambert joined DNEG's facility in Vancouver in 2015 where he began his transition as a production visual effects supervisor starting with The Huntsman: Winter's War. The size of the visual effects budget is only part of the equation for success. By the time we had finished First Man it was a $7 million visual effects budget, which is relatively tiny, but we came up with some incredibly creative ways to do stuff, Lambert remarks. We used a number of great techniques for the visuals. Doing a bigature and miniature for space work is ideal because you can control the light so that shadows are really hard. We used real 1960s footage for the launch, but we repurposed that footage with CG to make it more cinematic. Also, we utilized one of the first LED screens, but we had it up for six weeks with operators for a fraction of the cost of what it costs now. Ninety minutes of LED screen content had to be created. This is where my gray hair has come from! We did not take the visor off one single shot. We even got reflections in the eyes!

Two fundamental elements have to be respected for a visual effects shot to be believable. I'm going to try not to change the actor's performance or the light because I know that changing the light with our current tools always looks a bit artificial, Lambert explains. Your eye will pick up on something which takes you out, and in our current environment people will say, It's bad CGI. No, it's the fact that you've taken the natural balance of the original image and gone too far by changing the background to a completely different luminance or trying to add a different light on the character. You see it all the time. I'm sure you will be able to do it with generative AI soon enough where you're relighting or regenerating the image based on some form of transformer and diffusion model, but using current tools I try to avoid it. I would rather the continuity of a background be off rather than have a composite feel wrong. If I shoot something knowing that a background is going to be a certain background in post, then I try to have that screen be of a tone of luminance that I'm going to put the background in.
Hence the sand-colored backing screens on Dune: Part One and Two.

Never underestimate the significance of having a clear vision. With Denis Villeneuve there is such a clarity of vision as to what he wants, so it's a pleasure to work with him, and you don't do crazy hours and overtime, Lambert states. There isn't a mad rush. It's a sensible approach to things. There are hiccups along the way, but it's not like you have to ramp up to 1,000 people towards the end because you're 5,000 shots short. For Dune, the concepts were the basis of what we built and photographed and what I ultimately created in visual effects. Blade Runner 2049 was a special project with Lambert working on behalf of DNEG. It was special to come into this world and see pure professionalism at work with Denis and [Director of Photography] Roger Deakins, and witness them shooting with a single camera all the time. He is also proud of his collaboration with Cinematographer Greig Fraser on Dune: Part One and Two. Greig uses a multitude of lenses and some were old Russian lenses. He's totally into degrading and giving character to the image. Then, of course, I have to try to match these things! We have a good understanding of the way we work. Greig is given untold freedom in how he wants to do things, but when I need something, he listens and will adapt, he says.

Lambert in the Mojave Desert near Edwards Air Force Base for the landing of the X-15 in First Man.

Moviemaking is becoming more accessible to the masses. You'll see the cream rise to the top like you always do in whatever industry, Lambert notes. You will have directors who have a vision and bring that forward. I keep reading and seeing this whole idea of democratizing our industry, and it will happen. It depends on whether we put guardrails up or not to help with the transition. You'll have different ways to visualize things. You'll have the ability to put your VR goggles on and enjoy the movie that you just created. Great films are built upon solid collaborations. I've been lucky with my path so far in that I've never had a bad experience with another HOD [head of department]. In the end, I'm only successful if the photography that we have shot works and people have put their heart into it. If I get the best foundation that I can, then I can add to that and bring it to the final where the director will hopefully love it.

Blade Runner 2049 marked the first time that Lambert collaborated with Denis Villeneuve as a facility supervisor at DNEG, and it resulted in him receiving his first Oscar. (Image courtesy of Warner Bros. Pictures)

From left: Rebecca Ferguson (Lady Jessica), Director/Writer/Producer Denis Villeneuve, Lambert and Production Designer Patrice Vermette on the set of Dune: Part Two. (Image courtesy of Warner Bros. Pictures. Photo: Niko Tavernise)

First Man resulted in Lambert winning his second Oscar and his first as a production visual effects supervisor.

Lambert joined Wylie Co. in 2021 as the Executive Creative Director and is currently working on Project Hail Mary with directors Phil Lord and Chris Miller as well as Cinematographer Greig Fraser. I'm thinking on my feet on Project Hail Mary more than I've ever done before because of trying to keep the camerawork and everything fluid, Lambert remarks. That means you're not clinically breaking up the shot into layers because what tends to happen is you lose some of the organic feel of a shot if you do this and that element. I'm a big believer in having a harder comp which will always give you a better visual.
Even with a trio of Oscars, his enthusiasm remains undiminished. Why would you put up with these crazy deadlines or having to move around the world if you didn't truly love it? If you truly love something, you're going to come up with creative ways of doing things and participate in some of these beautiful movies.
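As an aside for readers curious about the kind of keying Lambert describes earlier, here is a minimal, hypothetical sketch of a simple color-difference key in Python with NumPy. It is emphatically not the IBK algorithm that ships in Nuke; the function names and the softness constant are invented purely for illustration.

    import numpy as np

    def color_difference_matte(rgb, screen="green"):
        # rgb: float array of shape (H, W, 3) with values in [0, 1].
        # Returns an alpha matte in [0, 1], where 1 means foreground.
        # Illustrative toy only; not the IBK keyer.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        if screen == "green":
            spill = g - np.maximum(r, b)   # how much greener than the other channels
        else:
            spill = b - np.maximum(r, g)   # bluescreen case
        softness = 0.5                      # arbitrary roll-off, tuned per plate
        return 1.0 - np.clip(spill / softness, 0.0, 1.0)

    def over(fg_rgb, bg_rgb, matte):
        # Simple "over" composite using the matte as alpha.
        a = matte[..., None]
        return fg_rgb * a + bg_rgb * (1.0 - a)

Production keyers go much further than this, for example by estimating a per-pixel clean screen plate so that uneven bluescreens and greenscreens, the exact problem Lambert mentions, do not punch holes in the matte.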
  • BANDING TOGETHER ONCE AGAIN FOR GLADIATOR II
    www.vfxvoice.com
    By TREVOR HOGGAll images courtesy of Paramount Pictures.Lucius Verus (Paul Mescal) seeks vengeance against Roman General Marcus Acacius (Pedro Pascal).Not often does a film crew get to reunite two decades later to make a sequel that makes swords and sandals cool again, but that is exactly the case with Gladiator II where Ridley Scott collaborates once again with Production Designer Arthur Max and Special Effects Supervisor Neil Corbould, VES. Russell Crowe as Maximus is not returning to the Colosseum to wreak havoc on the Roman Empire; instead, the task has been given to his equally determined son Lucius Verus (Paul Mescal).It was an amazing experience to see the Colosseum back up again, states Corbould. It was like stepping back in time 20-odd years because it was an exact replica of what we did before. I felt that the first one was damn good. To revisit this period again and take it a step further was quite an incredible and daunting task. The scope has been expanded. We were using the same old tools, like physical builds and handmade craftsmanship, that we always did, remarks Max. Only this time around, the digital technologies have come into that world as well, and that enlarged and increased the scope of what we could do in the time and on budget. It has been a gigantic shift from the first one to the sequel.Visual ties still exist between the original and the sequel. We wanted people to be able to recognize the [different] world from the first to the second, Max notes. It was also opportunistic of us to try to use some of the earlier footage to blend in. We did that in flashbacks and in the live-action, where we produced some of the Gladiators original crowd footage. We tried to match the sets in actual detail, particularly in the arenas, both provincial and in the capital like the Colosseum set as closely as possible to the first one. There were changes, but they were subtle. That was a nod to economical filmmaking. Why waste the time shooting crowds cheering when you have it in the can already? We did a few of those kinds of things.From left: Stunt Coordinator Nikki Berwick; VFX Supervisor Mark Bakowski; DP John Mathieson; Prosthetics Designer Conor OSullivan; Director Ridley Scott and Head Stunt Rigger Christopher Manger (ground) discuss the gladiator battle featuring the animatronic rhino created by Neil Corbould and his special effects team.Ridley said, I want to have a rhino there. I spoke to Mark Bakowski [Visual Effects Supervisor] about it. I said, We can create a remote-drive gimbal rig underneath, which is completely wireless, with a little six-axis motion base, and a muscle suit that we put a textured skin on with the head and body. Then Ridley said, I want it to do 40 miles per hour and turn on a dime. That was like, Oh, Christ. Another thing! But we did it. It was powered by flow-cell batteries and used electric car motors. This thing was lethal.Neil Corbould, Special Effects SupervisorAlso coming in handy was the Jerusalem set from Kingdom of Heaven, which was repurposed as a Numidian coastal fort attacked by the Roman fleet. The technology of water software thank you, James Cameron and Avatar and other productions had evolved to such a degree of sophistication that it made sense. Also, a credit to Neil Corbould, who found an incredible all-wheel-drive remote-control platform that was used for transporting enormous pieces of industrial technology great distances, like cooling chambers of nuclear power stations. We had a couple of those to put our ships on. 
This is where we were innovative, Max states.The advancements in technology allowed for more of the Rome set to be built physically for Gladiator II than for the original film.Pedro Pascal, Ridley Scott and Paul Mescal share a light moment in between takes.The entrance arch of the Colosseum had to be enlarged to allow the ships to pass through.The Colosseum had to be constructed higher than the original to accommodate the CG water needed for naval battles.Corbould was inadvertently responsible for a cut sequence appearing in the sequel. He recalls, I was going through some of my old archive stuff of the original Gladiator and found the storyboards of the rhino. After the meeting finished, I said, By the way, Ridley, I found these. I put them on the desk and he went, Wow! This is amazing. Weve got to do this. And thats how the rhino came about. It was like, Oh, Christ, I didnt think he would do that! Then Ridley said, I want to have a rhino there. I spoke to Mark Bakowski [Visual Effects Supervisor] about it. I said, We can create a remote-drive gimbal rig underneath, which is completely wireless, with a little six-axis motion base, and a muscle suit that we put a textured skin on with the head and body. Ridley said, I want it to do 40 miles per hour and turn on a dime. That was like, Oh, Christ. Another thing! But we did it. It was powered by flow-cell batteries and used electric car motors. This thing was lethal. It was good and could move around. We didnt do it like a conventional buggy. We did it like the two-drive wheels were on the side, and we had the front and back wheels in the middle, which were stabilizing wheels. We were driving it like a JCB excavator around the arena; that, in conjunction with the movement of the muscle suit and the six axes underneath, gave some good riding shots of the guy standing on top of it.The gladiator battle with the rhino was revived for the sequel when Neil Corbould showed Ridley Scott the original storyboards.Not everything went according to plan, in particular the naval battle in the Colosseum. Life got in the way because of the strikes, remarks Visual Effects Supervisor Mark Bakowski, who was a newcomer to the project. The Colosseum was originally to be more wet-for-wet and less dry-for-wet. But it works well in the end. There is a speed of working that suits Ridley Scott; he shoots quickly and likes to move quickly. That worked, shooting it dry-for-wet because Ridley could get his cameras where he wanted, reset quickly and get the shots; whereas, there are more restrictions being in the proper wet kind of thing. When it comes to integration, I was wary of having too much of a 50/50 split where you have to constantly match one to the other. We had a certain style of shot that was wet-for-wet, as in someone falling into the water, or Neil did some amazing impacts of the boats where the camera is skimming along the surface. Those made sense to do wet-for-wet because there are lots of interactions close to the camera. The water went through a major design evolution. Bakowski adds, We started off looking at the canals of Venice as our reference for the Colosseum, and then we started to drift.Ridley was showing pictures of his swimming pool in Los Angeles and saying, Can you move it that way? 
It took us a while to find the look of the Colosseum water, but we got there in the end.

We tried to match the sets in actual detail, particularly in the arenas, both provincial and in the capital like the Colosseum set as closely as possible to the first [Gladiator]. There were changes, but they were subtle. That was a nod to economical filmmaking. Why waste the time shooting crowds cheering when you have it in the can already?
Arthur Max, Production Designer

There were times when DP John Mathieson had to coordinate as many as 11 cameras for action sequences.

The Colosseum naval battle was a combination of dry-for-wet and wet-for-wet photography.

We built the boats in the U.K., shipped them out and then assembled them there, which was the right thing to do because it was almost impossible to get that material in Morocco or the sheer quantity of steel and timber we needed. We put them in 30 40-foot trucks going across the Atlas Mountains, and as they were arriving, we were assembling them. It was like clockwork. On the day when we were shooting, we were still painting bits. It was that close.
Neil Corbould, Special Effects Supervisor

The attack on the Numidian coastal fort was shot using the landlocked Jerusalem set from Kingdom of Heaven, with boats moved around on self-propelled modular transporters (SPMT) and CG water added in post-production.

Technological advances allowed for the expansion of Rome. We built much more than we did on the first one in terms of the amount of site we covered, Max explains. We went from one end to the other. CNC sculptures and casting techniques were expedited greatly on the sequel because we had the technology. The timeframe was compressed from getting a finished drawing or working drawing to the workshop floor and also being able to farm out digital files, not only to one workshop but to multiple workshops simultaneously, which increased the speed of production. To a large extent, we met the demands of Ridley's storyboards, but there was still a large amount of [digital] set extensions. The accomplishment was impressive. It was an amazing set to wander around, Bakowski states. We had like a kit of parts that we could dress in the background. Technically, a certain hill should be in a particular place. We established it in this shot, or a certain building should be at a specific angle. But if it didn't look good, of course, it moved because Ridley is a visual director; his work is like a moving painting every time, and we responded to that by trying to make everything beautiful, which was the main thing.

Visual ties still exist between the original and the sequel, such as the Colosseum.

The baboons fighting the gladiators was a complicated sequence that required multiple passes.

Visual effects took over some tasks previously looked after by special effects. In Gladiator, we did a lot of launching arrows, but in this one, we didn't do any of that, Corbould reveals. It was all Mark [Bakowski]. That allowed Ridley to shoot at the speed he did, which was good. I concentrated on the fires, dust and explosions in the city. But we only shot that once, with 11 cameras. A major contribution was the practical black smoke. Corbould describes, We were burning vegetable oil, and when you atomize it at high pressure, it vaporizes, ignites and gives you this amazing black smoke. Everyone smells like a chip shop! We had six of these massive burners that were dotted around the set, and then we had to chase the wind. We would have some wind socks up or look at the flags.
You had to try to anticipate it because of the speed at which Ridley works. We must have had 16 people just doing black smoke. We built a beach as well. Ridley said, Its supposed to be on the coast, and I want an 80-meter stretch of beach with waves washing the bodies onto the shore. We constructed an 80 x 80-meter set of beach. I made this wave wall, which was basically one section of the wall, but the whole 80 meters of it pushed in. Its a bit like a wave tank. We put a big liner in it and sprayed sand over the liner; that gave it a nice, majestic wash-up against the beach.ILM led the charge in creating the 1,154 visual effects shots, followed by Framestore, Ombrium VFX, Screen Scene, Exceptional Minds and Cheap Shot VFX. The baboons were a fun ride and tough, Bakowski remarks. The speed that Ridley likes to shoot is fantastic, but someone interacting and fighting with a troop of baboons does take some planning and thought to go into it. Its complicated business. Bakowski adds, He was generous in terms of letting us shoot the passes we wanted to shoot. In general, we kept to the logic that there were a couple of hero guys in the middle and a bunch of supporting baboons on the edge. We do one pass where we put all of the baboon stunt performers in there. Everyone would run around acting like baboons. After that, we pulled out the baboons that werent interacting with people because it was a nightmare with the amount of dust being kicked up. We did a pass with only the contact points and then a clean pass afterwards. It was a challenge to have it all come together.A character in its own right is the capital city of Rome.One of the major creative challenges was developing the look of the CG water.Nothing was achieved easily on Gladiator II. This is the most challenging project that Ive ever done given the scale and scope of it and the conditions under which we worked, Max states. We had the sandstorms in Morocco, and the idea of doing a naval battle in the desert had its problems. We had to keep the dust down, and the physical effects team was always out there with water hoses, and they were clever. They had water cannons to replicate physical waves coming over the bow of the ships. It was a lot of technology on an enormous scale. The boat scenes were the most complicated. Explains Corbould, I was probably one of the first people on the show with Arthur, and our prep period was quite short. We built the boats in the U.K., shipped them out and then assembled them out there, which was the right thing to do because it was almost impossible to get that material in Morocco or the sheer quantity of steel and timber we needed. We put them in 30 40-foot trucks going across the Atlas Mountains, and as they were arriving, we were assembling them. It was like clockwork. On the day when we were shooting, we were still painting bits. It was that close.The visual effects work was as vast as the imagination of Scott. Were doing extensions in Rome, crowds in theColosseum, creatures and water, Bakowski notes. For the final battle. We were adding vast CG armies in the backgrounds of virtually every shot. We did some little pickups as well, so its integrating these pickups that came back to the U.K. with the stuff that was shot in Malta. Its not groundbreaking stuff, but the volume of it is quite high because its one of those things that adjusts and adapts as the edit develops. The Colosseum naval battle encapsulated both what Im looking forward to people seeing and also a big challenge. 
The baboons were a fun challenge, and the rhino just worked, which was fantastic. By the end, we knew how we were doing in the Colosseum, and our crowds look beautiful. I can't wait for you to see all of it.
  • NEXT-GENERATION CINEMA: THE NEW STANDARD IS PREMIUM AND ITS WORKING
    www.vfxvoice.com
By CHRIS McGOWAN

Gladiator II was given the IMAX Maximum Image treatment in November. (Image courtesy of Paramount Pictures)

Cinema audiences are increasingly showing an appetite for higher-resolution, higher-quality movies, often with large-format presentations and/or 4D effects. Soon, they will also be exploring AR movie augmentations and an increasing number of film-related experiences as well.

When audiences began returning to theaters after the pandemic, they wanted experiences that they couldn't get in their homes. Now, in a post-pandemic world, moviegoers want something premium and special for their time, and audiences seek out IMAX because it truly is a premium experience, says Bruce Markoe, Senior Vice President and Head of Post & Image Capture for IMAX.

IMAX is a pioneer and leader in premium cinema. Markoe notes, As of June 30, 2024, there were 1,780 IMAX systems (1,705 commercial multiplexes, 12 commercial destinations, 63 institutional) operating in 89 countries and territories. The numbers speak for themselves. In 2023, IMAX delivered one of the best years in our history, with nearly $1.1 billion in global box office. And while last year's Hollywood strikes dealt the entire entertainment business a temporary setback, it was just that: temporary. We've had an incredible 2024 to date at IMAX, marked by several recent box-office successes, including Inside Out 2, Twisters and Deadpool & Wolverine, and as we look ahead, this trend shows no signs of slowing down.

Markoe explains, Every IMAX location in the world is built to our precise standards; they are designed, built and carefully maintained with meticulous care, and every location is customized for optimal viewing experiences. Only IMAX uses acousticians, laser alignment and custom theater geometry, combined with our Academy Award-winning projection technology, precision audio and optimized seating layouts, to ensure every element is immersive by design.

Joker: Folie à Deux launched on IMAX in October. (Image courtesy of Warner Bros. Pictures)

Markoe continues, Our theaters are calibrated daily to ensure audiences get perfectly tuned sound and image every time, at every location, regardless of where in the world it is. We also have incredible partnerships with filmmakers and studios. We're seeing a dramatic shift to IMAX among filmmakers and studios. We are increasingly creating specifically for the IMAX platform. [And,] we have more Filmed for IMAX titles in production than any time in our history, Markoe says. We are dramatically expanding our Filmed for IMAX program to feature many of the world's most iconic filmmakers and directors alongside rising talents in the industry. To date, we have 15 Filmed for IMAX titles set for release this year, more than double any previous year, as filmmakers and studios from Hollywood and international territories increasingly create uniquely optimized versions for the IMAX platform.

Markoe notes, To meet growing demand among filmmakers to shoot in IMAX, the company is developing and finalizing the roll-out of four next-generation IMAX 15/65mm film cameras. IMAX tapped such prolific filmmakers and cinematographers as Christopher Nolan, Jordan Peele and Hoyte van Hoytema, among others, to identify new specs and features for the prototype. The new cameras recently entered production.

Dolby Cinema is a premium cinema experience created by Dolby Laboratories that combines proprietary visual and audio technologies such as Dolby Vision and Dolby Atmos.
(Image courtesy of Dolby)

The world's largest 4DX theater is the Regal Times Square located at 247 West 42nd St. in New York City. (Image courtesy of Full Blue Productions and 4DX)

IMAX launched the Filmed for IMAX program in 2020 to certify digital cameras that were officially approved to create IMAX-format films. Markoe explains, The program is a partnership between IMAX and the world's leading filmmakers to meet their demands for the ultimate IMAX experience. Working directly with IMAX, the program allows filmmakers the ability to fully leverage the immersive IMAX theatrical experience, including expanded aspect ratio and higher resolution. Through the program, IMAX certifies best-in-class digital cameras from leading brands, including ARRI, Panavision, RED Digital Cinema and Sony, to provide filmmakers with the best guidance to optimize creatively how they shoot to best work in the IMAX format when paired with IMAX's proprietary post-production process.

IMAX continues to innovate on the cutting edge of entertainment technology, Markoe states. Our IMAX with Laser system was recently recognized with a Scientific and Technical Academy Award. Combining our industry-leading technology and new tools with the enthusiastic embrace by filmmakers to specifically and creatively design their movies to be the most immersive, high-quality presentation, we continue to find new and innovative ways to expand the IMAX canvas moving forward. We see an opportunity for our platform to serve as a conduit for sports leagues to expand their global reach and provide a launchpad for projects from some of the world's most iconic music acts. The future of cinema is multi-pronged, combining visionary works from Hollywood and local language blockbusters, original documentaries and exclusive events.

The Wild Robot landed on IMAX in September. (Image courtesy of Universal Pictures)

The E3LH QuarterView Dolby Vision Cinema Projection System. Dolby Vision is part of the company's Dolby Cinema package, which includes the Dolby Atmos sound system and special theater treatments to reduce ambient light and enhance comfort. (Image courtesy of Dolby)

DOLBY

The appetite for premium cinema is huge, and it's a clear factor in what's drawing people to see movies in theaters, says Jed Harmsen, Dolby Vice President and General Manager of Cinema & Group Entertainment. 2023 marked Dolby Cinema's strongest year in history at the box office, with U.S. Dolby Cinema ticket sales eclipsing pre-pandemic levels, up 7% from 2019. Furthermore, Dolby boasts the highest average per-screen box office among all premium large-format offerings, which is a testament to the consumers' recognition and value of enjoying their films in Dolby.

According to Comscore data, the domestic large-format gross box office was up +10.1% in 2023 vs. 2019, illustrating how the popularity of premium cinema is growing and overtaking pre-pandemic levels. Also, per Comscore, market share of the domestic large-format gross box office (in relation to the entire domestic gross box office) grew from 9.7% in 2019 to 13.3% in 2023. According to Harmsen, Premium cinema experiences are becoming a larger share of all cinema experiences. It's one of the reasons we continue to work with our cinema partners to make Dolby Vision and Dolby Atmos available to as many moviegoers around the world as possible.
We created Dolby Cinema to be the best way for audiences to see a movie, featuring the awe-inspiring picture quality of Dolby Vision together with the immersive sound of Dolby Atmos, all in a fully Dolby-designed theater environment. There are around 275 Dolby Cinemas globally.

Inside Out 2 hit IMAX giant screens in 2024. (Image courtesy of Pixar/Disney)

Twisters was unleashed on IMAX in 2024. (Image courtesy of Universal Pictures)

The IMAX 70mm Film Camera. One of the IMAX film cameras used by Christopher Nolan to shoot Oppenheimer. (Image courtesy of IMAX)

Harmsen adds, Our global footprint for Dolby Cinema spans 28 exhibitor partners and 14 countries, with the first Dolby Cinema opening in 2014 in the Netherlands. We're excited to have true collaborations with multiple trusted partners and advocates like AMC, who have been a huge proponent in bringing the magic of the Dolby Cinema experience to moviegoers.

Harmsen underscores the value of Dolby sound and vision to the viewing experience. Dolby Vision allows viewers to see subtle details and ultra-vivid colors with increased contrast ratio and blacker blacks, delivering the best version of the picture that the filmmaker intended. Dolby Atmos offers powerful, immersive audio, allowing audiences to feel a deeper connection to the story with sound that moves all around them. And Dolby's unique theater design allows audiences to experience both technologies in the best possible way by limiting ambient light, ensuring an optimal view from every premium seat and more.

Dolby Vision and Dolby Atmos have revolutionized premium movie-going and have been embraced widely by creators and exhibitors, allowing us to bring Dolby-powered media and entertainment to more and more audiences, Harmsen says. To date, more than 600 theatrical features have been released or are confirmed to be released in Dolby Vision and Dolby Atmos, including recent box-office hits like Inside Out 2, Dune: Part Two, Deadpool & Wolverine and more. Harmsen concludes, We see exhibitors continuing to outfit their auditoriums to support more premium cinema experiences to meet the demand we're seeing from moviegoers. At Dolby, we see premium as the new standard in cinema. It's clear audiences worldwide do as well.

On the outskirts of the Las Vegas strip, Sphere is a literal expansion of cinema. (Image courtesy of Sphere Entertainment)

The 4DX Cinema Sunshine Heiwajima movie theater in BIG FUN Heiwajima, an entertainment complex in Tokyo. (Image courtesy of 4DX)

4DX

4D cinema adds motion seats and multi-sensory effects to blockbuster movies. The experience adds about $8 to each ticket. South Korea's CJ 4DPLEX is the leader in this area. Globally, there are some 750 4DX screens affiliated with the company, which has teamed up with partners like Regal Cinemas. According to the Regal site, 4D movies utilize physical senses to transport viewers into a whole new viewing experience. Regal's 4DX theaters are equipped with motion-enabled chairs, which create strong vibrations and sensations, as well as other environmental controls for simulated weather or other conditions such as lightning, rain, flashing (strobe) lights, fog and strong scents.

SPHERE

On the outskirts of the Las Vegas strip, Sphere is a literal expansion of what cinema is. When not hosting concerts or events, Sphere shows high-resolution films on a wraparound screen that is 240 feet tall and covers 160,000 square feet, with a 16K by 16K resolution.
Digital Domain worked on the visual effects of Darren Aronofskys movie Postcard from Earth, shown in the gigantic spherical venue. Working on Postcard from Earth for the Sphere was an extraordinary experience, marking our debut in such an impressive venue. Collaborating with Darren Aronofsky, a filmmaker whose work weve long admired, added an extra layer of excitement as we brought his vision to life in this dramatic setting, comments Matt Dougan, Digital Domain VFX Supervisor.Deadpool & Wolverine made a historic global IMAX debut last July. (Image courtesy of Walt Disney Studios Motion Pictures)With the Apple Vision Pro, stunning panorama photos shot on the iPhone expand and wrap around the user, creating the sensation that they are standing where the photo was taken. (Image courtesy of Apple Inc.)The IMAX Commercial Laser Projector designed specifically for multiplexes. (Image courtesy of IMAX)NETFLIX HOUSENetflix House is another next-generation cinema addition. The experiential entertainment venue will bring beloved Netflix titles to life, beginning with locations in malls in Dallas, Texas, and King of Prussia, Pennsylvania, in 2025. Building on previous Netflix live experiences for Bridgerton, Money Heist, Stranger Things, Squid Game and Netflix Bites, Netflix House will go one step further and create an unforgettable venue to explore your favorite Netflix stories and characters beyond the screen year-round, according to Henry Goldblatt, Netflix Executive Editor, on the Netflix website.At Netflix House, you can enjoy regularly updated immersive experiences, indulge in retail therapy and get a taste literally of your favorite Netflix series and films through unique food and drink offerings, says Marian Lee, Netflixs Chief Marketing Officer, on the Netflix site. Weve launched more than 50 experiences in 25 cities, and Netflix House represents the next generation of our distinctive offerings. The venues will bring our beloved stories to life in new, ever-changing and unexpected ways.ARAugmented reality is expected to expand the experience of movie-going, adding interactive and immersive elements to movie posters and trailers, interaction with characters, personalized narrative and cinematic installations. Apple Vision Pro undoubtedly brings new immersive opportunities to the table with its advanced mixed-reality capabilities offering unique ways to engage with stories, comments Rob Bredow, ILM Senior Vice President, Creative Innovation and Chief Creative Officer. He explains that cinema is a highly mature art form with well-established storytelling traditions and audience expectations. While Apple Vision Pro can be used to help create compelling new experiences, its not about replacing these mediums [such as film] but rather complementing them. The device opens doors to hybrid forms of entertainment that blend interactivity and immersion in ways that are uniquely suited to its technology. [Cinema] will continue to grow and thrive, enriched by these new possibilities, but certainly not overshadowed by them.Looking forward, one of the changes in next-generation cinema may be one of content. IMAXs Markoe says, Audiences are increasingly interested in must-see events things that cannot be experienced the same way at home on a TV. Recent events, such as our broadcast of the 2024 Paris Olympics Opening Ceremony or broadcasting the 2024 NBA Finals to select theaters in China, brings these larger-than-life experiences to audiences in a way that cant be replicated elsewhere. 
Increasingly, concert films like Taylor Swift: The Eras Tour have appealed to audiences who want to feel fully immersed in the action.

Indeed, the future of cinema looks to lie more and more in premium cinema, as well as in immersive experiences that expand what movies are today.
  • OSCAR PREVIEW: NEXT-LEVEL VFX ELEVATES STORYTELLING TO NEW HEIGHTS
    www.vfxvoice.com
By OLIVER WEBB

Dune: Part Two has significantly more action and effects than Dune: Part One, totaling 2,147 VFX shots. (Image courtesy of Warner Bros. Pictures)

Godzilla Minus One made history at last year's 96th Academy Awards when it became the first Japanese film to be nominated and win an Oscar for Best Visual Effects, and the first film in the Godzilla franchise's 70-year history to be nominated for an Oscar. Will the 97th Academy Awards produce more VFX Oscar history? Certainly, VFX will again take center stage, with a number of pedigree franchises and dazzling sequels hitting movie screens in the past year. From collapsing dunes to vast wastelands, battling primates and America at war with itself, visual effects played a leading role in making 2024 a memorable, mesmerizing year for global audiences.

Dune: Part One won six Academy Awards in 2022, including Best Achievement in Visual Effects, marking Visual Effects Supervisor Paul Lambert's third Oscar. Released in March, Dune: Part Two is an outstanding sequel and has significantly more action and effects than the first installment, totaling a staggering 2,147 visual effects shots. The film is a strong contender at this year's Awards. It was all the same people from Part One, so our familiarity with Denis's [Villeneuve] vision and his direction allowed us to push the boundaries of visual storytelling even further, Lambert says.

The production spent a lot more time in the desert on Dune Two than on Dune One. Cranes were brought in and production built roads into the deep deserts of Jordan and Abu Dhabi. Concrete slabs were also built under the sand so that the team could hold cranes in place for the big action sequences. A lot of meticulous planning was done by Cinematographer Greig Fraser to work out where the sun was going to be relative to particular dunes, Lambert explains.

Editorial and postvis collaborated with the VFX team to create a truly unique George Miller action sequence for Furiosa: A Mad Max Saga. (Image courtesy of Warner Bros. Pictures)

We had an interactive view in the desert via an iPad that gave us a virtual view of these enormous machines at any time of day. This allowed us, for example, to figure out the shadows for the characters running underneath the spice crawler legs and the main body of the machine. VFX was then able to extend the CG out realistically, making it all fit in the same environment. Dune: Part One was a collaborative experience, but Dune: Part Two was even more so as we went for a much bigger scale with lots more action.

The first topic discussed during pre-production among department heads and Villeneuve was the worm-riding scenes. Villeneuve envisaged Paul Atreides mounting the worm from a collapsing dune, an idea that immediately struck the team as visually stunning and unique. The challenge lay in making this concept and the rest of the worm-riding appear believable. Filming for the worm sequences took place in both Budapest and the UAE. A dedicated worm unit was established in Budapest for the months-long shoot. The art department built a section of the worm on an SFX gimbal surrounded by a massive 270-degree sand-colored cone. This setup allowed the sun to bounce sand-colored light onto the actors and stunt riders who were constantly blasted with dust and sand, Lambert describes. Shooting only occurred on sunny days to maintain the desert atmosphere. Most of the actual worm-riding shots were captured here, except for the widest shots, which were later augmented with CG.
In post-production, the sand-colored cone was replaced with extended, sped-up, low and high-flying helicopter footage of the desert.The VFX team at Framestore delivered 420 shots for Deadpool & Wolverine, while Framestores pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. (Image courtesy of Marvel Studios)Wt FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. Remarkably, there are only 38 non-VFX shots in the film. (Image courtesy of Walt Disney Studios Motion Pictures)Earlier footage from Gladiator was blended into Gladiator II flashbacks and live-action, especially original Gladiator crowd footage and in the arenas. The Colosseum set for Gladiator II was detailed as closely as possible to the first film. (Photo: Aidan Monaghan. Courtesy of Paramount Pictures)Blowing up the Lincoln Memorial for Civil War was shot in a parking lot in Atlanta. The single-story set was extended with VFX and the explosion grounded in real footage. Soldiers fired at a bluescreen with a giant hole in the middle. (Image courtesy of A24)For the collapsing dune scene, an area was scouted in the desert, and then a 10-foot-high proxy dune crest was created on flat desert.Three concrete tubes attached to industrial tractors were buried in this proxy dune and were used to create the collapsing effect while a stunt performer, secured by a safety line, ran across and descended into the collapsing sand as the tubes were pulled out. We could only attempt this once a day because of the need to match the light to the real dune, and the re-set to rebuild the crest took a few hours. On the fourth day, Denis had the shot he wanted. Post-production work extended the dunes apparent height to match the real dune landscape. The sequence was completed with extensive CG sand simulations of the worm moving through dunes, all contributing to the believability of this extraordinary scene.Mad Max: Fury Road was nominated for Best Visual Effects at the 2016 Academy Awards. Spin-off prequel/origin story Furiosa: A Mad Max Saga, the fifth installment in the Mad Max franchise, is the first of the films not to focus on the eponymous Max Rockatansky. DNEG completed 867 visual effects shots for the finished film. When DNEG came onboard with the project, main conversations were focused on the scope of the film and the variety of terrains and environments. Furiosa covers much more of the Wasteland than Fury Road did and details a lot of places that had only been touched on previously, notes DNEG VFX Supervisor Dan Bethell. It was really important that each environment have its own look, so as we travel through the Wasteland with these characters, the look is constantly changing and unique; in effect, each environment is its own character.Twisters features six tornadoes for which ILM built 10 models. (Images courtesy of Universal Pictures)Twisters features six tornadoes for which ILM built 10 models. (Images courtesy of Universal Pictures)The Stowaway sequence was particularly challenging for the visual effects team to complete. Apart from being 240 shots long and lasting 16 minutes, it had a lot of complex moving parts; vehicles that drive, vehicles that fly, dozens of digi-doubles, plenty of explosions and, of course, the Octoboss Kite! says Bethell. Underneath it all, a lot of effort also went into the overall crafting of the sequence, with editorial and postvis collaborating with our VFX team to create a truly unique George Miller action piece. 
The Bullet Farm Ambush was also a big challenge, although one of my favorites. Choreographing the action to flow from the gates of Bullet Farm down into the quarry as we follow Jack, then culminating with the destruction of, well, everything was very complex. We work often on individual shots, but to have over a hundred of them work together to create a seamless sequence is tough.Working on a George Miller project is always a unique experience for Bethell. Everything is story-driven, so the VFX has to be about serving the characters, their stories and the world they inhabit. Its also a collaboration; the use of VFX to support and enhance work from the other film departments such as stunts, SFX, action vehicles, etc. I enjoy that approach to our craft. Then, for me, its all about the variety and scope of the work. Its rare to get to work on a film with such a vast amount of fresh and interesting creative and technical challenges. On Furiosa, every day was something new, from insane environments and FX to the crazy vehicles of the Wasteland this movie had it all!Robert Zemeckis Here follows multiple generations of couples and families that have inhabited the same home for over a century. The movie required de-aging Tom Hanks and Robin Wright. Nearly the entire movie was touched by VFX in some form or another. (Images courtesy of TriStar Pictures/Sony)Robert Zemeckis Here follows multiple generations of couples and families that have inhabited the same home for over a century. The movie required de-aging Tom Hanks and Robin Wright. Nearly the entire movie was touched by VFX in some form or another. (Images courtesy of TriStar Pictures/Sony)Alex Garlands Civil War required over 1,000 visual effects shots as Garland pushed the importance of realism. The more grounded and believable we could make Civil War, the scarier it would be, notes Production VFX Supervisor David Simpson. We deliberately avoided Hollywood conventions and set a rule that all inspiration should be sourced from the real world. Every element VFX brought to the film had a real-world reference attached to it drawing from documentaries, news footage, ammunition tests and war photography.Due to the strict rules about shooting from the skies above Washington D.C., capturing the aerial shots of the Capitol would have been impossible to do for real. This resulted in full CG aerial angles over D.C. and the visual effects team building their own digital version, which covered 13 square miles and 75 distinct landmarks, thousands of trees, buildings, lampposts and a fully functioning system of traffic lights spread over 800 miles of roads. Plus, there are roadworks, buildings covered in scaffolding, parked cars, tennis courts and golf courses, Simpson adds. One of my favorite touches is that our city has cranes because all major cities are constantly under construction!The visual effects team went even further, building a procedural system to populate the inside of offices. When the camera sees inside a building, you can make out desks, computers, potted plants, emergency exit signs, water coolers. The buildings even have different ceiling-tile configurations and lightbulbs with slight tint variations. We literally built inside and out! Once the city was complete, it was then turned into a war zone with mocap soldier skirmishes, tanks, police cars, explosions, gunfire, helicopters, debris, shattered windows and barricades.Here follows multiple generations of couples and families that have inhabited the same home for over a century. 
Three sequences in the film were particularly CG-dominant, the first being the neighborhood reveal, which was the last shot in the movie. It was challenging mainly because it was subject to several interpretations, compositions and lighting scenarios, and the build was vast, says DNEG VFX Supervisor John Gibson. The sequence surrounding the houses destruction was also incredibly complex due to the interdependence of multiple simulations and elements, which made making changes difficult and time-consuming.Godzilla x Kong: The New Empire was directed by Adam Wingard, who developed a distinctive and appealing visual style for the film. Compelling VFX work was completed by Wt, Scanline VFX, DNEG and Luma Pictures, among others. (Images courtesy of Warner Bros. Pictures and Legendary Entertainment. GODZILLA TM & Toho Co., Ltd.)Dune: Part Two was even more of a collaborative experience than Dune: Part One, on a bigger scale with more action. (Image courtesy of Warner Bros. Pictures)The biggest challenge was the grand montage, which required seamless transitions through various time periods and environments. The Jurassic Era beat was especially challenging in that we needed to flesh out a brand-new world that had real-time elements mixed with accelerated time elements, and they all had to be set up to transition smoothly into the superheated environment and maintain a consistent layout, Gibson details. By far the most challenging aspect of the grand montage was the tree and plant growth. As it would have been very difficult to modify existing plant growth systems to match our cornucopia of plant species using the existing software available for foliage animation and rendering, we had to develop a host of new techniques to achieve the realistic results we were after.Gibson lauds the collaborative spirit of the team. He cites their willingness to experiment, learn new techniques and support each other as instrumental in overcoming the challenges of the condensed production schedule. Boundaries between departments dissolved, folks seized work to which they thought they could contribute, there was little hesitation to bring in and learn new software or techniques, and we brainstormed together, constantly looking for better and better ways to get results. Thats what stood out to me: the cohesion within the team.Cassandra inserting her hand through Mr. Paradoxs head was one of the many challenging VFX shots required for Deadpool & Wolverine. (Image courtesy of Marvel Studios)Framestore VFX Supervisor Robert Allman praises Marvels collaborative approach to VFX on Deadpool & Wolverine, which he describes as a melting pot for filmmakers and artists. (Images courtesy of Marvel Studios)Framestore VFX Supervisor Robert Allman praises Marvels collaborative approach to VFX on Deadpool & Wolverine, which he describes as a melting pot for filmmakers and artists. (Images courtesy of Marvel Studios)I love Marvels collaborative approach to VFX things are often hectic at the end, but that is because stuff is still being figured out, largely because its complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas can end up in the film. For hard-working VFX artists, nothing is better than that.Robert Allman, VFX Supervisor,Deadpool & WolverineWt FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. Remarkably, there are only 38 non-VFX shots in the film. 
VFX Supervisor Erik Winquist ran through a gauntlet of challenges, from a cast of 12 new high-res characters whose facial animation needed to support spoken dialogue, to a minute-long oner set in an FX extravaganza with 175 apes and 24 horses to choreograph, he notes. The scenes that Id say were the most challenging were those that featured large water simulations integrating with on-set practical water, digital apes and a human actor. The bar for reality was incredibly high, not only for the water itself but also in having to sell that waters interaction with hairy apes, often in close-ups. It was an incredibly satisfying creative partnership for me and the whole team, working with [director] Wes Ball. From the start, he had a clear vision of what we were trying to achieve together and the challenge was about executing that vision. It gave us unshifting goal posts that we could plan to, and we knew that we were in safe hands working on something special together. That knowledge created a great vibe among the crew.More shooting time was spent in the desert on Dune: Part Two than on Dune: Part One. Cranes were brought in and production built roads deep into the deserts of Jordan and Abu Dhabi, UAE. (Image courtesy of Warner Bros. Pictures)Strict rules about shooting from the skies above Washington, D.C. prevented capturing aerial shots of the Capitol for Civil War, which resulted in full CG aerial angles over D.C. and the VFX team building a digital version covering 13 square miles and 75 distinct landmarks. (Image courtesy of A24)Deadpool & Wolverine has grossed more than $1.264 billion at the box office, a staggering feat. The VFX team at Framestore delivered 420 shots, while Framestores pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. Robert Allman served as Framestore VFX Supervisor on the film. I love Deadpool, so it was tremendously exciting to be involved in making one, he explains. However, more than this, I love Marvels collaborative approach to VFX things are often hectic at the end, but that is because stuff is still being figured out, largely because its complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas really can end up in the film. For hard-working VFX artists, nothing is better than that.The atomizing of Cassandra in the final sequence was technically tough to achieve. Making a completely convincing digital human and the atomizing effects as detailed and dynamic as the shots demanded was a huge challenge. Most problematic was creating an effect within the borders of good taste when the brief disintegrate the face and body of a human seems to call for gory and horrifying. Many takes of this now lie on the digital cutting-room floor. An early wrong turn was to reference sandblasted meat and fruit, for which there are a surprisingly large number of videos on YouTube. However, this real-world physics gave rise to some stomach-churning simulations for which there was little appetite among filmmakers and artists alike. In the end, the added element of searingly hot, glowing embers sufficiently covered the more visceral elements of the gore to make the whole thing, while still violent, more palatable to all concerned.Traveling through the Wasteland with the characters of Furiosa: A Mad Max Saga, the look is constantly changing and unique. Each environment had to have its own look and, in effect, became its own character. (Images courtesy of Warner Bros. 
Pictures)
Ridley Scott's Gladiator was met with critical acclaim upon its release in 2000. It won five awards at the 73rd Academy Awards, including Best Visual Effects. Nearly 25 years later, Gladiator II hits screens as one of the most anticipated releases of the year. Last year, Scott's Napoleon was also nominated for Best Visual Effects, and Scott's films are, more often than not, strong contenders at the Awards.
Work for Gladiator II was split between Industrial Light & Magic, Framestore, Ombrium, Screen Scene, Exceptional Minds and Cheap Shot, with 1,154 visual effects shots required for the film. For Visual Effects Supervisor Mark Bakowski, the baboon fight sequence was particularly daunting. Conceptually, this was a tough one, he explains. Very early on, Ridley saw a picture of a hairless baboon with alopecia. It looked amazing and terrifying but also somewhat unnatural. Most people know what a baboon looks like, but a baboon with alopecia looks a bit like a dog. Framestore did a great job and built a baboon that looked and moved just like the reference, but viewed from certain angles and in action, unfortunately, it didn't immediately sell baboon. It's one thing to see one in a nature documentary, but to have one in an action sequence with no introduction or explanation was a visual challenge.
One of the biggest challenges facing the VFX team on Kingdom of the Planet of the Apes was the cast of 12 new high-res characters whose facial animation needed to support spoken dialogue. (Images courtesy of Walt Disney Studios Motion Pictures)
Bakowski explains that working with Ridley Scott was a crazy and unique experience. So many cameras and such scale, it's a real circus and Ridley's very entertaining. He talks to everyone on Channel 1 on the radio, so you can follow along with his thought process, which is by turns educational, inspirational and hilarious. A lovely man. I enjoyed working with him. The VFX team was all fantastic and so capable both on our production side and vendor side. I've never worked with such an amazing bunch on both sides. Our production team was a well-oiled machine, sometimes in both senses but mainly in terms of efficiency, and, vendor side, it's great just being served up these beautiful images by such talented people. Both made my job so much easier. The locations were stunning, both scouting and shooting. 99% of the film was shot in Malta and Morocco, so you're there for a long time; you get to immerse yourself in it. That was multiplied by the fact we got impacted by the strikes, so we ended up going back to Malta multiple times. I felt I got to know the island quite well and loved it and the people. That said, I won't be going back to Malta or Morocco for a holiday soon.
I feel like I've had my fill for a while!
Other outstanding releases that could potentially compete for Best Visual Effects include Twisters, which took everyone by storm earlier in 2024 (with ILM as the main vendor), Godzilla x Kong: The New Empire, featuring compelling work by Wētā, Scanline VFX, DNEG and Luma Pictures, among others, and A Quiet Place: Day One, a fresh, frightening addition to the Quiet Place series.
  • CONTEXTUALIZING VIRTUAL PRODUCTION DESIGN
    www.vfxvoice.com
    By TREVOR HOGGAs technology advances, there are ramifications that create new skill sets and cause traditional approaches to be reassessed. This includes the role of the production designer in virtual production, where physical set builds and digital backgrounds are combined to create an in-camera shot that at one time could have only been completed in post.A consequence of further entwining the art department and visual effects is the emergence of the virtual art department (VAD) and virtual production designer. Does this mean that projects will be divided between a production designer and a virtual production designer? Thats the core question, notes Alex McDowell, Co-Owner and Creative Director at Experimental Design. There is no possible way that having a virtual production designer and an actual production designer is useful. The production designer has always taken command of whatever tools are available to them, so its evolutionary. Its reasonable to consider virtual production as another aspect of the design space. If the production designers job is to essentially frame the story, which means everything that contextualizes the narrative is the designers responsibility. There are a lot of divisions to that, like the cinematographer, costume designer, special effects and stunts.Minority Report (2002) had the first digital art department. (Image courtesy of 20th Century Fox)Virtual production for product designers is very flexible. You can use it as just a live session with them directing the cameras and making requests in the engine while sitting with an operator. We can also offer more control to the viewer, allowing them to view the sets in VR, giving them a sense of scale and flow, or with virtual cameras and mocap allowing framing within virtual sets and testing story beats early and making changes to the set accordingly.Michael Zaman, Realtime Supervisor, FramestoreAn effort to bridge the gap between the art department and visual effects was made during the making of Minority Report. In 2000, when I started to get embedded in technological digital aspects of production with Minority Report, which had the first digital art department. From that point forward, we were beginning to mix tools dramatically. Now, when you speak with Andrew Leung, for example, he is using the tools of visual effects from start to finish, McDowell states. There was a division based on the platforms in which we worked or the tools that we used, particularly when visual effects took command of the back end of the digital. The front end, which the art department created for the virtual, fell into a black hole in the center of production and had to be rebuilt by visual effects. There was this incredible inefficiency where the production designer was designing the film, but the production system did not understand that the visual effects in the back end were an extension of the environment that the designer created. What we did was to send 14 books of material to visual effects so there was a through-line of intent, content and design.Andrew Leung started off as a visual effects artist before becoming a concept designer. (Image courtesy of Andrew Leung)Aiding in maintaining the visual aesthetic throughout the filmmaking process is virtual production, as the focus is on creating content before shooting commences rather than relying entirely on post-production. 
There is still this idea that visual effects, art department and virtual art department are somehow different as opposed to a through-line of execution, McDowell remarks. You wouldnt separate the construction coordinator, painters and all of the people who make the in-camera physical sets from the art department. Visual effects is doing exactly the same as construction; theyre taking design intent and building it until its finished for camera. When I started Star Wars: Episode IX with Colin Trevorrow, ILM, the production designer and the art team worked together from the beginning. Even though we werent doing virtual production because there were pre-LED screens in the frame, we were building virtual assets for the director to scout in VR. It was very efficient because the flow was continuous then. Visual effects is giving all of their knowledge to the front end so there is no waste to the assets being created.Added stress is placed upon the visual effects team. Theyre taking stuff from the end and moving it to the front, and that becomes an issue in terms of scheduling, observes Concept Designer Andrew Leung. With post, you can always extend it, but there is a huge pressure to get everything designed before shooting. There are some advantages to that, as we get a lot of stuff for free in lighting. At the center of the design process is Unreal Engine. If you want to put a few trees in there, usually the application will quit on you. With Unreal Engine, I can put in millions of trees and it wont complain. That alone opens up huge design opportunities. Unreal Engine has given the ability to design whole worlds. Leung notes, I did a pitch for a Paramount film where I built a whole map for a huge fantasy film, so we were able to talk about it in such a way with the director that you could travel around and talk about how characters travel. It was the same process when I worked on The Lord of the Rings: The Rings of Power, where I built whole sections of Middle Earth, and we were able to talk about travel times for the characters. You couldnt do that before because simply the technology wasnt there.Andrew Leung conceptualizes a battle and the arrival of the witch in Mulan (2020). (Image courtesy of Walt Disney Studios Motion Pictures)Size matters when it comes to shooting in a volume. The biggest complaint is that people tend to overbuild the volume, where they go, We have the biggest volume on stage, Leung states. Most of the time you want to use it for small sets, which comes with its own host of issues. I did a volume shoot with Alex McDowell over at Amazon, and we were constantly butting heads with people running the stage who were telling us that the set was too small. We didnt want to redesign the set much larger than it was because that wasnt part of the story. We kept going back and forth. Eventually, the set was made bigger than it should be and looked funky. A portable volume is what we want. We find that more useful because were not locked to a particular stage. For an LED stage volume to move forward in the future, the portable solution is going to be the best.At the center of the design process for the virtual art production is Unreal Engine. Concept art by Andrew Leung for Black Panther. (Image courtesy of Andrew Leung and Marvel)Being proficient with 3D software is critical. If you are already working in concept design, you should be familiar with set design stuff like SketchUp, Vectorworks and Rhino, Leung remarks. 
I dont know a single concept designer right now who does not have any kind of 3D skills. The contemporary concept designer, at the bare minimum, should know Blender. If you are talking about me specifically, Im slightly unusual in that I came from visual effects before going into the art department. I am well aware of the post process and use that as part of my design. What matters is the final result, not the medium. I remember working with Jan Roelfs, who designed Gattaca, and one of the things that he said that always stuck in my head was, Pixels are plywood. What a lot of production designers now love is, Wow. More of my stuff is making it into post, instead of it being this contentious relationship between post and the art department. Now, its more tied together.If you want to put a few trees in there, usually the application will quit on you. With Unreal Engine, I can put in millions of trees and it wont complain. That alone opens up huge design opportunities.Andrew Leung, Concept DesignerBeing proficient in 3D software is critical nowadays for concept artists. Concept art of Shuris Lab in Black Panther: Wakanda Forever. (Image courtesy of Andrew Leung and Marvel)Virtual production excels when dealing with visual effects-heavy films where locations are limited and not explored extensively. You build two pieces of a set that is then extended in the volume, and you have this wonderful view of Tatooine, states Supervising Art Director Chris Farmer. Your foreground is a practical set, but everything else beyond that is built in Unreal Engine as opposed to going out on location and building all of that stuff. Those types of things that are fantastical or not something you could capture by going out with a 360 camera and shooting out on location and setting your scene in front of it. Its something that requires fantasy and visual effects-heavy work that you would do in front of a greenscreen, and all of the post is done later. Its something you can build before you shoot and eliminate a lot of visual effects work. Process work is also elevated. We did a lot of putting cars in the volume and running 360 moving plates behind them. These cars driving down the highway you would never know. A police car parked on the highway with a moving plate in the background. Its flawless.Virtual production excels when dealing with visual effects-heavy films and shows that have limited locations that are not explored extensively, such as The Mandalorian. (Image courtesy of Disney+ and Lucasfilm Ltd.)Practical sets for The Mandalorian are constructed in the foreground while everything else is built in Unreal Engine. (Image courtesy of Disney+ and Lucasfilm Ltd.)The flexibility to create controllable environments ranging from Shanghai in the 1930s to the futuristic ones found in The Mandalorian is a major advantage of virtual production. (Image courtesy of Disney+ and Lucasfilm Ltd.)There is going to be a shift where designers have to start working with and understanding a virtual art department, and getting up in front and convincing producers that it can be done, believes Farmer. You can design sets, environments and locations. You can scan locations, and import and manipulate them to get what you want. You can build Shanghai in the 1930s in Unreal Engine, light it, give it the atmosphere it needs, and shoot it without spending weeks on end in a location and then having to do visual effects on top. Twilight is not restricted to a couple of hours within a day. 
Farmer adds, You have total control of the location and the scene on the fly. You can change it and do whatever you want. Ive been talking to a friend who is a designer about promoting the idea that the art department with cinematographers, designers and directors can deliver almost completed scenes lit and ready to go. You just add your foreground pieces and actors. Rightly or wrongly, a lot of decisions get made in the back end; theyre not necessarily the people who are the creative forces driving the picture. The designer and cinematographer should be the ones making all of those decisions.Virtual production is about giving production designers a flexible 3D workspace where they can interactively make requests and expect a rapid turnaround. Behind the scenes on the Netflix show 1899. (Images courtesy of Netflix)When I started The Mandalorian with Andrew Jones and ILM, we had a regular art department with set designers that Doug Chiang would send down. We would get our designs down to ILM who would break them down, turn them into sets and draw them up in 3D, Farmer recalls. At the same time, we began to build a virtual art department, which housed modelers, lighters and texture artists in a separate room because, at the time, they were technically non-union, and the other part of the art department was union. I do think that the push is to embrace the virtual art department and bring it into the art department, [especially] when you get to a bigger scale doing virtual art department work like Fallout, which has got a lot of volume work. It is my understanding, from the coordinator I had previously worked with, that she was a virtual art department coordinator. It was a separate department managed by a different team, but also reporting to the production designer and art directors.These days, its hard to find a feature or episodic show that does not utilize some form of CG in their planning phase. From Barbie. (Image courtesy of Warner Bros.)As with most elements in production, proper planning ensures the greatest chance for success for virtual production. From Tim Webbers 2023 short film FLITE. (Image courtesy of Framestore Films and Inflammable Films)Virtual production for production designers is very flexible. Virtual production for production designers is about giving them a flexible 3D workspace where they can interactively make requests and expect a rapid turnaround, states Michael Zaman, Realtime Supervisor at Framestore. It requires artists to be mindful of the way we build these virtual sets and make sure we consider the way the production designers, directors and other stakeholders might want to change things on the fly and be prepared for these changes. The artists need to make sure assets are prepared with optimization and customization in mind, breaking elements into well-thought-out chunks that allow for quick movement. Without this, the process becomes very similar to CG production design. Virtual production for product designers is very flexible. You can use it as just a live session with them directing the cameras and making requests in the engine while sitting with an operator. We can also offer more control to the viewer, allowing them to view the sets in VR, giving them a sense of scale and flow, or with virtual cameras and mocap allowing framing within virtual sets and testing story beats early and making changes to the set accordingly.I have not heard of a virtual production designer, admits Connor Ling, Virtual Production Supervisor at Framestore. 
On a typical show, its simply an extension of a traditional production designer. When working with a production, our real-time supervisor becomes more of a virtual art director, falling underneath the production designer. These days, you will be stretched to find a feature or episodic show that doesnt utilize some form of CG in its planning phase. Commercials vary a little more due to the time they have, but even if we compare to a fully animated CG film, the role of a production designer is still used or they may lean into an art director more. Extra time is required in pre-production. Ling notes, As with most elements in production, proper planning ensures the greatest chance for success. I would love to continue to see productions adopt and commit to the practices earlier in the production process and fully utilize the technology to the best of their ability. Of course, this is dependent on need and whether virtual production is correct for their project. When it is and planned appropriately, thats when you see the best results.
  • VISUAL EFFECTS AND ANIMATION BRING HISTORICAL EVENTS TO LIFE FOR DOCUMENTARIES
    www.vfxvoice.com
    By TREVOR HOGGWhen it comes to covering historical events, documentarians go on a journey to find and acquire rights to archive footage and photographs or fill in the visual gaps with talking heads or reenactments. In some cases, the reenactments are more about being authentic to the emotion of a moment than to the actual physical details. With technology becoming more affordable and accessible, the ability to have visual effects and animation within a tight budget has allowed for even more creative and innovative ways to bring the past to cinematic life.Bad RiverWe do social justice documentaries, states Andrew Sanderson, Associate Producer at 50 Eggs Films.Bad River deals with a Native American tribe called the Bad River Band, located in Northern Wisconsin, who are fighting for their sovereignty. Some things are happening now, and some things happened back in 1845 or 1850 that we dont have any photos, footage or music from, so we had to be creative when we were making the film. We want to tell stories the best we can. A lot of the Elders who we interviewed from the band would tell stories of Chief Buffalo, the historic chief of La Pointe Band of the Ojibwe, and other Ojibwe leaders going to Washington, D.C. in 1852 to try to convince President Millard Fillmore not to remove them from their land. These are stories that have been passed down from generation to generation, and its important for us to get it right but let the folks doing the interview tell their story.There is a great sense of community, so we wanted to include Bad River as much as we could in the filmmaking process. We would identify some local youth artists in the area. They would make sketches for us of different scenes or elements we were trying to capture. Then we take those sketches and give them to Punkrobot, an animation company in Chile, which would bring them to life. Andrew Sanderson, Associate Producer, 50 Eggs Films, Bad RiverIllustrations by Bad River Band youths as well as courtroom drawings were the inspiration for the animated sequences created by Punkrobot. (Images courtesy of 55 Eggs Films)Sanderson employed unique approaches to making the film. He remarks, There is a great sense of community, so we wanted to include Bad River as much as we could in the filmmaking process. We would identify local youth artists in the area, and they would make sketches for us of different scenes or elements we were trying to capture. Then we take those sketches and give them to Punkrobot, an animation company in Chile, which would bring them to life.Jackie ShaneAnimated sequences were expanded upon. There is a scene where one of the interviewees is describing when he was younger, people from the Bureau of Indian Affairs driving around the reservation trying to catch kids to bring them to boarding schools, Sanderson explains. We had one of our youth artists draw a man coming out of a car. Then we would have Punkrobot animate that and bring it even a step further into a whole animated sequence. Sometimes, it would transition to another still that we had or another piece of media, so it flowed well. In another example, we had licensed some black-and-white footage of the front lawn of the White House that had sheep eating the grass. We had Punkrobot sketch out what would be the next scene, and from there, it transitioned into the sketch of the interior of the White House where theyre plotting to take land from different reservations. 
A legal battle between the Bad River Band and Canadian oil and gas pipeline operator Enbridge is included. They had a case that was in Madison Western District Superior Court, so we werent allowed to have any photographs or recording devices in the court, but we wanted to show what was going on. We hired a courtroom sketch artist, told him who the key people were, and had him get a selection of sketches over two days. Then, we had Punkrobot animate those sketches to tell the story of what was going on in the courtroom when we couldnt have told it any other way visually. Sanderson adds, We basically used different mediums and blended them all together to make sequences that are visually appealing and can help bring people into the story.Machine learning and Stable Diffusion enabled the animated sequences to go from 15 to 40 minutes of screentime in Any Other Way: The Jackie Shane Story. (Images courtesy of Banger Films and the National Film Board of Canada)We developed an interesting visual effects process where we ended up with something that was shot relatively inexpensively, and through clever piecing together of strange techniques, we made it look as though 2,000 frames were painted by hand.Luca Tarantini, Director of Animation, Any Other Way: The Jackie Shane StoryPiecing together the life of a trans soul singer, who is revered along with her contemporaries Etta James and Little Richard, and who vanished from public view 40 years ago, is Any Other Way: The Jackie Shane Story, directed by Michael Mabbott and Lucah Rosenberg-Lee and produced by Banger Films and the National Film Board of Canada. We had to bring Jackies story to life, and roto seemed like a cost-effective way to do that because we are starting with an actor, not doing animation from scratch, which can be expensive and not look good if you dont have the right team, remarks Director of Animation Luca Tarantini. We developed an interesting visual effects process where we ended up with something that was shot relatively inexpensively, and through clever piecing together of strange techniques, we made it look as though 2,000 frames were painted by hand. Machine learning and Stable Diffusion were cornerstones of the animation process. Stable Diffusion is meant for you to type in a sentence, and it generates an image of that thing. But we were using it where you start with an image, type in a bit of prompt, and it gives you an interpretation of that original image. If you get the settings just right, it doesnt distort the original image enough but stylizes it in the correct way.Adding flares and working with a virtual set in the animated sequences for Any Other Way: The Jackie Shane Story. (Images courtesy of Banger Films and the National Film Board of Canada)As the edit evolved, it became clear that the animation was a major component of the storytelling and consequently went from 15 to 40 minutes of screen time. Not only did the amount of animation and the time we spent on it have to change, it became impossible without experimenting with new techniques to try to make it feasible for a tiny team of two or three people to deal with that volume of content, notes Co-Director of Animation Jared Raab. We managed to mix a bit of everything that everybody knew from shooting on an actual soundstage in a scrappy, music video-style way, greenscreen. 
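For readers curious what that image-to-image pass looks like in practice, a minimal sketch using the open-source diffusers library appears below. The model ID, prompt and settings are illustrative assumptions rather than the production's actual setup, but the low denoising strength is the knob Tarantini alludes to when he talks about stylizing a frame without distorting it.

```python
# A rough sketch of the image-to-image stylization pass described above:
# start from a filmed frame, give Stable Diffusion a short style prompt,
# and keep the denoising "strength" low so the frame is painted over
# rather than replaced. Everything here is illustrative, not the show's setup.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

frame = Image.open("frame_0001.png").convert("RGB")   # hypothetical source frame

stylized = pipe(
    prompt="hand-painted portrait, visible brush strokes, 1960s soul club",
    image=frame,
    strength=0.3,        # low strength: preserve the original composition
    guidance_scale=7.0,
).images[0]

stylized.save("frame_0001_painted.png")
```

Running every frame through the same pass with a fixed random seed is one common way to keep a painted look reasonably stable from frame to frame.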
Luca pioneered simple camera tracking to get camera position data for when he created the backgrounds, which were made in 3D using Cinema 4D, then I did a ton of Adobe After Effects work to create some of the 2D animation of the space. Last, Luca created entirely 3D lighting using the camera data to get the lens flares and some of the stuff that we loved from early archival music documentaries. It was a sprinkling of a little bit of everything that we knew how to make a film into the project, and the chemistry gave us just the right recipe to pull it off.Pigeon TunnelUnion VFX made a shift from working on feature films and high-end episodic to contributing to the Errol Morris documentary The Pigeon Tunnel, which explores the life and career of John le Carr through a series of one-on-one interviews with the former intelligence officer turned acclaimed novelist. Generally, visual effects for documentaries are all about enhancing the audiences understanding of the real-life events and subject matter that the narrator is talking about, observes David Schneider, DFX and Technical Supervisor for Union VFX. It is important for the work to focus on realism and subtle invisible effects that stay true to the historical moments being described during the interview. The core value of a documentary is to educate, so we generally have to keep augmentation minimal, not exaggerate, and retain a factually accurate depiction of events. Digital augmentation was not confined to one aspect, as there were 154 visual effects shots, and five assets had to be created. Schneider adds, We handled everything from equipment removal during interview shots to creating CG creatures and augmenting environments. The films many dramatizations gave Union VFX the chance to shine with standout assets, like an unlucky pigeon and a Soviet freighter. One of the highlights was a nighttime airplane sequence where we delivered several fully CG shots that brought the scene to life.For the Monte Carlo pigeon shoot sequence, we needed a close-up of a pigeon being shot out of the sky. To achieve this, we had to create an entirely new feather simulation system that captured the realistic movement of feathers when the pigeon was hit. While weve worked with CG birds before, this was the first time we had been so close to the camera that individual feathers were clearly visible.David Schneider, DFX and Technical Supervisor, Union VFX, The Pigeon TunnelUnion VFX handled everything from equipment removal during interview shots to creating CG creatures and augmenting environments for a total of 154 visual effects for The Pigeon Tunnel. (Images courtesy of Union VFX and Apple)Early on, Union VFX received detailed storyboard animatics. It helped us get on the same page, and since documentaries dont typically use heavy visual effects, this was invaluable, Schneider states. Some scenes required complex augmentation. For example, the sequence in which Kim Philby makes his escape to the Soviet Union required us to build the Dolmatova [a Soviet-era freighter], place it into provided plates, and enhance the surrounding dock with cargo and a digital gangway leading to the ship. All of this was integrated into the practical fog that was present on set. For the Monte Carlo pigeon shoot sequence, we needed a close-up of a pigeon being shot out of the sky. To achieve this, we had to create an entirely new feather simulation system that captured the realistic movement of feathers when the pigeon was hit. 
While we've worked with CG birds before, this was the first time we had been so close to the camera that individual feathers were clearly visible. We meticulously modeled the texture and styled the pigeon's feathers to ensure they moved naturally, both in flight and when they detached from the bird.
Endurance
Cutting back and forth from the ill-fated 1915 Antarctica expedition to the South African research vessel S.A. Agulhas II searching the Weddell Sea in 2022 for the sunken ship captained by renowned Irish explorer Ernest Shackleton is the National Geographic documentary Endurance, directed by Elizabeth Chai Vasarhelyi, Jimmy Chin and Natalie Hewit. There were 28 men, and most of them wrote diaries or were able to tell their stories after the fact, so there is a lot of historical detail, states Producer Ruth Johnston. We used AI voice conversion technology so that every word that you hear is from one of seven guys [from the expedition] who lead us through the story [by reading from their writings]. Virtual content was built for three separate re-creations of three different campsites with various types of ice floes in the backgrounds. These ice floes were important because it was something we would not have been able to easily recreate in real life, remarks Virtual Production Supervisor Eve Roth. We color-corrected the virtual snow around the camp to match what the art department ended up putting down. Because we knew what kinds of harsh weather we were trying to recreate for the campsites, the virtual content was created in a way where we could dial up or down the wind and snow effects. We were also able to change the type of clouds in the sky, to dial that up and down.
These ice floes were important because it was something we would not have been able to easily recreate in real life. We color-corrected the virtual snow around the camp to match what the art department ended up putting down. Because we knew what kinds of harsh weather we were trying to recreate for the campsites, the virtual content was created in a way where we could dial up or down the wind and snow effects. We were also able to change the type of clouds in the sky, to dial that up and down.
Eve Roth, Virtual Production Supervisor, Endurance
Stept Studios focused on the reenactments. We had the urge to chase some fancy camera work, but ultimately, we wanted to shoot it the same way Frank Hurley [Endurance Expedition's official photographer] would have, on sticks with composed frames, explains Nick Martini, Founder and Creative Director of Stept Studios. This visual approach allowed us to intercut our footage with the archival material seamlessly. Most of the visual effects work was completed before production. Our efforts were centered around building the environments where the story takes place using Unreal Engine, Martini states. Those worlds were then projected in LED volume stages to be used as interactive backgrounds on a stage in Los Angeles. This allows for an organic in-camera look when we shoot and provides more realistic lighting than a traditional greenscreen approach. In post, some additional clean-up and effects were added to sell the gag.
(Weddell Sea Pictures/Jasper Poore)
(Photo: Jasper Poore. Image courtesy of Weddell Sea Pictures)
(Photo: Frank Hurley. Image courtesy of BFI)
(Photo: James Blake. Image courtesy of Falklands Maritime Heritage Trust)
(Photo: Nick Birtwistle. Image courtesy of Falklands Maritime Heritage Trust)
(Photo: Esther Horvath. Image courtesy of National Geographic)
(Photo: Esther Horvath.
Image courtesy of National Geographic)Intercut with contemporary footage of the expedition to find Endurance, the backstory of the sunken ship was told through historical photographs taken by Frank Hurley as well as reenactments taking place in a parking lot and LED volume.Atmospherics were added to the archival still photographs. We didnt want effects to overwhelm or take away from the original photography, rather to enhance the imagery or add impact in dramatic moments, states Josh Norton, Executive Creative Director and Founder of BigStar. Blowing smoke and snow were added only when we felt those moments of drama were necessary or the original photo called for it.Orienting the audience is a collection of maps showing the progression of both expeditions. The filmmakers had a desire to make sure the films graphics didnt feel too expected or conservative, Norton remarks. We were able to work with colorful type, energetic transitional language and texture while still making sure that we were being accurate to the historical research, especially on the maps. As for any lessons learned from the project, he replies, Dont go to the Weddell Sea without a backup plan!
  • RIDING ON THE BACK OF GIANTS FOR DUNE: PART TWO
    www.vfxvoice.com
By TREVOR HOGG
Images courtesy of Warner Bros. Pictures.
No doubt, the entire success of Dune: Part Two was riding on the shoulders of Paul Atreides as he takes his maiden journey on top of a massive desert creature. This was something that filmmaker Denis Villeneuve was strongly aware of, so he had a separate unit working on the sequence over a period of three months. Given the sandworm does not actually exist, this iconic moment could not be accomplished without the expertise of Visual Effects Supervisor Paul Lambert, who won an Oscar for the first installment. The first discussion we had with HODs was about the worm ride. Denis gave this incredible pitch that was a completely original idea, which was: For a Fremen to get onto a worm he had to climb a dune, the worm would burst through the dune, the Fremen would go down with the sand and land on the worm. We never went into how you get off a worm! That could be for later. It was like, Oh, my god, what an absolutely incredible idea! How the heck do we do that?
The worm doesn't exist without sand. You're talking about incredible simulations which have to be created to get that scale. We saw it briefly in Dune: Part One, and that was in darkness. But for this one, Paul was actually going to get onto the worm in broad daylight. There wasn't going to be hiding anything whatsoever. We started down a path of some early development with DNEG as to how we have a worm crash through a dune.
Paul Lambert, Visual Effects Supervisor
An example of a lighting panel utilized by Cinematographer Greig Fraser.
Co-existing with the sand is the sandworm. The worm doesn't exist without sand, Lambert notes. You're talking about incredible simulations which have to be created to get that scale. We saw it briefly in Dune: Part One, and that was in darkness. But for this one, Paul [Timothée Chalamet] was actually going to get onto the worm in broad daylight. There wasn't going to be hiding anything whatsoever. We started down a path of some early development with DNEG as to how we have a worm crash through a dune. They went through trying to get the speed of the worm to be correct. DNEG used what is called a ball pit render and would start the simulations with those size particles of sand because they could do fast iterations. We would stay in low-res until it felt right, and that's when DNEG would up-res. If you imagine DNEG working in an area that was 80,000 balls in a pit, once they up-res it there were 800 million! They got smaller and smaller and far more complicated. DNEG found that every time the tube [the worm] was rammed through a dune, the dune would explode. They went around and around in circles until somebody had the idea of, Why don't we open its mouth as it's coming out? That solved it. We wanted to keep the simulation as believable as possible without cheating and cutting corners because the moment you do that you'll get to these amazing renders, but something won't look right and you can't put your finger on it.
Greig Fraser prepares to shoot a scene inside of a practically-built ornithopter cockpit.
A badass moment occurs during the Harkonnen harvester attack when Chani defends herself by firing a rocket launcher at a Harkonnen soldier. Denis was worried that it might look too comical, Lambert reveals. We shot a stuntie being pulled on a rig, but after that we realized it required a lot of re-timing and obviously a digital double to take over. It took a lot of time to get that to feel right and to make sure it didn't feel comedic.
Chani also uses the rocket launcher to shoot down an Harkonnen ornithopter defending the harvester. Every explosion that you see is digital. We did shoot some reference of explosions out in the desert. I actually got to film explosions out in the desert this time. It wasnt allowed on Dune [Part One] because where we were shooting, it wasnt the best idea to be setting off explosions. All of those explosions were mainly for reference, such as what a big explosion would be like in the hot desert. The guys and girls at DNEG did a fantastic job creating the simulation. When Chani takes out the Harkonnen ornithopter, we did shoot reference, but not for when the Harkonnen soldier gets shot up into the spice crawler.DNEG found that every time the tube [the worm] was rammed through a dune, the dune would explode. They went around and around in circles until somebody had the idea of, Why dont we open its mouth as its coming out? That solved it. We wanted to keep the simulation as believable as possible without cheating and cutting corners because the moment you do that youll get to these amazing renders, but something wont look right and you cant put your finger on it.Paul Lambert, Visual Effects SupervisorMaking use of LiDAR scans from his iPhone and Unreal Engine, Greig Fraser was able to meticulously plan to ensure that every shot was backlit.Oppenheimer had ramifications beyond the box office as it influenced how the atomic missile strike is portrayed in Dune: Part Two. It was cool, Lambert remarks. We tried to keep it as much as plate-based as possible. That sequence obviously has a huge digital effect in the background and Greig Fraser [Cinematographer] did some additional lighting on the characters, which had to be extended onto the background. We started off with a conventional nuclear bomb, but this was also the year of Oppenheimers release, and we were using the same references from the 1940s. We then veered to something different. Denis wanted a manga look to it. The idea was this isnt a nuclear bomb, but more like a super high explosive. It was a big, old TNT explosion, which meant that you had dust, sand and rocks flying everywhere. We shot some practical elements of people running, but there is a substantial amount of digital crowd running, simulated and motion captured for that purpose.We started off with a conventional nuclear bomb, but this was also the year of Oppenheimers release, and we were using the same references from the 1940s. We then veered to something different. Denis wanted a manga look to it. The idea was this isnt a nuclear bomb, but more like a super high explosive. It was a big, old TNT explosion, which meant that you had dust, sand and rocks flying everywhere. We shot some practical elements of people running, but there is a substantial amount of digital crowd running, simulated and motion captured for that purpose.Paul Lambert, Visual Effects SupervisorHaving a clear vision is important to Denis Villeneuve, who has a on-set conversation with Rebecca Ferguson.Giedi Prime, which is the home world of House Harkonnen, orbits around a black sun, and has a monochromatic palette for the daylight exterior shots. Greig did some intriguing camera tests. You had two crew members both with black T-shirts, and one would stay black and the other would go white. You didnt know why. It could be because of the weave or material or temperature. You could never tell what would change to a different tone. 
One thing it did do was give this subsurface look to the skin and make your light look super sci-fi. Denis fell in love with that particular look, but if we go down this path you can't undo it. These are modified cameras from ARRI. You're capturing the full spectrum, infrared and real light. Then you're desaturating it to get that particular look. But we also wanted to be able to transition from real to that world. Rather than try to rely on a full-on digital effect to try to recreate the infrared, we built a stereo camera rig. One camera is vertical and the other is horizontal, shooting through a mirror or across the mirror. One camera is infrared and the other is RGB, which means we get the exact same image with one in infrared and the other in RGB. Then I can do a collage roto where you can transition from one to the other. That worked out well for when the Bene Gesserit come in or when the Baron is coming from the inside of his stadium to the outside light. The idea being that the outside is a different atmosphere where you have this infrared light.
Some crowd replication was required and rubber blades had to be digitally fixed for the fight between Paul Atreides (Timothée Chalamet) and Feyd-Rautha Harkonnen (Austin Butler).
A black oil aesthetic emphasizes that House Harkonnen resembles a virus rather than a bastion of humanity.
Machine learning assisted in producing 1,000 blue-eye shots found in the sequel.
Accompanying the Emperor to Arrakis is the Imperial Tent, which is attached to his ship. You could have a simple shot where you put this great big battle occurring behind you, and have a shot where you're not looking at the battle but looking at the Imperial Tent, which was then attached to the chrome ball, Lambert states. That chrome ball saw the whole world in 360. The idea behind the Imperial Tent was that it started flat, and the Emperor's ship, the chrome ball, would pull that up so the actual structure would appear in this almost pyramid shape. You don't actually get to see how it got made. What we were able to do was a technique we developed on the first one and used extensively again on Dune: Part Two, which was proxy shooting. Rather than building out this interior tent to its final in-camera texture, we would build this proxy version of it. You would get the overall tone and shape. I would then go in in post, because I didn't have to shoot up against bluescreen or greenscreen, and add additional texture to it. It's a far more believable process than just having bluescreen or greenscreen.
To avoid the need for additional sand simulations, a rule was made to not go over previous footprints in the sand when shooting.
DNEG, Wylie Co. and Territory Studio provided 2,156 visual effects shots, while Rodeo FX contributed concept art and MPC did some of the previs. [The extensive crowd work] was potentially one of the harder things to wrangle and get right, Lambert remarks. We did a lot of 2D and 3D replication. Tiling was done for a couple of shots which was then augmented with thousands of others. A returning visual effects element that was refined further for the sequel was the blue eyes of the Fremen, caused by their consumption of spice. On the first movie, Wylie Co. had to roto 300 shots. We knew on the second movie there would be a hell of a lot more blue eyes. Nuke has a new feature called CopyCat where you can say, If this image is doing that, try to replicate doing this.
I did some initial tests, and we found that if I could feed it images of our actors' faces from the first one along with the mattes, it could be trained to figure out whenever it saw an eye it could make this matte. The ones that didn't work, we roto'd and put them back into it. It was a five-month process. By the end of it, we had trained the models on 77,000 pairs of eyes; 400 of the 1,000 shots were done completely with machine learning.
Dunes are treated as characters in their own right.
Infrared cameras were deployed for the exterior daylight fight scenes on Giedi Prime, which orbits a black sun.
An overriding element that had to be kept in mind when dealing with the mammoth creature that lives beneath the surface is that the sandworm would not exist without sand.
For when Chani turns to see the Imperial Tent explode, Greig Fraser captured the shot in the United Arab Emirates while the rest of the oner involved stitching various motion capture performances together.
A significant challenge for the visual effects team was all of the crowd work.
All of the explosions were digitally created.
Industrial tracks held black screens to get the proper shadows for when Paul Atreides and Chani attack the Harkonnen harvester.
Unlike the first movie, Paul Lambert was able to shoot practical explosions as reference for Dune: Part Two.
Finding a dune for Paul Atreides to run across that was backlit and had the right wind direction was not easy given the time of year.
Every explosion that you see is digital. We did shoot some reference of explosions out in the desert. I actually got to film explosions out in the desert this time. It wasn't allowed on Dune [Part One] because where we were shooting, it wasn't the best idea to be setting off explosions. All of those explosions were mainly for reference, such as what a big explosion would be like in the hot desert.
Paul Lambert, Visual Effects Supervisor
Chani watching the Imperial Tent explode was a huge shot to execute. In [the first] Dune, we did something called the Paul oner where he's in his dream state and is fighting all of these characters. In Dune: Part Two, we had Chani doing something similar. Chani gets up, runs, fights, and she turns her head seeing the explosion of the Imperial Tent in the distance. That is a 35-second all-digital shot apart from the end shot of Chani. We shot Chani's turn out in the desert. Greig lensed it in the correct light, and we then had to back in the entire shot so it finished on that particular moment. That shot went through some interesting stages. All of the characters fighting are motion capture. We motion captured that out in Hungary using tens of stunties fighting. We actually motion captured Zendaya. We stitched her performance and all of the fighters together. We then took that to the studio at Digital Domain and brought Greig back in to compose with Denis. Greig had a virtual camera and was able to follow Zendaya through the move. We then re-lensed it based on all of the animation in there, and what you see is actually Greig's camera move going all of the way up to the point where he shot her in the United Arab Emirates. That shot took months and months to do. But again, we had a plan, so it was just going through it. It was a slow burn. It was a step-by-step progression of giving notes.
Thats how Ive always worked with Denis.A separate worm unit was created that spent three months to capture Paul Atreides riding the sandworm.Wind machines were used to blow the sand to the point where the stunt performer would turn orange by the end of shooting.When the sun was not cooperating, the worm unit had another assignment to complete, which was photographing the fetus of Paul Atreides sister Alia. In the corner of the studio in a tank was a prosthetic baby, which we shot that through the glass, Lambert reveals. This was a project in itself trying to get some beautiful textures around it. The idea was from that plate photography; I would add CG blood flow to give it some life. Every time you see it blink, its only CG around the eyes, but the rest of it is a prosthetic. Dune: Part Two is a film that tries to use the best technique it can for the particular visual. In that case, having a real prosthetic was the key.
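Lambert's description of CopyCat amounts to supervised image-to-matte learning: show a network frames of a face alongside hand-rotoscoped eye mattes until it can produce the matte on its own for new frames. The sketch below is a generic PyTorch stand-in for that idea, not Foundry's implementation; the tiny network, crop size and random placeholder tensors are assumptions for illustration only.

```python
# Generic image-to-matte training sketch (assumption: a stand-in for the
# CopyCat-style workflow described above, not the actual Nuke tool).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Sequential(                 # toy encoder-decoder: RGB crop -> 1-channel matte
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(32, 32, 4, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),    # logits; sigmoid is applied by the loss
)

# Placeholder data: in practice these would be face crops from the first film
# paired with their rotoscoped eye mattes (the "77,000 pairs of eyes").
frames = torch.rand(256, 3, 128, 128)
mattes = (torch.rand(256, 1, 128, 128) > 0.95).float()
loader = DataLoader(TensorDataset(frames, mattes), batch_size=16, shuffle=True)

optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(3):
    for img, matte in loader:
        optim.zero_grad()
        loss = loss_fn(model(img), matte)
        loss.backward()
        optim.step()

# Inference: a predicted matte for a new frame, ready to drive the blue-eye grade.
with torch.no_grad():
    pred = torch.sigmoid(model(torch.rand(1, 3, 128, 128)))
print(pred.shape)  # torch.Size([1, 1, 128, 128])
```

The feedback loop Lambert mentions maps directly onto this setup: shots where the prediction fails get rotoscoped by hand and folded back into the training set.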
  • 20 YEARS ON: REVISITING THE MATRIX RELOADED AND REVOLUTIONS, & OTHER 21st CENTURY VFX CLASSICS
    www.vfxvoice.com
    By TREVOR HOGGA true mark of innovation is when one can look back two decades later and still be impressed by what was achieved given the technology, tools and resources available at the time. Say what you will about The Matrix Reloaded (2003) and The Matrix Revolutions (2003), but it is amazing the number of top-level visual effects professionals who have emerged from them, such as John DJ DesJardin (Man of Steel) and Dan Glass (The Tree of Life). Adding to the complexity of The Matrix Reloaded and The Matrix Revolutions were the logistics of having both shot at the same time in sequential order and released six months apart from each other.While John Gaeta (The Matrix) was the Overall Visual Effects Supervisor, the real and virtual worlds were divided respectively between DesJardin and Glass. I was always fascinated by the real world as a subject because of this notion it had become a dystopian nightmare, DesJardin notes. The idea that we were going to go to Zion and see all of the aspects of it, not just where they live but the machines that keep it running and the big temples. Then the fetus fields and right down to Machine City. Those are my favorite things. The battle was a nail-biter to get that done. I recently came across a shooting assessment that I made and delivered to the producers for the how we were going to shoot the guys waging the battle in the APUs [Armoured Personnel Unit]. It was a big motion-control effort. But I will say, when I learned the story of the film, one of my favorite moments and couldnt wait to get a handle on to make was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. Its a great idea, and I love the way it came out.The street scene took awhile because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there!Dan Glass, Visual Effects Supervisor, The Matrix Reloaded & RevolutionsThe Matrix Reloaded & RevolutionsIn the Oracles Kitchen set shooting for The Matrix Reloaded and Revolutions, Digital Effects Producer Diana Giorgiutti takes chrome and gray-ball notes with Visual Effects Supervisor Kim Libreri. Visual Effects Supervisor Dan Glass is off to the side left with James McTeigue (1st AD) and Bill Pope (DP). (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Visual Effects Supervisor John DJ DesJardin was responsible for executing the Sentinel fight sequence in The Matrix Revolutions, which involved full-sized APU units being on set at Fox stages in Sydney. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Crater action from The Matrix Revolutions, with Keanu Reeves lining up to shoot accompanied by mud and rain. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Visual Effects Supervisor Dan Glass on set with the late Diana Giorgiutti shooting chrome and gray balls for The Matrix Reloaded and Revolutions. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)A notable shot of the Agent Smith mannequins from the Super Burly Brawl scene in The Matrix Revolutions where the many Smiths watch Agent Smith fight Neo in the rain. (Image courtesy of Diana Giorgiutti and Warner Bros. 
Pictures)
Cutting-edge digital human technology was utilized to create the Burly Brawl and Super Burly Brawl involving the massive onslaught of Agent Smith clones. I watched the Burly Brawl fairly recently, and the reason why it holds up is the split-screen work is all photography, and as you get into the more virtual work, for its time it was ambitious and pulled off some incredible things, Glass remarks. The Super Burly Brawl took the longest to shoot. There was a side thing where the Wachowskis didn't want just rain. The raindrops had to be oversized. The special effects team was trying to figure out how to make this extra-wet, blobby rain. The street scene took a while because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there! And a lot of rain. It was grueling for us and I imagine as well for the actors. The digital doubles of Agent Smith were not simply carbon copies. You always try to bring some level of individuality so it feels more credible. The advantage of working in the Matrix for those movies was it was about a simulation, so it gave us some leeway, Glass adds.
Spider-Man 2
Sam Raimi rehearses a scene with Alfred Molina during the shooting of Spider-Man 2. (Image courtesy of Columbia Pictures)
A signature fight in Spider-Man 2 occurs onboard a speeding train, which showcases the lethal might of Doc Ock. (Image courtesy of Columbia Pictures)
A technological and creative breakthrough for Spider-Man 2 was the fidelity achieved in producing a digital double of Alfred Molina as Doc Ock. (Image courtesy of Columbia Pictures)
Each one of the tentacles of Doc Ock was given a distinct personality with the circular light in the center conveying a sentient quality harkening back to HAL in 2001: A Space Odyssey. (Image courtesy of Columbia Pictures)
It was important to give the digital double of Spider-Man the correct inertia and ground the camerawork to make the performance believable. (Image courtesy of Columbia Pictures)
When I learned the story of the film, one of my favorite moments and couldn't wait to get a handle on to make was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. It's a great idea, and I love the way it came out.
John DJ DesJardin, Visual Effects Supervisor, The Matrix Reloaded & Revolutions
Set in New York City, Eternal Sunshine of the Spotless Mind (2004) explores what happens if a couple breaks up and goes through a medical procedure to get rid of their memories of each other. This was the best script I've ever read in my life, states Louis Morin, who was at the time a Visual Effects Supervisor for Buzz Image Group and made suggestions about ways of erasing memories, such as having abstractions melt and disappear. The producer said that Michel Gondry (Be Kind Rewind) didn't want any visual effects supervisor on set. It was to be a free camera style of filmmaking and no lights, like Breathless by Jean-Luc Godard. Camera tracking was hellish. There was the Pan from Hell, which is exactly the Breathless shot and from the peculiar mind of a director who decided to flip the image so that the actor was walking into a flipped image of himself.
We then had to marry the two together with tracking, morphing, and put in a telephone pole to help us out. The camera goes four times like that. As Joel Barish (Jim Carrey) keeps going back and forth on the street, the details in the imagery begin to fade away. We had to redo the whole store in CG to be able to erase everything step by step, Morin adds.
Eternal Sunshine of the Spotless Mind
CG chopsticks were added to make the shot transition seamless for the sofa bed scene in Eternal Sunshine of the Spotless Mind. (Image courtesy of Louis Morin and Universal Pictures)
To visually depict memories being erased, subtle details were removed, such as a leg belonging to Clementine Kruczynski (Kate Winslet). (Image courtesy of Louis Morin and Universal Pictures)
The Pan from Hell in Eternal Sunshine of the Spotless Mind required seamlessly transitioning back and forth from footage that was flipped 180 degrees. (Image courtesy of Louis Morin and Universal Pictures)
Then there was the preceding moment featuring a missing leg belonging to Clementine Kruczynski (Kate Winslet) and a falling car. Michel wanted to have the first moment indicating that the memory of Clementine is being erased and he said, Remove a leg, Morin recalls. I said to him, Nobody is going to notice that. We did it and nobody was seeing it. Also, Clementine didn't turn her head at the right time, so we Frankenstein'd the shot by taking the head from a longer take, which worked well, but nobody was noticing the leg again! Somebody suggested that we could have a car fall down from the sky. Everybody thought it was ridiculous, but Michel said, Let's do it. We had to do that entire background and car in CG. At the end, everybody liked the idea, and it was powerful. There were also subtle digital adjustments taking place. Joel falls off of the sofa bed and reverses back into another shot of him on the sofa bed eating Chinese food with chopsticks. But the chopsticks weren't working so we had to make them CG. A major visual effect was the collapsing house. At first Michel was talking about doing some optical iteration of the image. It wasn't looking great. Then Michel asked, Can we have a chimney collapsing? Upon seeing the test, he went, Wow. Can we have the house collapsing? The house became entirely CG and was destroyed by using rigid body dynamics, Morin says.
When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it.
John Dykstra, Production Visual Effects Supervisor, Spider-Man 2
Departing from the normal routine of recruiting directors from within, Pixar collaborated with Brad Bird (The Iron Giant) to produce a superhero family adventure. In the process of making The Incredibles (2004), Visual Effects Supervisor Rick Sayre had to work out what he calls open problems, such as simulating the long, straight hair of Violet Parr, which was a key part of her character. Violet is a teenage girl and her power is mostly defensive, Sayre notes. She puts up a shield or turns invisible because she wants to disappear. You will often see her with one eye. She is hiding behind her hair. That was important to Brad. The existing hair simulation system had to be overhauled to allow for interaction.
Explains Sayre, One of our tricks before was to randomly connect some sets of hairs to other hairs of invisible springs; that would allow for a coif to retain its volume, but that technique doesnt work with long hair because either the hair flattens out if there are no springs, or it looks like cement. We ended up embedding the simulation hairs inside of a volume, which was how they were able to couple their motions and collision responses to each other in a way that still isnt as computationally expensive as every hair looking at every other hair. Its as if theyre embedded in a block of invisible goo that is modulating these responses. We also used that block of invisible goo to infer some information that we used for lighting, shading and shadowing.The house in Eternal Sunshine of the Spotless Mind became entirely CG and was destroyed by using rigid body dynamics. (Image courtesy of Louis Morin and Universal Pictures)A comedic sequence occurs when Fashion Designer Edna Mode demonstrates how indestructible her superhero suits are to Helen Parr by putting them through a series of extreme tests, such as a flame thrower. Its funny you mention that because it has an almost live-action approach, Sayre remarks. We hadnt done a big effects film. Whats happening in the simulation chamber is done by a different team with a different set of techniques, even a different renderer, than Helen and Enda sitting on the other side of the glass. They are essentially on a set looking through the window at greenscreen where nothing is happening, pretending to react to all of this stuff that is done later and comped together. There was no way that we could have done all of that at the same time in the same system. The thing that caused our team the most headaches were the super suits, which were tight-fitting and caused simulation stability and collision fidelity issues. Because Edna is so amazing, her super suits have special visual properties. Theyre shiny and had these surface characteristics where we would see these rendering artifacts coming from the guts of how Catmull-Clark subdivision surfaces got rendered in RenderMan of the day. At some point, we were using a different kind of subdivision surface or a loop subdivision and then reprojecting it. The super suits that Edna doesnt make, like Syndrome, were easier to deal with because theyre more like regular cloth.Being able to make the protagonist crawl walls and swing through the air from buildings in a believable manner, and giving the antagonist mechanical tentacles that have a mind of their own, were a couple of many challenges John Dykstra faced as the Production Visual effects Designer on Spider-Man 2 (2004). Sam Raimi (A Simple Plan) wanted to use as many practical elements as he could, so we pursued Doc Ock that way, Dykstra remarks. It is tough for puppets to defy physics because they are in the real world and want to work in real time. We went through and prevised the entire sequence, and did computer-generated imagery for those shots where we felt puppeteering was impractical. The arms of [Doc Ock] were a digital endeavor from the get-go. The art department worked with us in terms of the design, and we worked with the vendor to figure out the animation look in regards to the speed and mass and how the arms worked. We were defying gravity a lot in Spider-Man 2. Realism was built into the CG camerawork. 
When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it.The IncrediblesA progression illustrating what dinner is like for the Parr family in The Incredibles. (Images courtesy of Disney/Pixar)The superpowers in The Incredibles are an extension of the character traits. (Image courtesy of Disney/Pixar)Concept art of the Parr home dining room. (Image courtesy of Disney/Pixar)Concept art by Don Shank exploring a major action sequence that occurs during the third act of The Incredibles. (Image courtesy of Disney/Pixar)Dealing with the long, straight hair, which was essential to the character of Violet, was a major technological hurdle to overcome for The Incredibles. (Image courtesy of Disney/Pixar)The Incredibles was the first time for Pixar that the principal cast consisted entirely of stylized human characters. (Image courtesy of Disney/Pixar)An extremely hard shot was the tight closeup of Doc Ock falling. Trust me, that was torn from the artists hands by the time it was put into the film! Dykstra laughs. The idea was to have a moment where we actually featured a CGI character with emotional content, and the challenge was to do it in a way that you would be convinced that it was real, especially when youre doing something with a real person. I suppose Alfred Molina could have done it, but I dont imagine he could have been underwater for so long! CG skin is always tricky. Things like pores and inconsistencies in surface reflectivity often contribute to the complex and somewhat visually noisy thing that is human flesh. In theory, Spider-Man is an ideal CG character because the material of the suit has a smooth matte finish and no hair or fur has to be simulated. When there is an absence of natural phenomenon, you end up questioning the verisimilitude of what youre looking at. It was important to improve upon the specular nature of the suit, and the way it wrinkled had variations in the texture of the surface of the body while it was in motion. Spider-Man 2 occurred during a transitional period from analog to digital solutions. Dykstra states, One of the things that we had to work on in that era was including world noise. We had to take the perfection of the computer-generated model and haul it back into the realm of the real world. Stuff like film grain and how it was reacting. Was it out or in focus? We had to study that to figure out how to apply it to the shots because its the filter through which you see the world.
    0 Comments ·0 Shares ·108 Views
  • 20 YEARS ON: REVISITING THE MATRIX RELOADED AND REVOLUTIONS, AND OTHER 21 st CENTURY VFX CLASSICS
    www.vfxvoice.com
    By TREVOR HOGGA true mark of innovation is when one can look back two decades later and still be impressed by what was achieved given the technology, tools and resources available at the time. Say what you will about The Matrix Reloaded (2003) and The Matrix Revolutions (2003), but it is amazing the number of top-level visual effects professionals who have emerged from them, such as John DJ DesJardin (Man of Steel) and Dan Glass (The Tree of Life). Adding to the complexity of The Matrix Reloaded and The Matrix Revolutions were the logistics of having both shot at the same time in sequential order and released six months apart from each other.While John Gaeta (The Matrix) was the Overall Visual Effects Supervisor, the real and virtual worlds were divided respectively between DesJardin and Glass. I was always fascinated by the real world as a subject because of this notion it had become a dystopian nightmare, DesJardin notes. The idea that we were going to go to Zion and see all of the aspects of it, not just where they live but the machines that keep it running and the big temples. Then the fetus fields and right down to Machine City. Those are my favorite things. The battle was a nail-biter to get that done. I recently came across a shooting assessment that I made and delivered to the producers for the how we were going to shoot the guys waging the battle in the APUs [Armoured Personnel Unit]. It was a big motion-control effort. But I will say, when I learned the story of the film, one of my favorite moments and couldnt wait to get a handle on to make was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. Its a great idea, and I love the way it came out.The street scene took awhile because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there!Dan Glass, Visual Effects Supervisor, The Matrix Reloaded & RevolutionsThe Matrix Reloaded & RevolutionsIn the Oracles Kitchen set shooting for The Matrix Reloaded and Revolutions, Digital Effects Producer Diana Giorgiutti takes chrome and gray-ball notes with Visual Effects Supervisor Kim Libreri. Visual Effects Supervisor Dan Glass is off to the side left with James McTeigue (1st AD) and Bill Pope (DP). (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Visual Effects Supervisor John DJ DesJardin was responsible for executing the Sentinel fight sequence in The Matrix Revolutions, which involved full-sized APU units being on set at Fox stages in Sydney. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Crater action from The Matrix Revolutions, with Keanu Reeves lining up to shoot accompanied by mud and rain. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)Visual Effects Supervisor Dan Glass on set with the late Diana Giorgiutti shooting chrome and gray balls for The Matrix Reloaded and Revolutions. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)A notable shot of the Agent Smith mannequins from the Super Burly Brawl scene in The Matrix Revolutions where the many Smiths watch Agent Smith fight Neo in the rain. (Image courtesy of Diana Giorgiutti and Warner Bros. 
Pictures)Cutting-edge digital human technology was utilized to create the Burly Brawl and Super Burly Brawl involving the massive onslaught of Agent Smith clones. I watched the Burly Brawl fairly recently, and the reason why it holds up is the split-screen work is all photography, and as you get into the more virtual work, for its time it was ambitious and pulled off some incredible things, Glass remarks. The Super Burly Brawl took the longest to shoot. There was a side thing where the Wachowskis didnt want just rain. The raindrops had to be oversized. The special effects team was trying to figure out how to make this extra-wet, blobby rain. The street scene took awhile because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there! And a lot of rain. It was grueling for us and I imagine as well for the actors. The digital doubles of Agent Smith were not simply carbon copies. You always try to bring some level of individuality so it feels more credible. The advantage of working in the Matrix for those movies was it was about a simulation, so it gave us some leeway, Glass adds.Spider-Man 2Sam Raimi rehearses a scene with Alfred Molina during the shooting of Spider-Man 2. (Image courtesy of Columbia Pictures)A signature fight in Spider-Man 2 occurs onboard a speeding train, which showcases the lethal might of Doc Ock. (Image courtesy of Columbia Pictures)A technological and creative breakthrough for Spider-Man 2 was the fidelity achieved in producing a digital double of Alfred Molina as Doc Ock. (Image courtesy of Columbia Pictures)Each one of the tentacles of Doc Ock was given a distinct personality with the circular light in the center conveying a sentient quality harkening back to HAL in 2001: A Space Odyssey. (Image courtesy of Columbia Pictures)It was important to given the digital double of Spider-Man the correct inertia and ground the camerawork to make the performance believable. (Image courtesy of Columbia Pictures)When I learned the story of the film, one of my favorite moments and couldnt wait to get a handle on to make was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. Its a great idea, and I love the way it came out.John DJ DesJardin, Visual Effects Supervisor, The Matrix Reloaded & RevolutionsSet in New York City, Eternal Sunshine of the Spotless Mind (2004) explores what happens if a couple breaks up and go through a medical procedure to get rid of their memories of each other. This was the best script Ive ever read in my life, states Louis Morin, who was at the time a Visual Effects Supervisor for Buzz Image Group and made suggestions about ways of erasing memories, such as having abstractions melt and disappear. The producer said that Michel Gondry (Be Kind Rewind) didnt want any visual effects supervisor on set. It was to be a free camera style of filmmaking and no lights, like Breathless by Jean-Luc Godard. Camera tracking was hellish. There was the Pan from Hell, which is exactly the Breathless shot and from the peculiar mind of a director who decided to flip the image so that the actor was walking into a flipped image of himself. 
We then had to marry the two together with tracking, morphing, and put in a telephone pole to help us out. The camera goes four times like that. As Joel Barish (Jim Carrey) keeps going back and forth on the street, the details in the imagery begin to fade away. We had to redo the whole store in CG to be able to erase everything step by step, Morin adds.Eternal Sunshine of the Spotless MindCG chopsticks were added to make the shot transition seamless for the sofa bed scene in Eternal Sunshine of the Spotless Mind. (Image courtesy of Louis Morin and Universal Pictures)To visually depict memories being erased, subtle details were removed, such as a leg belonging to Clementine Kruczynski (Kate Winslet). (Image courtesy of Louis Morin and Universal Pictures)The Pan from Hell in Eternal Sunshine of the Spotless Mind required seamlessly transitioning back and forth from footage that was flipped 180 degrees. (Image courtesy of Louis Morin and Universal Pictures)Then there was the preceding moment featuring a missing leg belonging to Clementine Kruczynski (Kate Winslet) and a falling car. Michel wanted to have the first moment indicating that the memory of Clementine is being erased and he said, Remove a leg, Morin recalls. I said to him, Nobody is going to notice that. We did it and nobody was seeing it. Also, Clementine didn't turn her head at the right time, so we Frankenstein'd the shot by taking the head from a longer take, which worked well, but nobody was noticing the leg again! Somebody suggested that we could have a car fall down from the sky. Everybody thought it was ridiculous, but Michel said, Let's do it. We had to do that entire background and car in CG. At the end, everybody liked the idea, and it was powerful. There were also subtle digital adjustments taking place. Joel falls off of the sofa bed and reverses back into another shot of him on the sofa bed eating Chinese food with chopsticks. But the chopsticks weren't working so we had to make them CG. A major visual effect was the collapsing house. At first Michel was talking about doing some optical iteration of the image. It wasn't looking great. Then Michel asked, Can we have a chimney collapsing? Upon seeing the test, he went, Wow. Can we have the house collapsing? The house became entirely CG and was destroyed by using rigid body dynamics, Morin says.When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it.John Dykstra, Production Visual Effects Supervisor, Spider-Man 2Departing from the normal routine of recruiting directors from within, Pixar collaborated with Brad Bird (The Iron Giant) to produce a superhero family adventure. In the process of making The Incredibles (2004), Visual Effects Supervisor Rick Sayre had to work out what he calls open problems, such as simulating the long, straight hair of Violet Parr, which was a key part of her character. Violet is a teenage girl and her power is mostly defensive, Sayre notes. She puts up a shield or turns invisible because she wants to disappear. You will often see her with one eye. She is hiding behind her hair. That was important to Brad. The existing hair simulation system had to be overhauled to allow for interaction. 
Explains Sayre, One of our tricks before was to randomly connect some sets of hairs to other hairs with invisible springs; that would allow for a coif to retain its volume, but that technique doesn't work with long hair because either the hair flattens out if there are no springs, or it looks like cement. We ended up embedding the simulation hairs inside of a volume, which was how they were able to couple their motions and collision responses to each other in a way that still isn't as computationally expensive as every hair looking at every other hair. It's as if they're embedded in a block of invisible goo that is modulating these responses. We also used that block of invisible goo to infer some information that we used for lighting, shading and shadowing.The house in Eternal Sunshine of the Spotless Mind became entirely CG and was destroyed by using rigid body dynamics. (Image courtesy of Louis Morin and Universal Pictures)A comedic sequence occurs when Fashion Designer Edna Mode demonstrates how indestructible her superhero suits are to Helen Parr by putting them through a series of extreme tests, such as a flame thrower. It's funny you mention that because it has an almost live-action approach, Sayre remarks. We hadn't done a big effects film. What's happening in the simulation chamber is done by a different team with a different set of techniques, even a different renderer, than Helen and Edna sitting on the other side of the glass. They are essentially on a set looking through the window at greenscreen where nothing is happening, pretending to react to all of this stuff that is done later and comped together. There was no way that we could have done all of that at the same time in the same system. The thing that caused our team the most headaches was the super suits, which were tight-fitting and caused simulation stability and collision fidelity issues. Because Edna is so amazing, her super suits have special visual properties. They're shiny and had these surface characteristics where we would see these rendering artifacts coming from the guts of how Catmull-Clark subdivision surfaces got rendered in RenderMan of the day. At some point, we were using a different kind of subdivision surface or a loop subdivision and then reprojecting it. The super suits that Edna doesn't make, like Syndrome's, were easier to deal with because they're more like regular cloth.Being able to make the protagonist crawl walls and swing through the air from buildings in a believable manner, and giving the antagonist mechanical tentacles that have a mind of their own, were a couple of the many challenges John Dykstra faced as the Production Visual Effects Designer on Spider-Man 2 (2004). Sam Raimi (A Simple Plan) wanted to use as many practical elements as he could, so we pursued Doc Ock that way, Dykstra remarks. It is tough for puppets to defy physics because they are in the real world and want to work in real time. We went through and prevised the entire sequence, and did computer-generated imagery for those shots where we felt puppeteering was impractical. The arms of [Doc Ock] were a digital endeavor from the get-go. The art department worked with us in terms of the design, and we worked with the vendor to figure out the animation look in regards to the speed and mass and how the arms worked. We were defying gravity a lot in Spider-Man 2. Realism was built into the CG camerawork. 
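Sayre's block of invisible goo is, in effect, a coarse volume that averages the motion of whatever hair vertices pass through each cell, so neighboring strands respond together without an n-squared hair-to-hair comparison. Below is a minimal sketch of that coupling idea, assuming plain NumPy arrays of hair-vertex positions and velocities; the function name, grid cell size and blend weight are hypothetical illustrations, not Pixar's code.

```python
# Illustrative sketch of volume-coupled hair motion (hypothetical names and
# parameters, not Pixar's implementation).
import numpy as np

def couple_hairs_through_volume(positions, velocities, cell=2.0, blend=0.3):
    """Nudge each hair-vertex velocity toward the average of its voxel,
    so nearby strands move and collide coherently without an n^2 search."""
    # Hash every vertex into a cell of a coarse voxel grid.
    keys = np.floor(positions / cell).astype(np.int64)
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    # Accumulate per-voxel velocity sums and vertex counts.
    sums = np.zeros((len(uniq), 3))
    counts = np.zeros(len(uniq))
    np.add.at(sums, inverse, velocities)
    np.add.at(counts, inverse, 1.0)
    local_average = sums[inverse] / counts[inverse, None]
    # The "goo": blend each vertex toward its local average velocity.
    return (1.0 - blend) * velocities + blend * local_average

# Toy usage: 1,000 random hair vertices drifting in random directions.
rng = np.random.default_rng(0)
smoothed = couple_hairs_through_volume(rng.uniform(0, 20, (1000, 3)),
                                       rng.normal(0, 1, (1000, 3)))
print(smoothed.shape)  # (1000, 3)
```

The same coarse volume, as Sayre notes, can then be reused to estimate occlusion-style information for lighting and shadowing.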
When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it.The IncrediblesA progression illustrating what dinner is like for the Parr family in The Incredibles. (Images courtesy of Disney/Pixar)The superpowers in The Incredibles are an extension of the character traits. (Image courtesy of Disney/Pixar)Concept art of the Parr home dining room. (Image courtesy of Disney/Pixar)Concept art by Don Shank exploring a major action sequence that occurs during the third act of The Incredibles. (Image courtesy of Disney/Pixar)Dealing with the long, straight hair, which was essential to the character of Violet, was a major technological hurdle to overcome for The Incredibles. (Image courtesy of Disney/Pixar)The Incredibles was the first time for Pixar that the principal cast consisted entirely of stylized human characters. (Image courtesy of Disney/Pixar)An extremely hard shot was the tight closeup of Doc Ock falling. Trust me, that was torn from the artists hands by the time it was put into the film! Dykstra laughs. The idea was to have a moment where we actually featured a CGI character with emotional content, and the challenge was to do it in a way that you would be convinced that it was real, especially when youre doing something with a real person. I suppose Alfred Molina could have done it, but I dont imagine he could have been underwater for so long! CG skin is always tricky. Things like pores and inconsistencies in surface reflectivity often contribute to the complex and somewhat visually noisy thing that is human flesh. In theory, Spider-Man is an ideal CG character because the material of the suit has a smooth matte finish and no hair or fur has to be simulated. When there is an absence of natural phenomenon, you end up questioning the verisimilitude of what youre looking at. It was important to improve upon the specular nature of the suit, and the way it wrinkled had variations in the texture of the surface of the body while it was in motion. Spider-Man 2 occurred during a transitional period from analog to digital solutions. Dykstra states, One of the things that we had to work on in that era was including world noise. We had to take the perfection of the computer-generated model and haul it back into the realm of the real world. Stuff like film grain and how it was reacting. Was it out or in focus? We had to study that to figure out how to apply it to the shots because its the filter through which you see the world.
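Dykstra's imaginary cameraman amounts to not locking the CG camera to the character: let it chase the target with lag, overshoot and a little hand jitter, then layer the world noise he mentions (grain, focus) onto the rendered frames in comp. A toy sketch of the chasing part follows, with purely illustrative constants rather than anything from the Spider-Man 2 pipeline.

```python
# A sketch of a "fallible operator" camera: a damped spring chases the
# target and a small jitter stands in for hand-held imperfection.
# All constants are illustrative, not production values.
import math
import random

def follow_camera(target_positions, stiffness=8.0, damping=5.0,
                  jitter=0.02, fps=24.0):
    """Return per-frame camera aim points that trail the target."""
    dt = 1.0 / fps
    aim = list(target_positions[0])
    vel = [0.0, 0.0, 0.0]
    frames = []
    for target in target_positions:
        for axis in range(3):
            # Damped spring pulls the aim point toward the target.
            accel = stiffness * (target[axis] - aim[axis]) - damping * vel[axis]
            vel[axis] += accel * dt
            aim[axis] += vel[axis] * dt + random.uniform(-jitter, jitter)
        frames.append(tuple(aim))
    return frames

# Example: the camera trails a target swinging along x while moving forward.
path = [(math.sin(f * 0.2) * 10.0, 0.0, f * 0.5) for f in range(48)]
print(follow_camera(path)[:3])
```

The lag and overshoot are what read as a human operator; the grain and focus treatment Dykstra describes would be applied afterward to the rendered frames.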
    0 Comments ·0 Shares ·108 Views
  • ACHIEVING MAXIMUM ALTITUDE WITH THE VISUAL EFFECTS FOR EDGE OF SPACE
    www.vfxvoice.com
By TREVOR HOGGImages courtesy of Jean de Meuron and VFX Los Angeles.While the likes of Chuck Yeager, Neil Armstrong and John Glenn looked to soar humanity to new heights, Jean de Meuron took his fascination with cinema to commemorate their aerial accomplishments with the short film Edge of Space, which was a winner at the OSCAR and BAFTA Qualifying LA Shorts International Film Festival 2024 (Jury Special Mention). Running 18 minutes, the story revolves around United States Air Force test pilots being sent on a suborbital mission with the hypersonic rocket-powered X-15, paving the way for America to land on the Moon before the Soviet Union.If you look at a project like Edge of Space and think, Oh, my god, this shot with the spaceship is going to be the coolest and the hardest one. Yeah, but the last shots to be approved were the ones with the visor because there's no place to hide. You've got to make sure that skin looks good. You've got to not distract from the performance. Everyone can sense what a reflection looks like on a curved piece of glass, so actually those were the hardest ones to get right.Charlie Joslain, Senior Visual Effects SupervisorAn actual X-15 was used for exterior shots and scanned to create a digital version for when the hypersonic aircraft takes flight.The X-15 was able to penetrate the Kármán line, which, officially per NASA, is the edge of space where you go 330,000 feet up in the air, states producer/director/writer de Meuron. Many X-15 test pilots received astronaut wings.A blueprint for the production was The Creator by Gareth Edwards. The Creator was cost-effective, but it also gave a sense of real scale and scope. Denis Villeneuve or Gareth Edwards said, If you make most of the frame real, the added visual effects blend in naturally. Charlie Joslain [Senior Visual Effects Supervisor], Izzy Traub [Visual Effects Supervisor] and I storyboarded everything in pre-production so on set we knew exactly what we wanted filmed. We also scanned multiple assets and locations during different times of the day that were then built in 3D.Something that would easily go unnoticed is the addition of 3D tombstones created by VFX Los Angeles. As often with films of this caliber, in terms of 100 plus visual effects, a lot of it is invisible effects, Joslain notes. You expect all of the contrails and airplanes, but there is a ton of clean-up: little lights in the background and traffic in the desert, to create an isolated place since the airbase was supposed to be secret. Jean wanted some tombstones and they looked good, but were placed in a way that was too narrow and didn't quite give the gravitas and scale of the sacrifice of American pilots for that cause. We did some research to make sure that we found the right kind of tombstones, recreated multiple CGI ones, and ended up not quite Arlington National Cemetery but something similar in the desert.The framed picture of American President John F. Kennedy was incorporated into the set decoration to authentically recreate 1961.Crafting the contrails of the X-15 was made even more complicated by the aircraft essentially being a hypersonic rocket. That little gap between the back of the airplane and the contrail, Joslain explains. Imagine that multiplied by x amount. Would you be able to see the X-15 up in the sky, when it's 45 feet long and 70,000 feet up in the air? You probably wouldn't be able to see it, but if you don't show it, what exactly is our character looking at up in the sky? 
We had to find that balance of what it should have looked like and how do we represent it so it's engaging for the audience? Through the use of plate versus recreating similar plates, then doing a lot of calculation, work and optical engineering as to what zoom lens would create what effect; that's how we created the best of both worlds. It looks historically and scientifically accurate, but it's telling a story, is still engaging, and the plane feels like it's there.That little gap between the back of the airplane and the contrail: Imagine that multiplied by x amount. Would you be able to see the X-15 up in the sky, when it's 45 feet long and 70,000 feet up in the air? You probably wouldn't be able to see it, but if you don't show it, what exactly is our character looking at up in the sky? We had to find that balance of what it should have looked like and how do we represent it so it's engaging for the audience?Charlie Joslain, Senior Visual Effects SupervisorChanneling the cinematography of Days of Heaven, Jean de Meuron opted to shoot during the magic hour.Clouds became aerial landmarks indicating size, scale and speed. At one point, Glen Ford [Chad Michael Collins] penetrates [the Kármán line], and there is this massive, beautiful shot set against the sun, de Meuron recalls. It's backlit and silhouetted, but then we wanted to give a sense of scale. This is still earthbound, but the minute he penetrates, we go to space where we don't have clouds. The clouds helped us give a sense of scale and depth as well as layers and nuances with light, shadows, and a little underexposed in the foreground. We played heavily into those cloud formations. Even more important than scale is the sense of speed. We've seen a million films and sci-fi movies, and again the X-15 is supposed to fly Mach 5 or 6, Joslain notes. When clouds of that scale start drifting past so fast, that helps to portray the sense of speed of the aircraft.Only digital versions of the Huey helicopter and B-52 are shown flying.Visor reflections are essential to have but hellish to pull off in a believable manner. Anything to do with the visor or helmet is a mixture, Joslain reveals. Roughly half of the scenes have the visor on where we had to erase reflections from outside of the cockpits and therefore recreate the performance, repaint skin or add the twinkle in the eye. The opposite was true of the few shots that we got with the visor off where we had to recreate a CG visor and then repaint reflections from the cockpit and Moon on that visor. That's more or less how this whole thing was tied together. No shot was untouched. If you look at a project like Edge of Space and think, Oh, my god, this shot with the spaceship is going to be the coolest and the hardest one. Yeah, but the last shots to be approved were the ones with the visor because there's no place to hide. You've got to make sure that skin looks good. You've got to not distract from the performance. Everyone can sense what a reflection looks like on a curved piece of glass, so actually those were the hardest ones to get right.Outer space was based on ISS footage. I would text pictures and references from astronauts in space either from the ISS, Mercury, Apollo or Gemini when they filmed and took pictures in outer space, de Meuron states. It's interesting because gradually the tones and shades of blue [change]. Charlie and I would look at that. You can see from the ISS how the blue gradually transitions into a dark black and then becomes pitch black. 
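Joslain's question about whether the X-15 would even register on screen is an angular-size problem. Taking the figures quoted here (a 45-foot aircraft seen from roughly 70,000 feet), and assuming for illustration a full-frame sensor and a 4K-wide frame, a quick check shows why only an extreme long lens makes the plane readable:

```python
# Back-of-envelope check of Joslain's question (dimensions from the article;
# the lens and sensor values below are illustrative assumptions).
import math

length_ft = 45.0          # quoted X-15 length
distance_ft = 70_000.0    # quoted altitude/viewing distance

theta = math.atan2(length_ft, distance_ft)        # angular size in radians
arcmin = math.degrees(theta) * 60.0
print(f"angular size: {arcmin:.1f} arcminutes")   # ~2.2', barely above acuity

# How many pixels would that cover on a hypothetical 4K frame?
for focal_mm in (50, 500, 5000):                  # normal, long, extreme lens
    sensor_width_mm = 36.0                        # full-frame assumption
    hfov = 2 * math.atan2(sensor_width_mm / 2, focal_mm)
    pixels = theta / hfov * 4096
    print(f"{focal_mm:>5} mm lens: ~{pixels:,.0f} px across")
```

At a normal focal length the aircraft covers only a handful of pixels, which is consistent with the super-long-lens treatment Joslain describes later for the fully CG view of the X-15.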
Discoveries could be shared at any moment. Joslain recalls a funny anecdote that tells you a lot about Jeans dedication for the last two years: You would get a bunch of texts at 4 p.m., and I would go, I know this is Jean and hes found something! But most of the time this would actually take place at 3 a.m., and youre like, Jean, not now!The logos on the X-15 had to be digitally altered to make them period-accurate.We want to respect [director] Jeans [de Meuron] vision because thats what you want to achieve, but at the same time by damaging the perfection of the whole thing is how you achieve true perfection. My favorite shot is fully CG-made. Its the X-15 taking off, and its that super-long-lens-like 5000mm view of the X-15. There is enough shake in the camera and zoom play with the lens going on to add that sense there is an actual human being filming.Charlie Joslain, Senior Visual Effects SupervisorFor exterior shots, a real X-15 was photographed and scanned. Any sort of motion, such as the gears turning, were CG; even the front wheels because they werent quite right, Joslain remarks. As far as the texture, the real X-15 was used as a reference, but a lot of the logos had to be painted out, recreated and redesigned to match the historical plane, as opposed to the current NASA museum piece that it is. The cockpit was sealed off, so it was recreated in the studio by Production Designer Myra Barrera, with the visual effects team producing a digital version as well. We had LED panels and lights, and when you see the astronaut, its frontal, de Meuron states. I didnt want the actors profile because First Man had already done that. I wanted to do my own interpretation. I wanted it to be tight and claustrophobic in a real closeup or extreme closeup so we see every nuance of his performance, and maybe how he twitches or is sweating.One of the toughest elements to create and integrate were the contrails being generated by what is essentially a rocket.Capturing the aerial establisher of the landing strip was a drone. That was a real shot, explains Traub. As the camera keeps going, you see someone working on the plane; that was the same person doing the motion capture performances of everybody! Normally for a project, you can go in and purchase model packs and use them. We had to model everything from scratch because there wasnt anything that we could find for the most part that fit the historical references. One thing that is interesting is we actually replaced the X-15 in that particular shot because it gave us more control. One cannot have an airport landing strip without a control tower. We obviously didnt have a lot of photos of the Edwards Air Force Base in the 1960s, Joslain states. An important part of an airbase is going to be the control tower. We had an overview photograph of the base at the time. Assuming the picture and information were correct, we knew which month and year this was taken, and we did a bit of reverse engineering to figure out, according to the length of the shadow, normally how high the control tower was going to be. When we put the control tower in the shot, it was too small, so we had to make it bigger!A cool color palette was adopted for the outer space shots to make the cosmic environment feel colder.The landing shot of the X-15 was extremely difficult. The drone was more or less a continuous speed, but obviously an airplane landing and slowing down is not a continuous speed, Joslain remarks. But how do you create that? 
We had to find the right balance of what would be an accurate speed for the X-15 to slow down and grind to a halt. But matching that stopping moment with the twist of the pan of the camera, then having the jeep and vehicles enter, that was a complicated one to figure out. Contributing to the believability were lens aberrations. We were messing a little bit with the focus here and there, Joslain states. Adding a little grain there. Adding a little bit of a deep camera shake and vibration. We want to respect Jean's vision because that's what you want to achieve, but at the same time, damaging the perfection of the whole thing is how you achieve true perfection. My favorite shot is fully CG-made. It's the X-15 taking off, and it's that super-long-lens-like 5000mm view of the X-15. There is enough shake in the camera and zoom play with the lens going on to add that sense there is an actual human being filming.A mixture of shots were done with visor up and down, with the reflections added and removed as needed.The X-15 does not actually take off but is attached to and released from a B-52.Drone photography was essential for the aerial shots of Edwards Air Force Base, with the buildings, vehicles and individuals digitally recreated.Cloud formations assisted with conveying the proper size, scale and speed of the aircraft.Shots such as the B-52 releasing the X-15 were treated as if a camera operator was capturing the moment with a long lens.Particle simulations had to be produced on the ground and in the air. The stuff on the ground was the hardest, for sure, Traub notes. We have this sequence where the X-15 lands, there is a touchdown where the back of the plane basically slaps the ground, and there is an explosion of dust that goes up in the air. Then we see underneath the plane. Basically, the tracks are ripping up the ground as it's coming to a halt. The X-15 pushes through a whole bunch of dust. In that same shot, you have this helicopter moving down and landing. The particle simulations become a lot more complicated because you're locked to lighting that is on the ground, so your lighting has got to be atmospherically correct. The shadows have to cast with the particle simulations that we're doing in Houdini versus in the air. A lot of the particle simulations were atmospheric.CG tombstones were added to create a setting that had the gravitas of Arlington National Cemetery.One of the unique images features the death of a colleague reflected in the sunglasses worn by Glen Ford. The reflection of the explosion in the sunglasses was one of those cases where we did an absurd amount of reverse-engineering, Joslain explains, about the scale/size of the X-15 contrail, the amount of curvature the piece of glass would have applied to it, and how it should have all looked to be 100% accurate versus what it needed to look like to be emotionally impactful, as well as aesthetically pleasing. Unreal Engine became a major tool for Edge of Space. One of the big things that we were dabbling with a little bit was Unreal Engine, but Unreal Engine became a key part of the pipeline when it came to all of the CG shots, Traub states. The reason for that was simply because of real-time rendering, the ability to tweak the lighting, quickly change the camera and output multiple versions. Especially when the deadline was coming up, it enabled us to move at a speed that was a lot better. One thing that we had never done before was integrating Houdini simulations with Unreal Engine. 
Both of those paired up nicely, and by the time we had all of our renders, you could composite everything together in After Effects or Nuke. We got fairly adept with the Unreal Engine pipeline specifically for cinematic filmmaking, and it was a great experience. Well continue to use Unreal Engine for the rest of our projects most likely. Its an amazing tool.
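The control-tower reverse engineering Joslain describes earlier, working back from a dated overhead photo of Edwards Air Force Base and the length of a shadow, is ordinary trigonometry once a sun elevation is derived from the photo's month, time and latitude. A minimal sketch of that calculation follows, with made-up numbers; the team's actual measurements are not given in the article.

```python
# Illustrative reconstruction (not the production script) of estimating a
# structure's height from its shadow. The sun elevation is a hypothetical
# value you would derive from the photo's date, time and location.
import math

def height_from_shadow(shadow_length_ft: float, sun_elevation_deg: float) -> float:
    """A vertical structure casts a shadow of length height / tan(elevation),
    so height = shadow length * tan(elevation)."""
    return shadow_length_ft * math.tan(math.radians(sun_elevation_deg))

# Example: a 60 ft shadow with the sun 40 degrees above the horizon
# (both numbers invented for illustration).
print(round(height_from_shadow(60.0, 40.0), 1))  # ~50.3 ft
```

The same logic run in reverse explains the anecdote: if the tower they placed looked too small against the measured shadow, the model's height had to come up.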
  • WHATS OLD IS NEW AGAIN IN THE MADCAP WORLD OF BEETLEJUICE BEETLEJUICE
    www.vfxvoice.com
By TREVOR HOGGImages courtesy of Warner Bros. Entertainment Inc.Starting off as a motion-control cameraman on Batman, Angus Bickerton has gone on to work with Tim Burton two more times and oversee the digital augmentation for Dark Shadows and Beetlejuice Beetlejuice, the long-awaited sequel to the cult classic where a mischievous ghost causes havoc for those alive and dead. It might be a combination of knowing me a little bit, reflects Angus Bickerton, Production VFX Supervisor on Beetlejuice Beetlejuice. Tim was wanting to put a project together and was keen to do it fairly quickly on a moderate budget. I like doing things practically wherever possible because I come from that background initially. Back when we were doing Batman, we didn't do anything digital or CGI.Tim Burton was insistent that the Shrinkers have animatronic shrunken heads and performers underneath them.Technological advances allowed for more visual sophistication. If you take the sandworms in the Titan desert, we wanted the lovely warm feel of the original stop-motion, Bickerton explains. Mackinnon & Saunders did actually stop-motion animate all of the sandworm shots, but One of Us composited the landscape. The landscape was made up of a digital matte painting and some CG noodle rocks, which are curly bedrocks. But then the simulations of the sandworm diving into the dunes on Titan: that's where it got interesting. We played backwards and forwards on that one with Tim because in the original movie they probably put some sand on a tray and hit it from underneath, and they got splashes of sand that were used optically as elements for wherever the sandworm dived into the dunes. For us, we did it as a CG simulation. We showed Tim a variety, from trying to mimic a miniature element to trying to get some scale to it. We had to find an in-between level that was about right. We wanted to improve the look but didn't want to get too sophisticated.[I]f you look at the Shrinkers, which are animatronic, they have a stagey design. The Beetlejuice universe allows you a little bit of freedom of not being held to being completely photoreal. It did affect our thinking when we were approaching the sequences, how could we do it practically and augmented rather than just resort to CG? I wouldn't call that a huge challenge but a great joy.Angus Bickerton, Production VFX SupervisorMichael Keaton and Tim Burton reunite for the madcap world of Beetlejuice Beetlejuice, which aimed to capture the spirit of the original movie.One of Us worked with Framestore, BUF, Red VFX and Goldcrest VFX to create 1,200 visual effects shots. Framestore and One of Us were our two main vendors, with the Influencers sequence being the most complex as people literally get sucked into their smartphones. That was a late idea, Bickerton reveals. There is a moment in the original film when a couple have distorted their faces, which was all done with replacement animation and individually sculpted, remodeled faces. We wanted to evoke the feeling of the original, but we went for CG. I tried to stop-motion animate myself and warp myself to try being sucked into a phone. But Tim wanted more stress, like a wind tunnel effect on the faces, and have them look really frightened. Basically, we shot plates with and without them. We shot every character individually. We had to then divide the characters up into different levels, so if they were a hero character closer to camera, we scanned that particular performer in detail. 
We got them to go through the actions of pretending to be scared and sucked into the phone. We had a level of detail two for mid-ground characters, then a level of detail three for low-resolution background. Right at the end, Tim said, It would be nice to see the hair twitching. When you see the hair twitching, those were separately shot elements captured against bluescreen in our office.Tim [Burton] said, No. I'm going to do [the Shrinkers] all practically. We never changed the performance at all. The genius is that it's a combination of a physical performer and a puppeteer off-camera with a radio control unit getting those minimal movements. The Shrinkers are Keaton-esque, and by Keaton I mean Buster Keaton, as they do everything with a blank look in their eyes.Angus Bickerton, Production VFX SupervisorTim Burton and Michael Keaton discuss the finale which takes place in a church.Animatronic shrunken heads were placed on top of performers for the Shrinkers. They had a yellow suit on, white shirt, and a thin area on the top of the chest that lines up with where their eyes are, Bickerton states. They did have a limited vision, but it's a credit to Tim. He said, No. I'm going to do it all practically. We never changed the performance at all. The genius is that it's a combination of a physical performer and a puppeteer off-camera with a radio control unit getting those minimal movements. The Shrinkers are Keaton-esque, and by Keaton I mean Buster Keaton, as they do everything with a blank look in their eyes. A new character named Delores staples herself back together. That's an interesting mix. Tim was keen to get the majority of it in camera, so we did a lot of video-matics and blocking. Credit should go to Neal Scanlan and his team of puppeteers. When we did that sequence where Delores puts herself together, we had Monica Bellucci there and three other performance artists providing legs and arms, the detached limbs. We did almost a black velvet theater where we blocked out the motion. Of course, they can't join a limb on, so we would get the elements as close as we could simply, then there is a fair degree of digital augmentation. If you wanted to see a stump end, we obviously had to tack on the end of a limb, arm or leg.Lydia Deetz (Winona Ryder) has gone on to become a host of a horror show called Ghost House.Recreated for the sequel from the original plans was the miniature representation of Winter River.A theatrical aesthetic rather than photorealism was the goal for Beetlejuice Beetlejuice.A running gag for Wolf Jackson (Willem Dafoe) is that he is always given a cup of hot coffee, which was a combination of digital and practical steam captured in Angus Bickerton's kitchen.A lot of work went into the prosthetic makeup that reflected how the deceased died.Monica Bellucci portrays the ex-wife of Beetlejuice, Delores, who literally staples herself back together again.I tried to stop-motion animate myself and warp myself to try being sucked into a phone. But [director] Tim [Burton] wanted more stress, like a wind tunnel effect on the faces, and have them look really frightened. We got them to go through the actions of pretending to be scared and sucked into the phone. Right at the end, Tim said, It would be nice to see the hair twitching. 
When you see the hair twitching, those were separately shot elements captured against bluescreen in our office.Angus Bickerton, Production VFX SupervisorRecreated using the original plans was the detailed model of the town of Winter River from which Beetlejuice emerges once again. We pulled the model apart and produced smoke and had under lights, but then we wanted to get a collapsing edge which we would have never gotten at a model of 1/58th scale, so thats CG augmented edges, remarks Bickerton. When it came to Michael Keaton actually emerging from the model, we had backed ourselves into a hole because we had built this model town into this replica attic and we were an 1/8 of a foot off on the deck. We raised the set by six feet. That allowed Michael to pull the model apart and cheat the camera angles so we could create a bigger gap and have Michael Keaton sitting on a camera dolly. We cranked him up with a camera dolly. There were two parts. Part one, he is sitting on the camera dolly so you get his head emerging. Part two, we put a standing platform on him. When you are behind him, the model is actually beyond him, and were shooting across his shoulder to make it look like hes in the miniature. The world-building revolved around plate photography. Because we aimed to get a lot of it in-camera, we were matching to what we shot. That train station to the great beyond was a big set, and we had to do minor set extensions and extensions to the train. We had an immigration hall, again a great set build by the art department, and we had good concept work, so we knew how to extend it in the same style of the cinematography, Bickerton explains.Outside of one shot where a Shrinker named Bob was enhanced with digital sweat, the rest of the performance was captured practically.The cinematography and lighting were as off-kilter as the story itself.Bickerton was excited about joining the project. When I had my earliest Zoom call with Tim, straightaway he said, I would like to do some stuff stop-motion. Neal Scanlan joined early on for a lot of the prosthetics and animatronics. With all due respect to Neal and his team, who were brilliant, if you look at the Shrinkers, which are animatronic, they have a stagey design. The Beetlejuice universe allows you a little bit of freedom of not being held to being completely photoreal. Overriding everything was the desire to retain the spirit of the original movie wherever possible. It did affect our thinking when we were approaching the sequences, how could we do it practically and augmented rather than just resort to CG? I wouldnt call that a huge challenge but a great joy.
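The detail tiers Bickerton describes for the Influencers crowd (detailed scans for hero characters near camera, a level of detail two for mid-ground, a level of detail three for low-resolution background) amount to bucketing characters by screen prominence. A toy sketch of that bucketing is below; the distance thresholds are invented for illustration and are not the production values.

```python
# Toy illustration (not production code) of assigning crowd characters to
# the detail tiers described in the article. Thresholds are hypothetical.
def assign_detail_level(distance_to_camera_m: float) -> int:
    """Return 1 (hero scan), 2 (mid-ground), or 3 (low-resolution background)."""
    if distance_to_camera_m < 5.0:
        return 1
    if distance_to_camera_m < 20.0:
        return 2
    return 3

crowd = {"hero_a": 2.5, "mid_character_12": 11.0, "background_87": 45.0}
print({name: assign_detail_level(d) for name, d in crowd.items()})
# {'hero_a': 1, 'mid_character_12': 2, 'background_87': 3}
```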
  • SFX/VFX VETERANS OF ALIENS REUNITE TO STRIKE FRESH TERROR INTO ALIEN: ROMULUS
    www.vfxvoice.com
    By JENNIFER CHAMPAGNEImages courtesy of 20th Century Studios, except where noted.The Alien franchise has always been about pushing boundaries. From Ridley Scotts original 1979 Alien to James Camerons Aliens in 1986, the series captivated audiences with its terrifying mix of storytelling and technical mastery. The practical magic of animatronics, puppetry and miniatures gave the extraterrestrial nightmare its unforgettable texture, creating a visceral experience that lingers in the minds of fans. Now, decades later, Alien: Romulus takes the franchise into a bold new era, merging those tactile roots with state-of-the-art digital techniques to craft a visual spectacle that feels both nostalgic and forward-thinking.Uruguayan filmmaker Fede lvarez, a self-proclaimed Alien superfan, helmed this latest chapter with a clear mission: to honor the franchises iconic feel through practical effects while leveraging the latest digital applications. What we wanted was the same visceral, real experience from the original films, lvarez explains. But we also wanted to embrace where technology is today. Its about respecting the legacy while pushing it forward. His commitment to practical effects was unwavering, bringing back veteran craftsmen from Aliens to oversee creature design and animatronics while seamlessly integrating cutting-edge CG. This hybrid approach sets a new benchmark for the franchise, balancing authenticity with innovation to deliver a fresh yet nostalgic cinematic experience.Legacy Effects Supervisor/Animatronic Puppeteer Shane Mahan at work. Alien: Romulus was filmed entirely in Budapest, Hungary, primarily at Origo Studios. (Image courtesy of Legacy Effects and 20th Century Studios)The team behind Romulus exemplifies the fusion of tradition and innovation. Alec Gillis, who supervised the chestburster effects with his team at Amalgamated Dynamics, Inc. (ADI), and Legacy Effects Supervisors Shane Mahan and Lindsay MacGowan, who oversaw the adult Xenomorphs, brought decades of expertise and passion to the project. Both teams, veterans of previous Alien films, approached the project as a homecoming a chance to revisit and evolve the iconic creatures they helped bring to life. The tools we have now let us do things we could only dream of back in the 80s, Gillis notes. But the goal was always to make it feel real to make it visceral.Matching [Director of Photography] Galo [Olivaress] lighting style was one of the most rewarding and demanding parts of the project. We had to ensure the digital creations didnt just blend into the live-action shots but felt like they were lit by the same haunting glow.Eric Barba, Production VFX SupervisorDespite its $80 million budget, Romulus boasts the visual scale and ambition of a much larger production. Filmed entirely in Budapest, Hungary, primarily at Origo Studios, the team maximized resources with state-of-the-art facilities and expert local crew. Their craftsmanship, enhanced by digital techniques, ensured the film retained its tangible, gritty texture while pushing the boundaries of whats possible in modern effects. This was further elevated by Galo Olivaress cinematography, which embraced rich contrasts and shadowy depth a visual palette that amplified the horror and suspense elements while integrating the palpable realism of the practical effects with the eerie atmosphere of the alien environments.Director Fede lvarez wanted the same visceral experience from the original Alien films, but he also wanted to embrace todays technology. 
(Image courtesy of Legacy Effects and 20th Century Studios. Photo: Murray Close).Production VFX Supervisor Eric Barba conveyed how Olivares's cinematography provided a nice and contrasty visual palette that amplified the horror and suspense elements of the film. The intricate use of backlighting and rich contrasts not only added depth to the alien environments but also created an oppressive sense of menace that underscored the film's tone. However, Olivares's technique of working on the edge of exposure where shadows were allowed to fall presented unique challenges for the visual effects team. Matching Galo's lighting style was one of the most rewarding and demanding parts of the project, Barba explains. We had to ensure the digital creations didn't just blend into the live-action shots but felt like they were lit by the same haunting glow. This attention to detail contributed to the film's cohesive visual narrative, making the cinematography a central element of its atmospheric appeal.From left: Legacy Effects Supervisor/Animatronic Puppeteer Shane Mahan and director Fede lvarez. A lot of work went into maintaining a balance between practical and digital effects. (Image courtesy of Legacy Effects and 20th Century Studios)From a director's standpoint, lvarez doesn't favor storyboards carved in stone or excessive previs. I don't want to be locked into something too rigid before we start shooting, lvarez states. I like to leave room for creativity and discovery as we go. However, lvarez also recognizes that a film like Alien: Romulus, with its intricate blend of practical and digital effects, requires a well-organized previs process. The team still heavily relied on tools like 3ds Max for specific needs, particularly for lvarez to set up the camera and lighting angles. We did a lot of previs to ensure everything worked in the digital realm before we started shooting, Barba remarks. But the goal was always to keep that initial vision open to adjustments and surprises once we were on set.For the VFX team, led by Barba, 3ds Max was invaluable for tasks like lvarez's camera placement. It was a crucial tool in helping them plan out specific effects sequences. The software allowed us to plan and explore shot compositions and digital environments before we ever set foot on set, Barba continues. It was like a virtual blueprint that guided the execution of practical effects and digital enhancements, ensuring everything would mesh seamlessly. While lvarez's preference for flexibility over rigid planning encouraged creative spontaneity on set, the team's meticulous previs work provided a strong technical framework that guaranteed that even the most complex visual effects sequences were grounded and cohesive.Legacy Effects' Shane Mahan and Lindsay MacGowan, both of whom contributed to Aliens, brought their expertise to the subadult and adult Xenomorphs. Alec Gillis returned to Romulus to supervise the chestburster effects with his team at Amalgamated Dynamics, Inc. (ADI). (Images courtesy of Legacy Effects and 20th Century Studios)The chestburster scene, one of the franchise's most iconic moments, was reimagined in Romulus to showcase the fusion of old and new techniques. 
Gillis and his team began with digital sculpting in ZBrush, crafting an intricately detailed model that could be adjusted and refined before being 3D-printed at various scales. This allowed for a level of precision and efficiency that would have been unimaginable in earlier films. Once printed, the models were brought to life with animatronics, featuring translucent silicone skin, injected black fluids for shifting textures, and servo mechanisms for lifelike articulation. lvarez wanted the sequence to feel raw and unsettling, echoing the primal, slow-motion terror of a natural birth. Gillis's team delivered with a puppet that could drool, snarl and twist with terrifying authenticity, subtly enhanced by digital touches to perfect its movements.Modern 3D scanning techniques were used to refine the designs of the Xenomorph while preserving the franchise's signature bio-mechanical aesthetic.The shift from traditional clay sculpting to digital sculpting and 3D printing represents a pivotal evolution in the creation process for Romulus. We decided that we were going to do this not as a traditional clay sculpture but as a 3D digital sculpt, Gillis explains. Once we had that form, it allowed our mechanical department to start designing in 3D, ensuring everything aligned perfectly. The transition from clay to digital not only helped to streamline production but also established consistency between the large-scale and one-to-one models, smoothly integrating practical and digital. By moving away from clay, the team gained the ability to iterate quickly while maintaining the physical authenticity of the original Alien model.We knew that animatronics could give us the tactile realism we needed, but combining it with subtle digital enhancements gave the creature a fluidity that made it truly come alive.Lindsay MacGowan, Special Effects Supervisor, Legacy EffectsMiniatures played a central role in grounding the film's massive set pieces. Producer Camille Balsamo-Gillis collaborated with New Deal Studios Co-founder Ian Hunter, a miniature effects veteran, to bring these elements to life. Using 3D scans and designs from Production Designer Naaman Marshall, the team built models that aligned with the film's visual aesthetic. This level of precision allowed us to create miniatures that not only looked incredible but also matched perfectly with the digital environments, Balsamo-Gillis says. One of the most striking applications of their work was the Corbelin, a ship that evolved from initial 2D designs into a richly detailed 3D model that took on character in the process. The Corbelin and the Echo Probe were crafted to honor the original Alien production design while adding fresh elements unique to Romulus. A standout sequence involved the Corbelin's dramatic crash into the Romulus, where the miniature's realism brought weight and texture to the scene. Once built, the miniatures were scanned, digitized and integrated into the film with the help of Wt FX and ILM. Other effects vendors on the film included Image Engine, Fin Design + Effects, Wt Workshop, Wylie Co., Metaphysic, Pro Machina, Atomic Arts and Tippett Studios.Wt FX's Daniel Macarin and his team used Maya, Houdini and proprietary applications to create fluid, organic movements for the CG creatures, making sure they felt as alive as their practical counterparts.The blend of practical and digital elements was central to the film's identity. Barba, a veteran of TRON: Legacy and Terminator: Dark Fate, notes the importance of balance. 
It's not about replacing practical effects with digital ones, he explains. It's about enhancing what's already there and making sure everything feels cohesive. His team used Houdini and Maya to create environments and enhance creature movements, working closely with Wētā FX and ILM. Wētā FX's expertise in creature animation added a sense of organic fluidity to the aliens' movements, while ILM contributed to the grander large-scale effects, including breathtaking space sequences and explosive action set pieces. Facehugger facelift. The shift from traditional clay sculpting to digital sculpting and 3D printing for the Romulus Xenomorphs represents a pivotal evolution in the creature creation process for the franchise. (Photo: Murray Close) The integration of practical and digital effects extended to the work of Daniel Macarin, Visual Effects Supervisor at Wētā FX, whose contributions were instrumental in bringing the Xenomorphs and offspring to life with the required dramatic level of realism. A tremendous amount of work went into keeping the balance of practical versus digital effects. What made Fede really happy was that we didn't take that cheaper, faster way out. We embraced what he had shot, Macarin states. Using Maya, Houdini and proprietary applications from Wētā, Macarin and his team created fluid, organic movements for the CG creatures, making sure that every twitch and snarl felt as terrifyingly alive as their practical counterparts. Macarin's team worked closely with the practical effects department, utilizing 3D scans of the animatronics to match textures and scale, and fully integrate the CG creatures into the live-action. A notable challenge arose when Álvarez requested lighting effects in a highly practical set. Fede had an idea to make the cargo hold feel more alive by fluctuating all the lights, Macarin explains. This meant turning practical effects into digital recreations, which was enormous, but it elevated the scene. This adaptability, coupled with close collaboration with other departments, enabled Wētā's team to create immersive visuals that pushed the boundaries on Romulus. The Corbelin and Echo Probe ships were crafted to honor the original Alien production design while adding fresh elements unique to Romulus. The collaboration extended across disciplines, with every department contributing to the film's vision. Stunt Coordinator Mark Franklin Hanson played a pivotal role in creating the film's dynamic action sequences, including zero-gravity stunts that required precise coordination with the VFX team. Hanson's meticulous planning ensured that the actors' movements felt natural and believable, even in digitally augmented environments. Our goal was always to make the action feel grounded, Hanson says. Even when we were dealing with aliens and zero gravity, it had to feel real. The beauty of using 3D scanning is that it allows us to capture the detail and scale of our practical builds and translate them perfectly into the digital realm. Nothing gets lost in that transition. Shane Mahan, Special Effects Supervisor/Animatronic Puppeteer, Legacy Effects Known for his work in the horror genre (Don't Breathe, Evil Dead), Álvarez approached practical effects with the same care and attention he gave his actors. During the chestburster scenes, he directed the animatronics as if they were performers, guiding their breathing, snarling and other movements to achieve the desired emotional impact.
His use of 3ds Max for previsualization allowed the team to plan complex sequences while leaving room for on-set spontaneity. He's incredibly collaborative, Gillis says about Álvarez. He knows what he wants but is always open to the magic that can happen in the moment. ILM contributed to the grander large-scale effects, including breathtaking space sequences and explosive action set pieces. The film's narrative and visual design were deeply informed by its rich legacy. Mahan and MacGowan, both of whom contributed to Aliens, brought their expertise to the adult Xenomorphs, applying modern 3D scanning techniques to refine their designs while preserving the franchise's signature bio-mechanical aesthetic. Mahan explains, The beauty of using 3D scanning is that it allows us to capture the detail and scale of our practical builds and translate them perfectly into the digital realm. Nothing gets lost in that transition. The creature's intricate details, every fold, texture and sinew, were preserved, blending physical models that were digitally designed and 3D-printed with advanced digital counterparts. One of the film's most striking moments, a Xenomorph emerging from a cocoon, spotlighted this total attention to detail. MacGowan reflects, We knew that animatronics could give us the tactile realism we needed, but combining it with subtle digital enhancements gave the creature a fluidity that made it truly come alive. Álvarez wanted the CG shots of the void of space in Romulus to be darker, scarier and emptier than in other space films. To achieve the feeling of distant depth, the 3D model was mostly backlit to accentuate the detail, and the planet and rings silhouetted the ship and space station. The film's climactic sequences brought together every aspect of the team's expertise. A hero shot featuring an alien bursting out of its cocoon required months of planning, with previs helping the team map out camera angles, creature movements and digital environments. The practical puppet, created by Mahan's team, featured articulated limbs and cable mechanisms that allowed it to interact with its surroundings. Digital enhancements added fluidity and realism to the creature's movements, creating a terrifying moment. Building on the intensity of the cocoon scene, the birthing sequence involving Robert Bobroczkyi, a towering Romanian basketball player, delivered a visual and emotional gut punch that tapped the team's collective skills. Bobroczkyi's unique stature and elongated proportions created an immediate sense of unease, embodying the alien-human hybrid's eerie otherworldliness. Mahan, MacGowan and the Legacy Effects team crafted a custom suit for the actor that featured integrated animatronics, which allowed for unnervingly precise articulation, from the alien's subtle twitches to the grotesque unfurling of its limbs. Barba described how his team worked closely with the practical effects department to enhance the alien's unsettling emergence. The real magic came from layering digital enhancements over the incredible work Robert and the Legacy Effects team brought to the scene, Barba reveals. The digital touches were minimal but deliberate, amplifying the alien's unnatural movements and adding a glistening, almost wet texture to the suit that made it feel alive. The scene also relied heavily on lighting and camera placement to maximize its visual discomfort. Álvarez wanted lighting that accentuated the alien's grisly features while casting Bobroczkyi's imposing frame in looming shadows.
Combined with tight camera angles, the effect was a claustrophobic, almost voyeuristic experience that immersed audiences in the horror of the transformation. Robert's performance was so striking that we knew we didn't want to overshadow it with too much digital work, Mahan states. The key was to let the practical suit and his physicality shine, while the digital elements served as the finishing touch. The result was a sequence that encapsulated Romulus's hybrid approach to effects and stands as one of the film's most memorable moments, capturing both the raw terror and visual ambition that define the Alien franchise. *** Legacy Effects worked closely with director Fede Álvarez to create the Xenomorph puppets and suits, the Cocoon, the Rook animatronic puppet and the design/makeup effects of the Offspring. To watch a fascinating behind-the-scenes pictorial and video journey through Legacy's development of the creatures for Alien: Romulus, click here: https://www.legacyefx.com/alien-romulus
  • DANCING TO THE ANIMATED BEAT OF PIECE BY PIECE
    www.vfxvoice.com
By TREVOR HOGG. Images courtesy of Focus Features. Producer/director/writer Morgan Neville has become known for his musician profiles, whether highlighting backup singers in 20 Feet from Stardom or sitting down with a legendary rock guitarist in Keith Richards: Under the Influence. What is different with Piece by Piece is that the life and career of Pharrell Williams is depicted not through the traditional documentary means of talking heads and archival footage but instead as a LEGO animated feature. Two versions were made of the documentary, with the more traditional approach used as a rough template for its animated counterpart. For a long time, documentaries were seen as just journalism with pictures. There have been people over the decades who have pushed that, such as Errol Morris, Werner Herzog and Wim Wenders. The audience is ready and hungry for it. When I get asked, Is your film a documentary? I say, It's creative nonfiction. A documentary comes with a rulebook, and Piece by Piece is deeply faithful and truthful, but is it pure journalism? No. However, that's not what I tried to do. It's cinema first. That kind of liberation is good for filmmakers and audiences. Morgan Neville, Producer/Director/Writer Director/writer Morgan Neville describes his LEGO persona as the perfect pasty, disheveled, bespectacled documentary filmmaker that I am! It had some original and archive footage, music videos, other movie clips and some drawings to get the story in place as much as we could before we took it into animation, Neville states. I remember it being complicated. If you're basing something on a photo, do you need to license it? There is stuff like the Oprah footage, which we licensed and then basically animated one-to-one with what the original scene is, if you look at it side by side with his performance on that show. That is maybe 25% to 30% of the film. Licensing footage was worth the cost. Making an animated film on a smaller budget, those sequences were gifts because even though they were going to cost as much as something we had made up, we didn't have the time to make it up, remarks Animation Director Howard Baker. We still storyboarded it so that the animation studios could start breaking the scene down. If it was live-action, their heads would have exploded! Pivotal in visually translating Williams' creative process was his condition called chromesthesia, or sound-to-color synesthesia, where sounds appear as colors. Synesthesia was the one thing that unlocked this fantasy gear and was perfect for animation, Neville believes. That scene worked so well that when we showed it to people they said, I want to see that movie. Given a visual representation are the catchy musical beats that are the foundation of the Williams songs. I had three people who I worked quite closely with throughout the show, and we felt that the film needed a LEGO hook, Baker recalls. We always had big bowls of LEGO and LEGO toys in our story room and were playing around with them. We started making these things saying, This is like this or that sound. We made a whole bunch of them, and our producers in India created them in CG. We sent them over to Pharrell's company, i am Other, and they had ideas and reasons why things didn't work.
Each beat ended up having a specific personality. When Gwen Stefani talked to director/writer Neville, she was already animated and quite over the top, so using her interview as actual dialogue felt natural. Some of the dialogue for Snoop Dogg came from a podcast he did with Pharrell Williams. Interviews rather than scripted dialogue drive the narrative. Whenever I was doing interviews, I would ask, What did the room look like? What did you and they say? Neville explains. In the case of Pharrell's grandmother, she is gone, so he got his aunt to do the voice, but we didn't script anything. Even the banter between Pharrell and Snoop Dogg came from a podcast they had done together, and suddenly you can transform that anywhere, like backstage at the concert. There was that playfulness of being in moments more rather than just narrating something with pictures. We could time travel in a way through the film, which you can't do in documentaries but easily can do in animation. Rather than making everything perfect, which is possible in animation, an effort was made to incorporate mistakes that would appear in a live-action documentary. Some of the interviewees were a natural fit while others required editorial assistance when it came to timing. When N.O.R.E. and Gwen Stefani were talking to Morgan, they were already animated and quite over the top, so using their interview as actual dialogue felt natural, Baker notes. But then Teddy Riley didn't seem quite as natural; his acting felt like it might turn out to be stiff; however, it ends up becoming the character. The director is part of the cast. My mini-me is the perfect pasty, disheveled, bespectacled documentary filmmaker that I am! Morgan laughs. My hair has gotten much whiter than it was at the time. Everyone had opinions about the design of their LEGO persona to varying degrees. Because Pharrell was such a main character, it took a long time for us to get there, Baker states. Missy Elliott was involved in the design of her character, which is one of the most successful because she was in there pointing out things to do to make her feel comfortable. No Doubt was easy about their caricatures; they saw one version, gave some notes, and that was that. [W]e didn't script anything. Even the banter between Pharrell and Snoop Dogg came from a podcast they had done together, and suddenly you can transform that anywhere, like backstage at the concert. There was that playfulness of being in moments more rather than just narrating something with pictures. We could time travel in a way through the film, which you can't do in documentaries but easily can do in animation. Morgan Neville, Producer/Director/Writer A conscious decision was made not to have a production designer developing a unifying look. I wanted all the different designers to bring what they thought it should look like into the picture so all of these places end up having a natural personality that was different, remarks Baker, who was based at Pure Imagination Studios. Set and environmental designs were divided between Tongal and Zebu Animation Studios. Zebu Animation Studios and CDW Studios each did one-third of the animation, and the remainder was completed by animators hired by Pure Imagination Studios, who were already proficient in the LEGO stop-motion animation style. There were definitely studios we knew were better at certain things, Neville observes. CDW Studios was good at water, and the water effects are amazing.
There were definitely animators who were good at base animation, and we gave them the close-up scenes.Daft Punk was very particular about the design of their LEGO persona.Real footage was shot of Pharrell Williams returning to his hometown of Virginia Beach, Virginia, which in turn inspired the animation for the scene.Piece by Piece may be best described as creative nonfiction.The scene that served as a proof of concept was when high school student Pharrell Williams listens to I Wish by Stevie Wonder for the first time on a ghetto blaster.A point of reference for the camera style and lensing was Moonlight.Unlike animation, where everything is created from scratch, documentaries need to adapt to real settings and circumstances. We did a lot of things that were common in documentaries that are uncommon in animation, like montaging through space and time, Neville states. My sense is that the big LEGO movies have some giant incredible anchor sets that they live in a lot, and we were constantly skipping through space and time from location to location. The number of sets we had would dwarf what you would normally find in LEGO movie. In a way, it wasnt about building all of these amazing castles. We just need to capture the essence of different places and times. Teddys studio, Virginia Beach and New York each have a feel.Five years was spent making Piece by Piece, which is not uncommon for an animated feature.Animation allowed for a deeper exploration of the characters, such as Justin Timberlake and Pharrell Williams.The aspect ratio was 2.39:1. The film that I talked about the most in terms of the look was Moonlight because I love the cinematography, and it has this warm, funky anamorphic look, Neville explains. There was a unifying lens look throughout the whole film even though at times we break into these archival sections that actually have a different look. Some of the archival sections we exported from finished 4K animation onto VHS and re-imported it from VHS. I dont know if that has ever been done before! In terms of framing, we did a lot of tights and mediums because that felt cinematic.We felt that the film needed a LEGO hook. We always had big bowls of LEGO and LEGO toys in our story room and were playing around with them. We started making these things saying, This is like this or that sound. We made a whole bunch of them, and our producers in India created them in CG. We sent them over to Pharrells company, i am Other, and they had ideas and reasons why things didnt work. Each beat ended up having a specific personality.Howard Baker, Animation DirectorEarly on, a topic of conversation was the limitations of LEGO animation. A big one was dance, Neville states. LEGO figures dont bend, and there is a lot of movement in the film. We had a lot of discussions about, How do we represent dance as much as we can? Howard has a dance background and has played with this. Cracking that was a major thing for us. A 24-hour-long video of Pharrell Williams hit Happy came in handy. We watched that for many hours looking for a lot of dance references to put all over in the film, Baker reveals. Im a big believer in if you can draw it, you can probably animate it. We would draw it out and show it to the animators, and sooner or later they would give us a version of it that felt right.Getting the visual treatment were the catchy beats composed by Pharrell Williams. 
Here is an example of that in a sequence going from storyboard, layout, animation, ambience, lighting to final. Piece by Piece expands the boundaries of documentary filmmaking. For a long time, documentaries were seen as just journalism with pictures, Neville notes. There have been people over the decades who have pushed that, such as Errol Morris, Werner Herzog and Wim Wenders. The audience is ready and hungry for it. When I get asked, Is your film a documentary? I say, It's creative nonfiction. A documentary comes with a rulebook, and Piece by Piece is deeply faithful and truthful, but is it pure journalism? No. However, that's not what I tried to do. It's cinema first. That kind of liberation is good for filmmakers and audiences. The experience has been career-altering. It's to Pharrell's credit for making this animated because animation is an emotional metaphor which allows us not to be factual but at the same time make it believable, Baker remarks. You can have a singing mermaid and no one questions it. Making a story about a real person's life, then making it as emotionally visual as we get to do in animation has opened my eyes to being able to get deeper into characters and letting them tell the story. There came a point where I realized that what we do with animation is so visual and what they do now in documentary and live-action is so character-driven that I was able to bring those two things together in a way that is unique and mind-bogglingly eye-opening.
  • JELLYFISH BALANCED BOMBED CITIES, SMOKE AND VFX TO CAPTURE THE STORY OF LEE
    www.vfxvoice.com
    By OLIVER WEBBImages courtesy of Jellyfish Pictures and Sky UK Ltd, except where noted.Released in September for a limited theatrical window in the U.S. and U.K. where it was recently nominated for British Independent Film Awards (BIFA) for Best Effects and Cinematography, Lee stars Kate Winslet as World War II photojournalist Lee Miller and explores the story behind Millers rise to fame as a fashion model turned war correspondent for Vogue magazine. Adapted from the 1985 biography The Lives of Lee Miller, written by Millers son Antony Penrose, Lee is a harrowing portrayal of one of Americas most acclaimed photographers.Production VFX Supervisor Glen McGuigan and Jellyfish Pictures VFX Supervisor Ingo Putze led the team from pre-production to post. VFX Supervisor Glen McGuigan approached Jellyfish Pictures back in December 2022 to partner on the movie Lee, Putze says. Glen knew of Jellyfish through Jon Park, who is one of our stellar VFX supervisors. Jon was already busy with another project, so I was pleased that Lee ended up on my desk.Kate Winslet is WWII photographer Lee Miller in Lee. The VFX requirements on Lee encompassed a diverse range of tasks, including set extensions and digital matte paintings to help enhance Director of Photography Pawel Edelmans live-action photography.Since the film was focused on Lees photography, the VFX needed to capture both the tone and style of her photos. Lees war photography is black-and-white, grainy, high contrast, with a handheld camera somewhat impossible to marry with the smooth, perfectly-captured pictures by [Director of Photography] Pawel Edelman, but we found a golden middle to integrate them into the look of the VFX scenes.Ingo Putze, VFX Supervisor, Jellyfish PicturesWhen it came to initial conversations about the look of the film, Jellyfish had a very short window between locked edit, shot turnover and delivery. We decided to do concept art and paint over plates of almost all the key scenes to support the creative vision of the director, Ellen Kuras, Putze explains. It was very helpful to have that common visual language to execute the workload to make the delivery in the given timeframe. For the look, we found a language that kept the story focused on the foreground, balanced with bombed cities, smoke and VFX in the background without distracting the audience.Kate Winslet and Andy Samberg in Lee. Part of the visual effects work consisted of transforming modern landscapes into WWII-era London and Germany.Due to the fast turnaround and nature of the project, Jellyfish held remote reviews almost daily to finesse the work creatively while director Ellen Kuras was in New York. This meant we could work the shots up without impacting the schedule, Putze notes. Producer Kate Solomon and Post-Production Supervisor Portia Napier were instrumental in ensuring we had maximum access to Ellen and were included in the grading process to make the final detailing efficient. Once everything was in a good place creatively, we hosted 4K screening room reviews in our theater for sign-off. It was a true collaboration in every sense of the word and made for the smoothest possible execution.Lee Millers original photographs served as the main source of reference and inspiration for Jellyfish. Having access to Millers private archives proved to be invaluable in order to accurately depict her experiences. 
Translating Millers black-and-white analog still photographs into live-action moving imagery while preserving their original impact proved to be a challenging task for the VFX team. Since the film was focused on Lees photography, the VFX needed to capture both the tone and style of her photos, Putze remarks. Lees war photography is black-and-white, grainy, high contrast, with a handheld camera somewhat impossible to marry with the smooth, perfectly-captured pictures by [Director of Photography] Pawel Edelman, but we found a golden middle to integrate them into the look of the VFX scenes.Lee Millers original photographs served as the main source of reference and inspiration for Jellyfish. (Photo courtesy of Elevation Pictures)Usually, VFX projects are faster, bigger, stronger, creating VFX you often have never seen before. Lee, in contrast, is a true artistic film with a strong emotional message. It shows the importance of reporting the atrocities of war from the first female war photographer. We really enjoyed calming down the amplitude of VFX to not overtake the story.Ingo Putze, VFX Supervisor, Jellyfish PicturesJellyfish were tasked to match the bombed-out church shot where Miller took her most famous photo. We managed to find out how this building looked before the bombing through old postcards today it is a modern building on the same spot, Putze explains. Millers photo You Will Not Lunch In Charlotte Street, which is featured in one shot, was actually Goodge Street. A lot of photo material was taken from similar buildings and architecture to recreate London in the 1940s. Miller witnessed the worlds first use of napalm bombing. As a base, we colorized the original black-and-white photo then replaced it with real reference and added simulated smoke explosions, animated water, etc.Translating Millers black-and-white analog still photographs into live-action moving imagery while preserving their original impact proved to be a challenging task for the Jellyfish VFX team.The close-up of the bombed church in London was one of the most challenging shots. The blueprint for the shot was established when Lee takes a photo of the church. We needed to be true to her photo and also match the live photography, which was Kate Winslet on bluescreen with a pile of bricks. Extensive research went into it and getting the right texture elements for the DMP in execution it was a 2.5D projection to simulate the camera move. It was certainly a lot of work for 15 frames in focus.The VFX requirements on Lee encompassed a diverse range of tasks, including set extensions and digital matte paintings to help enhance Edelmans live-action photography. There is a scene where Lee is at an airfield where we needed to add CG planes in the background, then landing on a field airport in France and adding a sea of army tents with animated planes on the runway; adding injuries, wounds and burns to a soldier, adding fire to a burning building, simulating propaganda leaflets raining from the sky, hotels scenes and Millers home which was shot on bluescreen and needed BG replacement, [as well as] adding explosions, ground impacts and smoke to the combat scenes, Putze details.CG planes were added into the background for the airfield sequence, along with a sea of army tents in the foreground. The VFX needed to capture both the tone and style of Lee Millers photos.We managed to find out how this building looked before the bombing through old postcards today it is a modern building on the same spot. 
A lot of photo material was taken from similar buildings and architecture to recreate London in the 1940s. Miller witnessed the worlds first use of napalm bombing. As a base, we colorized the original black-and-white photo then replaced it with real reference and added simulated smoke explosions, animated water Ingo Putze, VFX Supervisor, Jellyfish PicturesThe Jellyfish team was extremely moved and emotionally challenged when researching original photos of the Dachau KZ camp where Miller witnessed atrocities by the Nazis. To be authentic, we needed to add dead bodies and compose them into the live action, Putze says. Understandably, some artists found it hard to work on this scene, which of course we respected, so we had to assign the tasks and handle the feedback incredibly sensitively. Feedback for this sequence was handled delicately while still achieving the level of realism that was required to honor Millers documentation of war and the brutality of its visuals.Jellyfish were also responsible for creating bombed cities and incorporating explosions.Jellyfish added explosions, ground impacts and smoke to the combat scenes.When it came to initial conversations about the look of the film, Jellyfish had a short window between locked edit, shot turnover and delivery.Authentically recreating Lees journalistic endeavors of WWII was a delicate balancing act for Jellyfish, which delivered 180 VFX shots for the film.The visual effects work also consisted of transforming modern landscapes into WWII-era London and Germany, creating CG war planes, enhancing wounds, adding injuries, extending crowds, adding CG fire and water hose elements to the burning Vogue building, replacing windows using bluescreen, creating bombed cities and incorporating explosions. Jellyfish delivered 180 visual effects shots for the film. At Jellyfish Pictures, we have a strong environment and DMP team, Putze notes. A lot of layers and cards came from that department and were projected in a 2.5D setup in compositing. This made the show much lighter in 3D tasks.Authentically recreating Lees journalistic endeavors of WWII was a delicate balancing act for Jellyfish. Comments Putze, Im incredibly proud of the team and what we achieved. Usually, VFX projects are faster, bigger, stronger, creating VFX you often have never seen before. Lee, in contrast, is a true artistic film with a strong emotional message. It shows the importance of reporting the atrocities of war from the first female war photographer. We really enjoyed calming down the amplitude of VFX to not overtake the story. I was proud to recommend this film I worked on to my family and friends. My personal background is art direction and matte paintings, which were used heavily in the execution of Lee, so it was close to my heart.
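The 2.5D projection workflow Putze describes, matte-painting layers and cards projected through one camera and re-photographed with a moving one, follows a fairly standard compositing pattern. The sketch below is only a minimal illustration of that general technique, assuming Nuke's Python API; the file path and node names are hypothetical placeholders rather than Jellyfish Pictures' actual setup.

```python
# Minimal 2.5D camera-projection sketch, assuming Nuke's Python API.
# "church_dmp_v003.exr" and the node names are hypothetical examples.
import nuke

def build_25d_projection(dmp_path="church_dmp_v003.exr"):
    """Project a still matte painting onto a card, then re-photograph it
    with an animated camera to fake parallax over a short move."""
    dmp = nuke.nodes.Read(file=dmp_path)

    # Static camera that matches the painted frame; it does the projecting.
    proj_cam = nuke.nodes.Camera2(name="projectionCam")

    # Animated camera that re-photographs the card (the simulated move).
    render_cam = nuke.nodes.Camera2(name="renderCam")
    render_cam["translate"].setAnimated()

    # Glue the painting onto simple geometry via Project3D, then a card.
    project = nuke.nodes.Project3D(inputs=[dmp, proj_cam])
    card = nuke.nodes.Card2(inputs=[project])
    scene = nuke.nodes.Scene(inputs=[card])

    # ScanlineRender re-renders the scene through the moving camera:
    # input 1 = scene/geometry, input 2 = camera.
    render = nuke.nodes.ScanlineRender()
    render.setInput(1, scene)
    render.setInput(2, render_cam)
    return render
```

In practice the painting would be split across several cards at different depths so nearer layers shift faster than distant ones, which is what sells the parallax over a brief, focused move such as the 15 frames Putze mentions.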
  • ILM GOES INTO ALTERNATE MODE FOR TRANSFORMERS ONE
    www.vfxvoice.com
By TREVOR HOGG. Images courtesy of Paramount Pictures. After returning to feature animation with Ultraman: Rising, ILM released its second offering that comes from a franchise that made the VFX company synonymous with the live-action version. Transformers One was directed by Josh Cooley, who set the story back when sworn enemies Optimus Prime and Megatron were inseparable, mischievous buddies called Orion Pax and D-16. One of the good things about ILM is that we have many different varieties of visual effects shows, so our pipeline and assets need to be versatile, notes Animation Supervisor Stephen King, who divided and conquered the animation with fellow supervisors Rob Coleman and Kim Ooi. The pipeline itself didn't change too much from how we handle our visual effects shows. Where there is a difference is, rather than working on a shot for this or that sequence, on an animated feature everything moves forward together within a sequence. There is a lot more time and structure built into these animated features, which is nice. Lead Character Designer Amy Beth Christenson explores color options for Airachnid. Being that it was going to be such a clean style for this film, it immediately became clear that they were going to have to transform onscreen without cheating. We had to solve the transformations on the concept art side to make it easier for everybody down the line. Once we had that narrowed down, I immediately started building everything in 3D as a concept art model and kept it in the same file. Amy Beth Christenson, Lead Character Designer King had previously worked on Transformers: Revenge of the Fallen, Transformers: Dark of the Moon, Transformers: Age of Extinction and Transformers: The Last Knight as well as Transformers: The Ride 3D. Not only did we use and tweak some of our specific tools that were developed for the live-action films to help them transform, we also learned that because these are big bulky characters sometimes it's hard to get clean silhouettes for action and fighting, King states. It meant that we had to stress strong dynamic poses so the audience can read [the action], and there's not that confusion because the action happens quickly. It was good because the director, Josh Cooley, was into us suggesting things like, You would get a stronger pose if we do this kick instead of a punch here. It was a great collaborative effort. Rather than follow the norm of transforming into a vehicle, Alpha Trion takes the shape of a lion. Storyboards were created and the initial layout was provided by Christopher Batty at Paramount Animation, who served as the cinematographer on the project. We always had something that helped to ground us, but it is freeing, King remarks. Michael Bay is a fantastic cinematographer and does cool camera moves, but you're always trying to put them into the plate, so you're cheating the camera. Whereas with this one, you're doing the action as it would be done on set. You adjust the camera to the action. Unlike Michael Bay, who is known for two-second shots, Cooley had an opposite approach. Josh was keen to do long shots. There is a shot on the rooftop that is over a minute long. He said, Why do things with multiple cameras, takes and cuts when I can let the audience get in and breathe what's going on? Sometimes, we had multiple animators working on the same shot. One animator can focus on one specific action and another can work on other bits that are happening around it.
We tried to bring it up to a certain level, show it to the director, and if were heading in the right direction then wed add some polish and refinement and keep going, King explains.As D-16 becomes Megatron, he gets bigger and heavier, and you can feel the weight. Orion Pax is a dreamer, has a carefree bounce to his step and doesnt have the weight of responsibility. But as Orion Pax transforms, he is forced into that leadership role, and we changed his pose. Orion Pax stands more upright and his shoulders are back more. Not only does he physically change but also his persona.Stephen King, Animation SupervisorCharacter Designer Evan Whitefield conceptualizes B-127 as a miner.The toys, comic books, live-action movies and the animated series were considered to be a point of reference. We started to visually unpack everything and discussed what we wanted our proportions to be and what is the style of the movie, states Lead Character Designer Amy Beth Christenson. Josh and Jason William Scheier [Production Designer] wanted to lean into Art Deco and J.C. Leyendecker. Josh wanted the faces to emote, so we did some studies of panel work going back to the 1980s cartoon, which had much more metal skin and worked much better to get the facial expressions out. It always defaulted back to the 1980s cartoon, which was nice because from the get-go this was going to be an animated movie with its own style. Being such an uber fan, I also wanted to make sure it was distinctly recognizable and pulling in the main parts while still doing something new.Characters had to fit into the overarching style but still have distinct elements in their own right. I made sure that the negative space inside the helmet was distinctive for every single character, and they had a shape language, Christenson remarks. It was nice going back and forth between the helmet and the face making sure that they felt of one piece. B-127 has a lot more round shapes than anybody else because hes friendlier and funnier. The character design reflects the arc of the cast members. Josh brought up the fact that a lot of the times when youre going up with your friends, youll dress and act the same. We started almost with the same silhouette for Orion Pax and D-16. But if you look at Megatron, the toy from the 1980s, he is square. We went from round shapes for D-16, and I began adding in more squares and repeating triangle angles because thats a much more aggressive shape. Thats the best example where we changed the shape language but tried to keep the silhouette so you could recognize D-16 as a miner with a cog as Megatron, Christenson explains.Amy Beth Christenson imagines Bumblebee with a cog and his alternate mode.Adding further complexity to the character designs was the fact that Transformers are given that name for a reason. Being that it was going to be such a clean style for this film, it immediately became clear that they were going to have to transform onscreen without cheating, Christenson states. Beyond that, there are three versions of Orion Pax and Megatron and two versions of most of the other characters. We had to solve the transformations on the concept art side to make it easier for everybody down the line. Once we had that narrowed down, I immediately started building everything in 3D as a concept art model and kept it in the same file. I would have the robot and the alt mode, and I would make sure that I was instancing the piece over so that the chest is going to be the same scale and piece as the front of the big rig. 
Concept art models were keyframe animated to demonstrate the transformations. We had to work out everybody at the same time to make sure that Bumblebee didn't suddenly transform bigger than a big rig, like Megatron's tank. For Elita and Bumblebee, we had to keep them hollower on the inside so when transforming into a bike or car they took up less space, because there were these cavities where things could go in tighter. Whereas, Orion Pax and D-16 were solid on the inside and have parts that can expand, which allowed them to get bigger than they should, Christenson explains. We started almost with the same silhouette for Orion Pax and D-16. But if you look at Megatron, the toy from the 1980s, he is square. We went from round shapes for D-16, and I began adding in more squares and repeating triangle angles because that's a much more aggressive shape. That's the best example where we changed the shape language but tried to keep the silhouette so you could recognize D-16 as a miner with a cog as Megatron. Amy Beth Christenson, Lead Character Designer A facial study of Megatron. Being given the ability to transform alters the mindset of the characters, which was incorporated into the animation. As D-16 becomes Megatron, he gets bigger and heavier, and you can feel the weight, King remarks. Orion Pax is a dreamer, has a carefree bounce to his step and doesn't have the weight of responsibility. But as Orion Pax transforms, he is forced into that leadership role, and we changed his pose. Orion Pax stands more upright and his shoulders are back more. Not only does he physically change but also his persona. Developing an emotional connection with the audience meant having the characters emote in a realistic manner. Josh still wanted them to feel like robots, so we don't have a cheekbone that our skin rises over. The eyes move more like camera lenses and apertures. We didn't make them blink because those were saved for more emotional beats. For each shot, an animator would capture reference footage of themselves performing the required action. It's my favorite style because it helps the animators get into the characters and is definitely where the industry has moved to get that realism and detail that audiences want to see now, King reveals. Because we didn't want to go down the wrong path for the long shots, we would often show Josh our reference before presenting him with the animation blocking. Directing the eye of the viewer was accomplished through lighting and color. It's a colorful, saturated movie, but for each shot lighting, [Production Designer] Jason William Scheier directed the eye [of the viewer] while subtly darkening or desaturating the background because we didn't want the audience to look as if our characters would pop out.
We played Alpha Trion smaller and subtler to make him feel wise. An ancient alien race is the archenemy of the Transformers. The Quintesson boss has these floaty tentacles, so we've got this nice organic movement underneath her, which is a nice juxtaposition to the robotic movements, King notes. Considerable effort went into making sure the negative space inside the helmet was distinctive for every single character and that the helmet and face felt of one piece. A sketch of Orion Pax in his alternate mode. [Airachnid] was a lot of fun to animate because we had her walking on her legs and doing big acrobatic flips into her transformation. On top of that, she has all of these sets of eyes on the side, which is a big story point, giving them the spider quality of always looking around independently. You feel the importance of it exactly when she locks all of them onto something. Stephen King, Animation Supervisor Transformations happen with background characters as they go about their daily lives. There was a lot of world-building in a movie like this, King states. We did anything to make it feel like a real organic world. We did a couple of iterations with the fan and getting the timing right. It's supposed to be this big fan that turns on and pushes a lot of air, so it can't move too quickly, otherwise it won't have that weight, but at the same time, it had to be fast enough to be dangerous. We had it go fast on the initial startup, then it slowly decreases over time. Fun was had with the vehicles. Describes King, D-16's tank has some nice details such as independent treads and legs so he can maneuver around uneven surfaces. Elita is a tri-bike with independent suspension, so when she's banking around corners, we tested and pushed how far she could lean over and get cool, dynamic poses, especially when beating up all of the Trackers in the middle of the movie. Serving as the main antagonist is Sentinel Prime, voiced by Jon Hamm. Taking part in the Iacon Race is a female racer and her alternate mode. Leading the ancient alien race known as the Quintessons is Quintus Prime. Mapping out the evolution of Orion Pax into Optimus Prime. Exploring how the scale changes going from D-16 to Megatron. Malleable metal skin was given to their faces to enable the characters to emote better. Wreaking havoc as the enforcer for Sentinel Prime is Airachnid, who has the ability to transform into a helicopter. B-127/Bumblebee gets excited about being able to generate knife hands once a cog gets inserted into his body. Orion Pax was given a carefree demeanor. The eyes were treated as camera lenses to allow characters like D-16/Megatron to emote while remaining robotic. The primary visual reference for the movie was the 1980s animated series. Reaching the surface of Cybertron, B-127, Elita, Orion Pax and D-16 encounter Soundwave, Starscream and Shockwave. Harkening back to the wild craziness of the pod race in Star Wars: Episode I - The Phantom Menace and the car race in Ready Player One is the Iacon race that Orion Pax and D-16 hijack despite their inability to transform. From the first script read, that was the sequence ILM keyed on because we knew it was going to take all of our skills and departments, King reveals. We have characters racing and interacting, then we add the complexity of this road that has to come out, transform in front of them, move and have its own life.
It would have been very difficult to do in live-action, so that is one of the great things about doing an animated feature.Watch the transformation videos featuring key character designs and models in Transformers: A Design Case Study from ILM. Click here: https://www.ilm.com/art-department/transformers-one-case-study/
  • HOW TERRITORY STUDIO DEVELOPED THE GRAPHIC LANGUAGE FOR ATLAS
    www.vfxvoice.com
    By TREVOR HOGGImages courtesy of Territory Studio and Netflix.Given that computer technology has become an integral component of everyday life, whether at work or home, user interfaces and screen graphics can easily be taken for granted, but a lot of thought goes into designing them, as reflected in the work Territory Studio did for the Netflix film Atlas. The sci-fi adventure, starring Jennifer Lopez and directed by Brad Peyton, required 300 screens, HUDs (Heads Up Displays) for Arc robots, and a way of visualizing the voice and emotional state of an AI entity known as Smith. These tasks were guided by Production Designer Barry Chusid and Visual Effects Supervisor Lindy DeQuattro.Territory Studio spent more than two years on the project. I always say that our storytelling with the universal language of design is what helps directors sometimes escape an entire scene because we can show a map with an A, B and a line, and the whole audience quickly gets that these guys are going from A to B, states Marti Romances, Co-Founder and Creative Director of Territory Studio. We look at the script and figure out what can be done practically. We understand how DPs love to get the lens and the light that comes from certain screens, but in the case of Atlas, there were a lot of things that were impossible to shoot in-camera, like holograms, which happen in a visual effects pipeline. The reason why we were on Atlas for so long was that we were prepping the graphics to be played back on set and working with the art department to develop the language. As shooting was wrapping, we were already doing concepts for the post-production visual effects.We are not beginning with a template. Rather, we always treat it as something to be novel and more towards what the director wants. In this case the director wanted to aim for simplicity, and Pacific Rim was the opposite. Pacific Rim was like a glass hologram, whereas with Atlas, imagine you have lasers sketching a hologram. If you approach it from that angle and really listen to what the director wants, you are always going to end up with something new.Marti Romances, Co-Founder and Creative Director, Territory StudioInspiring the graphic design for the International Coalition of Nations were government and military-industrial-complex institutions like the CIA.Having art directors and production designers act as a bridge between production and post-production ensures that there is no miscommunication as they understand the story behind each set, prop and graphic. There are projects where we did the on-set graphics and someone else did the post and vice versa, Romances notes. The nice thing about a project which uses the same company for both is its a more cohesive visual language. There were a number of efficiencies. For example, we didnt need to create a lot of the design language and icons from scratch because of our on-set library. Territory Studio thrives working for Hollywood and Silicon Valley. We believe that there will be different influences here and there from one industry to the other. General design principles apply no matter the industry. What is the best user experience for the viewer who is being entertained and how can we make it so that a broader audience can understand? But we also do that when were attacking an electric vehicle. You may have never had an electric vehicle, and we want to make sure you feel comfortable, Romances says.HUDs are a familiar high-tech element in cinema, in particular those associated with Iron Man and Pacific Rim. 
Despite this recognition, Territory Studio starts from scratch every single time. We are not beginning with a template, Romances notes. Rather, we always treat it as something to be novel and more towards what the director wants. In this case the director wanted to aim for simplicity, and Pacific Rim was the opposite. Pacific Rim was like a glass hologram, whereas with Atlas, imagine you have lasers sketching a hologram. If you approach it from that angle and really listen to what the director wants, you are always going to end up with something new. The directive was for militaristic AR in 2070. We were passing all of our toolkits and animated material to five different vendors, and the comment from Lindy was, If this graphic is here, I need to see the light source. We were trying to ground it in a more realistic way, rather than having something floating, not knowing where its anchored. Props to Lindy for keeping an eye on making sure the everyone understood that this is how it should be portrayed in the frame.The graphic for the voice of the Smith AI has a core surrounded by several transparent wrapping spheres, which gives it a simple and sophisticated visual aesthetic.The brief was, What if we imagine an augmented reality version of the visual that we get for Siri on our iPhones that floats around you? We did an immense number of design iterations and ended up with something that is simple yet could be mobile and could be projected from every corner of that HUD. We needed to give it some character and a connection to the vocal performance. I would have loved to do a PR stunt where we question why the Oscars are not considering Smith for Best Supporting Actor! It was a special element in the film.Marti Romances, Co-Founder and Creative Director, Territory StudioPartnering with counter-terrorism analyst Atlas Shepherd (Jennifer Lopez) is an AI known as Smith (voiced by Gregory James Cohan) that serves as the operating system for the mech robot she is piloting. When the AI was getting angry or witty, we almost had different modes but not necessarily a big change in the design, Romances remarks. It was more like, Lets play with color and the animation, and the reaction of the audio wave that is connected to the voice. How do we make sure that comes across as something that is too aggressive? When the movie Her came out, someone said to me, The future doesnt have any UI. That will never work. Humans are visceral animals. We need an anchor point. Thats why the Alexa has a little light that tells you if its thinking or talking. We approached it in the same way, and Brad wanted exactly that. The brief was, What if we imagine an augmented reality version of the visual that we get for Siri on our iPhones that floats around you? We did an immense number of design iterations and ended up with something that is simple yet could be mobile and could be projected from every corner of that HUD. We needed to give it some character and a connection to the vocal performance. I would have loved to do a PR stunt where we question why the Oscars are not considering Smith for Best Supporting Actor! It was a special element in the film.Driving the graphic design process was the need to clearly illustrate story points and move the narrative forward.Rather than rely on the typical audio waveform, which is line-oriented and has peaks and valleys much like a mountain range, a different approach was adopted to showcase the voice of Smith. 
Its the same amount of distortion that you would imagine in a two-dimensional line, but we applied it to a few wrapping transparent spheres that are surrounding the core, Romances explains. It was not only how it vibrates but how much it retracts or expands depending on tone. We added some particle elements so it isnt as simple as a few spheres on top of each other combined with the lighting, nice compositing, animation and color. We came up with enough variables to create a real character which was the endgame.The Centurion Tech is used by humanoid AI terrorist Harlan Shepherd (Simu Liu) and his agent Casca Decius (Abraham Popoola). It resembled the different blocks you get when looking at the fragmentation of a memory disk. Brad wanted Centurion Tech to be something that you could not understand. It was monochromatic, and we wanted to pair it with the same technology that Atlas has when different holograms appear in front of her. The only time we get to see a visualization of the voice from Centurion Tech is when Casca jams the signal on the Smith UI. Because Smiths system is being hacked, the visualization had to sit somewhere between as it was being pushed and forced to send the message.Shifting back and forth between consumer products and film and television projects has enabled Territory Studio to apply lessons learned from both experiences to their design methodology.Topographical projection with the light source indicated to ground the technology.An interesting challenge was the International Coalition of Nations (ICN) Interrogation Room, which took inspiration from the CIA. There were certain times when the director started asking for crazy things in the interrogation moment where Atlas comes out and deploys an entire system of holograms that appear next to her as she tries to extract the coordinates of Harlans base from Casca, Romances states. This is the type of storytelling that without those graphics would be difficult for the audience to understand what is happening.All of the work done during shooting was extremely helpful for the lead actress. We could show Jennifer Lopez that we are clearing out all of this space. So, when were shooting towards the exterior of this cockpit, make sure she remembers that in this area, she will have a voice talking to her and in that area she is going to have a menu. We created all of these rules for the HUD prior to starting post-production so Jennifer could have a nice visual cue on where that will be sitting in front of her, how big and far away it will be. That was critical for her.We could show Jennifer Lopez that we were clearing out all of this space, so when were shooting towards the exterior of this cockpit, make sure she remembers that in this area, she will have a voice talking to her and in that area she is going to have a menu. We created all of these rules for the HUD prior to starting post-production so Jennifer could have a nice visual cue on where that will be sitting in front of her, how big and far away it will be. That was critical for her.Marti Romances, Co-Founder and Creative Director, Territory StudioInnovation was required in order to be able to share 300 assets with MPC, Scanline VFX, ILM, Cantina Creative and Lola VFX. One of the best solutions that we had for that is we created a Nuke toolkit, with the biggest one being for the Arc HUD, Romances remarks. The nice thing about this toolkit was that in one file you have all of the different modes of the HUD, similar to drive mode in car. 
When you're in combat mode, the HUD became more militaristic and combat-ready to aim and shoot. When you were on a normal mode or what we called the initiation module, that was a completely different setup. All of these setups are versions of the same HUD with different linework, icons and colors. You just need to comp this once then go to our script, change it to combat mode, rather than having to import a different set of graphics and EXRs again. Also, we were looking at things that have a lot of parallaxes. All of the EXRs were multi-layered and coded into the Nuke script, which means you don't have to create different depth layers. You create that once. We could update the EXR layer without having to re-engage the whole graphic system again in the final comp. This is where you find efficiencies, especially when working with other vendors. To assist Jennifer Lopez in believably interacting with the Smith AI, an Arc HUD guideline was put together so she had a sense of the layout, position and function of each graphic. The Smith AI voice graphic was treated as a floating augmented reality version of the Siri icon found on the iPhone. The Centurion Tech was not meant to be comprehensible, as it was created by AI and not humans. Among the elements used to give Smith AI a personality were variants in color, and the core expanding and retracting. In order to be able to share 300 assets with MPC, Scanline VFX, ILM, Cantina Creative and Lola VFX, a Nuke toolkit was created by Territory Studio that streamlined the process. Every project starts from scratch for Territory Studio, as the focus is on capturing the vision of the director. One of the important aspects of graphics is orienting the audience to where the various characters are situated in relation to each other. The Arc HUD has various modes, with one of them being for combat. Around 120 on-set screens were created that were all animated, including the coffee machines at the ICN. One interesting part that we never experienced is that there were a few scenes where Brad wanted to take our on-set graphics and augment them in post, Romances notes. This was a beautiful moment for us because we're now being asked to extract those on-set 3D graphics, which is not usually the case. There is a moment where Atlas is synchronizing to a different level with Smith and sees all of the graphics that we had produced for on-set floating towards her. And there's another shot where she's in the medical bay and is trying to call Smith to save her. To have the director say, You did this, all of this language. Can you now bring it into post? That was a special moment for us because it's like seeing the manual for Smith. It was cool. Sadly, there is a lot of cool stuff that never made the cut, like a floating visor for the Centurion. That's another thing I would highlight. Sometimes, there is so much more that we do that never gets to see the light of day, but we had lots of fun doing it.
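The multi-mode toolkit Romances describes, one comp setup whose mode switch selects different layer sets of a single multi-layered EXR, can be pictured with a short sketch. The snippet below is a hypothetical illustration assuming Nuke's Python API; the group name, layer names, knob name and file path are invented for the example and are not Territory Studio's actual gizmo.

```python
# Hypothetical multi-mode HUD toolkit sketch, assuming Nuke's Python API.
# Layer names, knob name and file path are illustrative only.
import nuke

def build_hud_toolkit(exr_path="hud_arc_v001.exr"):
    """Build a group exposing one 'hud_mode' pulldown that switches
    between layer sets of a single multi-layer EXR, so downstream comps
    wire the graphic in once and just flip the mode per shot."""
    modes = ["initiation", "combat", "diagnostic"]

    # Register the layers so the Shuffle nodes can reference them.
    for m in modes:
        nuke.Layer(m, [m + ".red", m + ".green", m + ".blue", m + ".alpha"])

    group = nuke.nodes.Group(name="ArcHUD_Toolkit")
    group.begin()

    read = nuke.nodes.Read(file=exr_path)

    # One Shuffle per HUD mode, each pulling a different EXR layer.
    shuffles = [
        nuke.nodes.Shuffle(name="mode_" + m, inputs=[read], **{"in": m})
        for m in modes
    ]

    # A Switch selects the active mode and feeds the group output.
    switch = nuke.nodes.Switch(inputs=shuffles)
    nuke.nodes.Output(inputs=[switch])
    group.end()

    # User-facing pulldown on the group drives the internal Switch.
    group.addKnob(nuke.Enumeration_Knob("hud_mode", "HUD mode", modes))
    switch["which"].setExpression("parent.hud_mode")
    return group
```

Because every mode lives behind one node, a vendor comps the gizmo once and flips hud_mode per shot instead of re-importing a new stack of graphics and EXRs, which mirrors the efficiency Romances points to.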
  • STAGING THE FINAL SHOOT-OUT AND GETTING FLIPPED OVER A CAR FOR WOLFS
    www.vfxvoice.com
    By CHRIS McGOWANImages courtesy of AppleTV+.After having worked together on two MCU Spider-Man movies Homecoming and Far From Home and having had a great time doing so, writer-director Jon Watts and VFX Supervisor Janek Sirrs teamed again on Wolfs, an action comedy in the fixer category that is light years away from the MCU. Regarding Watts, Sirrs recalls, I think I helped with his baptism of fire into the Marvel VFX world, so perhaps he felt he owed me one. Either way, I think we both have a similar dark sense of humor and an appreciation of the absurd. He probably thought I was a good fit for Wolfs given its black comedy undertones.Previously, Sirrs shared an Oscar for The Matrix as Associate Visual Effects Supervisor and shared nominations for Iron Man 2 and The Avengers as Visual Effects Supervisor. For The Matrix, I originally came on board to help out with a show that was already up and running, but the role dramatically expanded once I arrived on site, Sirrs says. On both of those Marvel shows I was the Overall Production Supervisor, [arriving] during early development stages, running all the way through until the last shots were delivered in post.Brad Pitt and George Clooney in the freezing cold of New York. Despite shooting in the middle of winter in New York, it only snowed about 30 minutes total, so, for consistency, nearly every shot with snow in the final movie has digital snow in the air and/or on the ground.Because we werent allowed to film on the real [Brooklyn] bridge itself, we recreated select limited portions a construction site at the base, a gangplank running through the arches, the catwalk above a cross street, the ladder that leads from the catwalk to the top and the roadway up top as partial set pieces surrounded by bluescreen that were then digitally extended. All these partial sets were built in the overflow parking lot at Six Flags Magic Mountain amusement park just north of Los Angeles [in Valencia].Janek Sirrs, Visual Effects SupervisorIn the Apple TV+ film Wolfs, two lone-wolf, super-efficient rival fixers (George Clooney and Brad Pitt) are called in separately to clean up the seemingly accidental death of a twenty-something man (the Kid) in a very high-end hotel room. The panicking guest is Margaret, a Manhattan district attorney (Amy Ryan) who fears a huge scandal and has contacted an unnamed fixer, identified later only as Margarets man (Clooney), to cover things up. Meanwhile, Pam (the owner of the hotel, voiced by Frances McDormand) has viewed everything through hidden cameras and called in another fixer, referred to as Pams man (Pitt), to tidy up.The film relied solely on storyboards, many of them mocked up by director Jon Watts using a 3D storyboarding program called Frameforge.Most reluctantly, the two professionals are compelled to join forces despite their egos and bickering. (Clooney and Pitt previously worked together on the Oceans trilogy.) Complicating matters further, the Kid (Austin Abrams) was carrying a backpack full of drugs that belongs to the Albanian mafia. Plus, the Kid wasnt actually dead; he was merely overdosing. After waking up, he escapes, clad only in his underwear, and Jack and Nick must chase him down freezing New York City streets. To clean up a mess that just keeps growing, Jack and Nick must also deal with various unexpected factors such as when they get sidetracked by a Slavic kolo dance at a wedding and are recognized by Dimitri, a Croatian mobster (Zlatko Buric). 
The cast also includes June (Poorna Jagannathan) and the Kid's dad, Richard Kind.
In addition to writer/director Watts, the production included Larkin Seiple (Cinematography), Jade Healy (Production Design), Andrew Max Cahn (Supervising Art Director), and Conrad V. Brink Jr. and Elia P. Popov (Special Effects Supervisors). Plan B Entertainment and Smokehouse Pictures co-produced with Apple, which has signed Watts to direct, write and produce a Wolfs sequel.
Clooney and Pitt talk to a blood-covered Margaret (Amy Ryan). Interiors were mostly a stage build shot at the Ace Mission Studios in downtown Los Angeles.
We had originally thought that the car-flipping moment would be the trickiest, requiring more extensive digi-double work than we ultimately needed. What saved us from uncanny valley hell was deciding early on that we'd never need to show the entire stunt in one unbroken moment and could split it into several smaller, more manageable chunks. Added into the mix was Austin [Abrams]' game-for-anything attitude, which meant that we never had to deal with any sort of digital face replacements.
Janek Sirrs, Visual Effects Supervisor
The primary vendors that handled the bigger scenes were Framestore and beloFX. Framestore essentially did the major stuff in the first half of the movie, while beloFX focused on the latter half. Capital T took care of all the other miscellaneous smaller FX across the whole picture. Rodeo FX was also brought in to specifically handle the final diner scene, Sirrs explains. No previs was done on the show at all. Instead, we went all primitive and relied solely on good old-fashioned storyboards. Director Jon Watts actually mocked up many of his own boards himself, using a 3D storyboarding program called Frameforge.
Pitt and Clooney threaten the Kid (Austin Abrams on the bed) to find out the origin of the stolen drugs in his backpack. In Wolfs, two lone-wolf rival fixers (Clooney and Pitt) are called in separately to clean up the seemingly accidental death of the Kid in an upscale hotel.
Exteriors, except for the final warehouse shoot-out and the Brooklyn Bridge partial set work, were filmed on location in New York, which meant a series of long, cold winter nights running around Chinatown and parts of Queens and Brooklyn, Sirrs recalls. The warehouse exteriors were filmed in the thankfully much warmer warehouse district just east of downtown Los Angeles. And the ensuing warehouse interior action was shot in one of those warehouses. Pretty much every other interior was a stage build shot at the Ace Mission Studios in downtown Los Angeles.
Bluescreen was used for the Brooklyn Bridge portion of the chase sequence. Sirrs explains, Because we weren't allowed to film on the real bridge itself, we recreated select limited portions (a construction site at the base, a gangplank running through the arches, the catwalk above a cross street, the ladder that leads from the catwalk to the top and the roadway up top) as partial set pieces surrounded by bluescreen that were then digitally extended. All these partial sets were built in the overflow parking lot at Six Flags Magic Mountain amusement park just north of Los Angeles [in Valencia]. This meant we could have real cars driving through the upper roadway set. Sirrs adds, The obvious big VFX scenes were the chase through Chinatown and the immediately following Brooklyn Bridge action. But the shoot-out outside the warehouse is pretty much 100% VFX-enhanced in terms of the weather conditions. 
We did utilize LED projection for the various car interior driving scenes, but only as simple flat screens projecting process plate, not any sort of 3D volume.
Pitt and Clooney threaten each other in the wedding party scene where they are recognized by the Croatian mobster Dimitri (Zlatko Buric), surrounded by cameras, lighting and sound equipment in a behind-the-scenes shot.
The Brooklyn Bridge action and surrounding driving scenes were probably the most complicated scenes of the film, logistically. Stunt driving in New York is very restrictive these days, with essentially all vehicles having to follow the normal flow of traffic and not exceed posted speed limits. Taken together, that makes constructing a high-speed chase scene a little bit challenging. The final result relied upon a combination of practical stunt driving and digital vehicles for the faster, more extreme moments, Sirrs explains.
Stunt driving in New York is very restrictive these days, with essentially all vehicles having to follow the normal flow of traffic and not exceed posted speed limits. Taken together, that makes constructing a high-speed chase scene a little bit challenging. The final result relied upon a combination of practical stunt driving and digital vehicles for the faster, more extreme moments.
Janek Sirrs, Visual Effects Supervisor
Snow, both falling and settled on the ground, was an invisible effects challenge. Despite shooting in the middle of winter in New York, it only snowed for real for about 30 minutes total, Sirrs notes. Shooting permits only allowed SPFX-crushed ice to be put down on sidewalks, not on the streets themselves, and wind conditions in the street made using SPFX foam snow towers erratic at best. So, we really didn't capture the desired weather look in-camera. In the final movie, nearly every shot that you can see snow in has a healthy dose of digital falling snow and/or digital snow ground cover. Snow accumulation also had to track across the entire picture, starting from clear to a more winter wonderland look by the time we reach the climactic shoot-out.
The Kid (Austin Abrams) executes a flawless flip over a moving car, with the help of one digi-double shot, bluescreen, a buck and wires and Abrams' game stunt work.
Director Jon Watts consults with Pitt and Clooney, who previously worked together on the Ocean's trilogy. Exteriors, except for the final warehouse shoot-out and the Brooklyn Bridge partial set work, were filmed on location in New York. The final shoot-out and warehouse exteriors were shot in Los Angeles.
Behind the scenes on Wolfs with writer/director Jon Watts, a veteran director of films in the MCU, including three Spider-Man films.
The scene of the Kid leaping and flipping over a car was a particular challenge and involved VFX and Austin Abrams doing his own stunt work. With the exception of one digi-double shot, everything is ultra-high-speed composites of Austin on a bluescreen stage being hit with a soft foam, blue, car-shaped buck and flipped on wires, combined with plates of the real car driving slowly through the Chinatown street location, Sirrs describes. On stage, we could light Austin brightly enough to shoot at roughly 1,000 fps, but on location we were limited by how much light we could physically cast onto the street, and so we had to fake slow motion by shooting at 24 fps and driving very slowly. Adding slow-motion digital falling snow atop everything was the final step in selling the frame-rate trickery. 
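The frame-rate trick Sirrs describes for the Chinatown plates, faking slow motion by shooting at 24 fps and driving very slowly rather than cranking to 1,000 fps, comes down to simple arithmetic. The sketch below is an illustrative calculation only; the story speed is an assumed figure, not a production number.

```python
# Shooting at 1,000 fps and playing back at 24 fps slows motion by roughly 42x.
# To match that look while actually capturing at 24 fps, the on-location car
# has to move about 42x slower than the speed it is meant to represent on screen.

def practical_speed_mph(story_speed_mph, capture_fps, playback_fps=24.0):
    """Speed to drive on location so 24 fps footage matches high-speed playback."""
    slowdown = capture_fps / playback_fps
    return story_speed_mph / slowdown

if __name__ == "__main__":
    # e.g. a car meant to read as ~25 mph in the finished slow-motion shot
    print(practical_speed_mph(25.0, capture_fps=1000.0))  # ~0.6 mph: a slow creep
```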
We had originally thought that the car-flipping moment would be the trickiest, requiring more extensive digi-double work than we ultimately needed. What saved us from uncanny valley hell was deciding early on that wed never need to show the entire stunt in one unbroken moment and could split it into several smaller, more manageable chunks. Added into the mix was Austins game-for-anything attitude, which meant that we never had to deal with any sort of digital face replacements.The most satisfying VFX scene for Sirrs was the Kid flipping over a car for sheer comic value. Abrams endured a great deal to make the scene work. Sirrs comments, If it wasnt bad enough that we forced Austin to run around the freezing Chinatown streets at night completely exposed to the elements, we then added insult to injury by running him down with a car.
  • CONQUERING THE STREAMING HIGH SEAS WITH VIKINGS: VALHALLA SEASON 3
    www.vfxvoice.com
    By TREVOR HOGGImages courtesy of MPC and Netflix.Setting sail on the third season of the spin-off series Vikings: Valhalla was MPC, which acted as the sole vendor, supported by Take 5 Productions in-house team. MPC facilities in Toronto, Mumbai and Bangalore provided 500 visual effects for eight episodes, with the main focus on creating CG environments that existed in the 11th century, in particular Syracuse, Constantinople, Greenland, Poland, Kattegat and Winchester Cathedral. Other significant challenges involved a collapsing cliff and a magic trick played upon villagers attempting to burn Freydis Eriksdotter (Frida Gustavsson) at the stake, instead finding their homes set ablaze.[MPC Ocean tool] gets you 80% of the way there. For the shots it wasnt working, we started figuring out, Do we tie all of these patches of the ocean? Do we create a new ocean? Do we need to add more detail or foam to help blend this into our environment? Having done water for 1,400 shots over the last two seasons, it was great to use a new tool for that and see what worked and didnt work. They had to fix and retool some things for us, if not by request, just by us breaking and using it for what they didnt think it was for.Ben Mossman, VFX Supervisor, MPCOne of the major new environments that had to be created for Season 3 was Constantinople.That was a big story point in Episode 308, where everybody thinks Freydis is a witch, and she starts lighting things on fire in the town, but in reality its the other guys who set them up, states Ben Mossman, VFX Supervisor at MPC. We did put a flame bar in front of her, but it had to be four feet out for safety so none of the actual wood caught fire. There is a fake faade of the little pile that shes attached to that we put out further. They had a flame bar that would go up when its supposed to happen, so we actually had real fire to cue it, but there wasnt much that could be done practically right around her. For the roof earlier in the sequence, we did light it on fire, but we had to add more [roof] and replace some of it. There was some practical smoke on set; however, for the rest of it, we had to add in for them to be able to run away and be included by it. It was about getting the timing down and rehearse when things are supposed to happen, like, Now, you cant see them. Now theyre running away. Because it was fire, there wasnt much else we could do! Its also fire during the daytime too, so its not like you get a big glow of interactive light. Its there in all of its brightness.It took us quite a bit to get there in terms of the timing of the animation, look, what elements do we want to see, how much of this cliff do we want to collapse, and how big are the pieces [of rock falling from the cliff onto the Viking ships]? We have water splashing the rigging and sails, and Vikings being ejected. There was a version where there were definitely more guys getting obliterated, and we doubled it back. One shot we did was full CG, but everything else had to tie back in with practical water.Ben Mossman, VFX Supervisor, MPCMost of the shot designing was determined by storyboards, with previs being done for Episode 303 to figure out how the digital extensions for Constantinople would look like with the plates captured in Croatia and where in the city to shoot. We took a model of Dubrovnik and did some fly-throughs in Unreal Engine to mock up what the extensions would be like or add an army to how see how that filled the space, Mossman remarks. 
One of our on-set guys, Adam Stewart, did a lot of our scanning and also had a background in animation and modeling. Occasionally, we would ask him to help us out with some previs and scouting. We were able to get an early scan of Croatia that we had purchased, so when we got there we were already able to drop cameras into a physically accurate location which was cool. That was helpful because even while in Croatia scouting, we were able to go back at the end of the day, adjust our cameras to what the directors were wanting to see, and show them a new version of some of their shots to see how they were feeling about it. The physical set for Kattegat is the same one from the original Vikings series on the History Channel, which took place 150 years earlier. Mossman notes, There were things that were legacy that were brought over, but otherwise new showrunner, stories and worlds, so we had a lot of room to depart from that.Modern-day boats had to be replaced with period-accurate vessels.No drastic design changes were made to the Viking boats over the course of the three seasons as the passage of time is measured in years, not decades. We were able to move around a lot of the boats between the various Vikings factions; however, there were different shields and flags, Mossman adds. Kattegat grows over the seasons, so we expanded the town and the same with London to give it little updates that people might notice. Characters are traveling all over the world, especially in Season 3 where they are in Italy, Turkey and Poland. There was always a big crop of environments that would come up that would change every season that we would have to figure out how to do and how extensive it was going to be. Syracuse and Constantinople were the two big new environments and were seen at a hero level. Mossman explains, Syracuse and Constantinople were historical cities that existed, so you wanted to try to figure out what they actually looked like a thousand years ago and still make it work for what we needed for the story. We looked at a lot of fortresses that were built in Spain, Turkey and the Middle East that had the same architecture as what would have been in Syracuse at the time. Constantinople was a huge a city, and pieces of it are still in Istanbul today. There were maps and lots of architectural examples from that time, which the Byzantines either took from Rome or created themselves.Bluescreen assisted in getting the proper scope for environments and the correct number of soldiers.Construction methodologies from the period were factored into the digital assets. In general, its not being machined or being created by advance construction techniques, Mossman remarks. Its all being done by hand with ropes and pullies. That goes with any of the environments that we built where you want this feel that these are trained people who are good at what they do, but there is unevenness in the bricks and not 100% precision in how everything fits together. The Syracuse and Constantinople assets were handled by MPC Bangalore. We have an amazing environments team there that we got to work with, and they were insanely fast! Work was shared all across the sites. A huge water component had to be accommodated by the pipeline. Mossman observes, The advantage that MPC Toronto had in not using the MPC Ocean tool much before was we broke it immediately, which was cool! It gets you 80% of the way there. For the shots it wasnt working, we started figuring out, Do we tie all of these patches of the ocean? 
Do we create a new ocean? Do we need to add more detail or foam to help blend this into our environment? Having done water for 1,400 shots over the last two seasons, it was great to use a new tool for that and see what worked and didnt work. They had to fix and retool some things for us, if not by request, just by us breaking and using it for what they didnt think it was for.There are 10 birds flying around that have a string tied to their legs with a parcel of fire on the end. The story point is Harald Sigurdsson [Leo Suter] gets these birds drunk to sedate them, so we had to figure out what does a drunk bird look like. Our technical animation department simulated the feathers on the birds as well as the string coming off them, and then effects would take over the rest of the string and fire so its interactively moving around. The fire conveniently does not quite go up the string until theyre out of the prison, but they end up lighting the whole prison on fire when going back to their nest.Ben Mossman, VFX Supervisor, MPCRocks being pushed off a cliff to destroy the invading Viking ships below in Episode 304 reads small on the page. It took us quite a bit to get there in terms of the timing of the animation, look, what elements do we want to see, how much of this cliff do we want to collapse and how big are the pieces? Mossman explains. We have water splashing the rigging and sails, and Vikings being ejected. There was a version where there were definitely more guys getting obliterated, and we doubled it back. One shot we did was full CG, but everything else had to tie back in with practical water. We had a real boat floating in this quarry lake that the set was built on, and for the collapse moment we added a second version that was already half in the water. The special effects team put in water cannons to shoot up with debris and water. It was a good blend to start with, but as the edit and look changed, we deviated further from that to get to the story that they wanted to tell.MPC Toronto broke the MPC Ocean tool when dealing with numerous shots featuring water.Traveling to England was not possible, so the art department built a floor, some pillars and a casket for the Winchester Cathedral funeral sequence while the remainder was digitally augmented. They wanted this dramatic lighting for the death of a character we have seen in the last two seasons and to fill the rest of the church with patrons to pay their respects, Mossman describes. That was a small set, so most of it was done with bluescreen. Crowds in the cities were given special attention. We had one of our mocap and animation leads in Toronto, Charlie DiLiberto, create these little vignettes of people talking and waving. That would be passed off to our crowd team working in Houdini in Bangalore, and they would incorporate it into the crowd system. If we saw that something was missing, another round of performance would be added.For Season 1, we did almost 1,000 shots, and this season was half that, but with the timelines and the complexity of the work and wanting it to look better with every season, it was just as hard to nail down some of these big environments, and we had a lot more effects-driven story points, like the rocks falling and the fire. With that comes more attention from everybody wanting it to look good and hit the beats that theyre after. 
It was honing in on that stuff and being able to execute it within the time that we had with our showrunner, producers and editors.Ben Mossman, VFX Supervisor, MPCThe birds flying around the fire consisted of only a couple of shots in Episode 307, but it was technically difficult to execute as several departments had to divide and conquer the sequence. Mossman says, There are 10 birds flying around that have a string tied to their legs with a parcel of fire on the end. The story point is Harald Sigurdsson [Leo Suter] gets these birds drunk to sedate them, so we had to figure out what does a drunk bird look like. Our technical animation department simulated the feathers on the birds as well as the string coming off them, and then effects would take over the rest of the string and fire so its interactively moving around. The fire conveniently does not quite go up the string until theyre out of the prison, but they end up lighting the whole prison on fire when going back to their nest.A dramatic moment in Episode 304 is when rocks are pushed off a cliff and sink an invading Viking ship.Buildings were constructed digitally, keeping in mind the tools and techniques used during the period being depicted.Landscapes were altered to give environments a more cinematic quality.Skies were replaced to make shots moodier.Clouds were among the atmospherics added to shots.Working on Vikings: Valhalla involves constructing familiar boats and creating new environments.Fewer shots and a shorter post-production period do not entirely reflect the effort compared to previous seasons. For Season 1, we did almost 1,000 shots, and this season was half that, but with the timelines and the complexity of the work and wanting it to look better with every season, it was just as hard to nail down some of these big environments, and we had a lot more effects-driven story points, like the rocks falling and the fire, Mossman states. With that comes more attention from everybody wanting it to look good and hit the beats theyre after. It was honing in on that stuff and being able to execute it within the time that we had with our showrunner, producers and editors.Working with the same client and creatives over the three seasons has streamlined the process. Mossman comments, We have been able to keep together a chunk of the team, so that helped to keep the consistency and shorthand, especially with animation and matte paintings. He says the combination of the new and the old keeps the work interesting. Its fun because we have these new things every season that we can create, and then there is the stuff we know how to do and didnt have to build new assets for. We can knock those shots out first and feel good about them while were scratching our heads over these giant environments and armies that are new to the season.
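Mossman's questions about the ocean work, whether to tie all of the patches of the ocean together or create a new ocean, hint at a common way of hiding seams: overlapping patches whose displacements are cross-faded toward their edges. The sketch below is a generic, hypothetical illustration of that idea and not the MPC Ocean tool; the patch interface (to_local, height) and the falloff margin are assumptions made for the example.

```python
def falloff(u, v, margin=0.15):
    """Weight = 1 in the patch interior, easing smoothly to 0 at the border.
    u and v are normalized patch coordinates in [0, 1]."""
    def edge(t):
        d = max(0.0, min(1.0, min(t, 1.0 - t) / margin))
        return d * d * (3.0 - 2.0 * d)  # smoothstep
    return edge(u) * edge(v)

def blended_height(x, y, patches):
    """Blend the displacement of every ocean patch covering world point (x, y).
    Each patch is assumed to expose to_local(x, y) -> (u, v) and height(x, y)."""
    total_w, total_h = 0.0, 0.0
    for patch in patches:
        u, v = patch.to_local(x, y)
        if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
            w = falloff(u, v)
            total_w += w
            total_h += w * patch.height(x, y)
    return total_h / total_w if total_w > 0.0 else 0.0
```

Because overlapping weights always sum to the full contribution at any covered point, adjacent patches cross-fade into one another instead of meeting at a hard seam, which is the practical goal behind tying patches together.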
  • PIXOMONDO PUTS THE DRAGON INTO HOUSE OF THE DRAGON SEASON 2
    www.vfxvoice.com
    By TREVOR HOGGImages courtesy of Pixomondo and HBO.Commencing with Season 2, Pixomondo has been associated with Game of Thrones, and this relationship carried over to the spin-off series House of the Dragon, culminating in a full-scale dragon battle referred to as the Dance of the Dragons. Overseeing the eight-episode contribution that consisted of 600 shots with 150 of them featuring the title creature was HBO Visual Effects Supervisor Dadi Einarsson, who was determined that the dragons be realistic and their distinct personalities and size differences were emphasized.The challenge wasnt just creating complex, organic creatures with muscles and loose skin reacting naturally; it was also about ensuring the audience could emotionally connect with them, states Sven Martin, VFX Supervisor at Pixomondo. This way, when the story demands, viewers can truly feel the loss of these big pets. From a visual standpoint, Dadi was committed to grounding the effects in real-world behavior. He avoided impossible camera movements and made sure to silhouette the dragons against bright skies, just as you would if filming them in the real world. This attention to detail helped seamlessly integrate the visual effects shots with the surrounding live-action photography, enhancing the realism of these awe-inspiring creatures.Actress Eve Best (Rhaenys Targaryen) sat on motorized buck to simulate riding a dragon. Final composite of Rhaenys Targaryen riding Meleys to battle, combining footage of the actress with a digital environment and dragon.[The Battle of Rooks Rest] scene features intricate aerial combat between three dragons while a massive army lays siege to a castle below. We worked closely with director Alan Taylor and [VFX Supervisor] Dadi Einarsson to visualize the scale of the battle and choreograph the dragon movements, ensuring the sequence aligned with Alans epic vision for the scene. This sequence required extensive techvis for the dragon-riding scenes and battles, which involved significant visual effects elements. This process helped plan the shoot and ensured smooth integration of CGI with live-action.Matt Perrin, Senior Visualization Supervisor, PixomondoWhen mapping out the major tasks, the most ambitious one was the Battle of Rooks Rest in Episode 204. One of our key priorities was ensuring the dragon rigs were efficient enough for the animators, allowing them to work fluidly even when handling scenes with three massive dragons, Martin remarks. We also knew that during aerial dragon battles, we would need to rely more heavily on CG fire rather than practical elements. Controlling fire simulations, especially at high speeds, over long distances, and with dragons spiraling through the air, while accounting for the effect of their wing flaps on the flames, was a critical focus early on. Additionally, we anticipated that the dynamic, agile camera work would require a fully CG replica of the original shooting location. To achieve this, we began by building the environment based on a LiDAR scan and aerial photography, ensuring a seamless integration of visual effects with the live-action footage.Plate photography of actor Tom Glynn-Carney in front of a bluescreen and a blue dummy head to frame where the dragon will be when added in post. Final VFX shot of King Aegon Targaryen preparing his dragon Sunfyre for battle.Only minor adjustments were made to Vhagar and Syrax, but extensive sculpture and texture work was required on Meleys, which was modeled by MPC in Season 1. 
As Meleys was featured more prominently in Season 2, often in bright daylight and undergoing several stages of wounds and damage, we needed to enhance her detail, Martin explains. Our art department used photo references of red lizards to overpaint images, ensuring large portions of these photos remained intact to avoid an overly painterly or artificial look. In fact, some of the perfect references were found in a local pet shop, where I stumbled upon a dark red lizard that matched our vision. Since Sunfyre was barely seen in Season 1, we created his model from scratch, investing considerable time in sculpting, texturing and shading to ensure he lived up to his title as the most beautiful dragon in Westeros, as described in the books. Throughout production, Ryan Condal [Executive Producer/ Showrunner] and Dadi Einarsson worked closely with us to fine-tune Sunfyres look, balancing his golden beauty without pushing too far into the realm of fantasy so he would still feel grounded in the scene alongside other dragons. Additionally, we introduced several new dragons in Season 2, including the new mysterious wild dragon, Tessarion [also known as the Blue Queen], a new baby dragon named Stormcloud, and a revival of the original baby dragons from Game of Thrones.Plate photography of soldiers and battlefield. Final VFX shot of dragon Meleys wreaking fiery havoc on the battlefield, with Rooks Rest Castle in the background.Several key sequences required previsualization with the focus primarily on the dragon flight scenes. Notable examples include Daemon and Caraxes journey to Harrenhal and Moondancers chase across the Crown Lands, remarks Matt Perrin, Senior Visualization Supervisor at Pixomondo. The most complex sequence we previsualized was the Battle of Rooks Rest. This scene features intricate aerial combat between three dragons while a massive army lays siege to a castle below. We worked closely with director Alan Taylor and Dadi Einarsson to visualize the scale of the battle and choreograph the dragon movements, ensuring the sequence aligned with Alans epic vision for the scene. This sequence required extensive techvis for the dragon-riding scenes and battles, which involved significant visual effects elements. This process helped plan the shoot and ensured smooth integration of CGI with live-action. The time spent designing and technically deconstructing the sequence for shooting proved worthwhile, as the final result closely matched the previsualized version.Playblast of Meleys, showing some of the controls from the dragon animation rig.Refined beyond typical levels was the previs animation for Caraxes and Moondancer. This animation helped define the motion of the cameras and dragon flights, informing the programming of the motion-control camera rig and the multi-axis gimbal buck rig, Perrin states. Guided by previs, the team was able to execute precise, repeatable movements, especially in action-heavy dragon fight scenes! This ensured that the final footage of these dragon sequences closely matched the vision established during previs, resulting in highly dynamic shots that seamlessly blend practical elements with CGI.[T]he previs animation for Caraxes and Moondancer] helped define the motion of the cameras and dragon flights, informing the programming of the motion-control camera rig and the multi-axis gimbal buck rig. Guided by previs, the team was able to execute precise, repeatable movements, especially in action-heavy dragon fight scenes! 
This ensured that the final footage of these dragon sequences closely matched the vision established during previs, resulting in highly dynamic shots that seamlessly blend practical elements with CGI.
Matt Perrin, Senior Visualization Supervisor, Pixomondo
Plate photography of actor Ewan Mitchell looking on at the destruction post-fight. Final VFX shot of Aemond Targaryen looking at the crash site of Sunfyre, with additional smoke and fire to underline the post-apocalyptic feel of the battle's aftermath.
A huge amount of interaction and sharing took place between previsualization, technical setup, virtual production and the post-production visual effects team. We built virtual replicas of the stage and rigs that we would use on the shoot, states James Thompson, Virtual Production Supervisor at Pixomondo. That meant that we could easily build and reconstruct the moves for the dragons and the cameras from the CG into the real world. For the virtual production, this data was fed through our real-time system to trigger the rigs, lighting and any other elements running on the LED wall on set. A lot of the elements we used on the LED panels came from the previs, like the shadows created from the wings of the dragons. The animation data from the previs was solved, interpreted and used to run the buck rig and the robot that the camera was mounted to. For post, if needed, we were able to provide data from the shoot, such as corner pin data to help with any re-positioning and tracking of plates back into the shots.
It is one thing to create a dragon but quite another to have to add a rider. A critical part of making the dragon riders appear believable was making sure that when we took the animation from previs and solved it to work on the set with the rigs, it was deconstructed as little as possible, Thompson reveals. We really needed to have the buck with the actor on it move as closely as possible to the character flying in the previs. Sometimes we had to dial back the speeds and range of movement quite a bit to fit within the mechanical limitations of the rigs. Getting this right would mean the actors' bodies would endure, as closely as possible, the real physical forces of the movements that would be impossible to act out accurately. Other techniques we employed to enhance believability were things like adding cue bloops on the panels for the actors so they would know when to act out a certain performance. We also used eyeline markers so they knew where to look at a given time based on the previs and the current move they were doing on the buck. This, combined with the various lighting effects that we added onto the LED walls, such as fire, helped sell all of this.
Grayscale model of the bridge to Dragonstone. Full CG environment of the bridge to Dragonstone.
A critical part of making the dragon riders appear believable was making sure that when we took the animation from previs and solved it to work on the set with the rigs, it was deconstructed as little as possible. We really needed to have the buck with the actor on it move as closely as possible to the character flying in the previs. Sometimes we had to dial back the speeds and range of movement quite a bit to fit within the mechanical limitations of the rigs. 
Getting this right would mean the actors bodies would endure, as close as possible, the real physical forces of the movements that would be impossible to act out accurately.James Thompson, Virtual Production Supervisor, PixomondoCarried over from Season 1 were the assets for Dragonstone and the Grand Sept. One significant change was the decision to shoot Dragonstone Island in a new location, a quarry that wasnt even near the sea, Martin notes. However, it provided a natural base for the castle and had a mountain behind it that we transformed into Dragonmont. Since the original location in Spain was no longer available, and with far more exterior scenes around the castle in Season 1, production chose a real location where we could digitally add missing elements, like the iconic zig-zag bridge and the crescent-shaped island with its gate. I appreciated this approach, as these new plates gave us the ideal foundation for seamlessly blending our CG extensions. Using a scan of the location, we modified the castle to fit its new setting and incorporated a set-build for the entrance, which hadnt been prominently featured before. The Grand Sept, originally a real-time asset used on the LED stage in Season 1, was also adaptable for the traditional Maya pipeline since it had been meticulously built. Driftmarks drydock was a 3D extension of the backlot set, complete with additional ships under repair. Meanwhile, The Eyrie and its surrounding alpine landscape were created using 3D geometry and concepts provided by the art department.Final VFX shot of Rhaenys Targaryen flying dragon Meleys across the sea to battle.Previsualization of Rhaenys Targaryen flying dragon Meleys towards battleFor shots of the dragon riders, the actors sat on a motorized buck and were filmed with a motion-control camera, both of which were driven by the motion data from the previs. Pixomondo devised a new system that allowed for live comps of those buck-plates over the previs on set so the filmmakers could get a better sense of what the final shot will look like while shooting.[T]he greatest challenge is ensuring the emotional depth isnt lost amidst the technical demands. The story of the riders and their dragons, their conflicts and the impact on them was always a central focus. For instance, when Meleys succumbs to her demise at the end of the sequence, we worked closely with [VFX Supervisor] Dadi [Einarsson] to perfect her performance, paying meticulous attention to the timing of her eye blinks and the subtle details as Vhagars jaws close around her. The intricate skin simulations and the depiction of blood channeling between her scales were the finishing touches, enhancing the emotional weight of the scene.Sven Martin, VFX Supervisor, PixomondoAmong the complex shots to execute were the numerous aerial ones featuring two to three dragons fighting over a fully CG battlefield. One particularly challenging shot involved Vhagar stomping through the battlefield in slow motion after a crash-landing, which caused an explosion of ash, dirt and smoke, Martin states. The original plate was shot in bright sunlight and needed to be transformed into a drastically different environment. This required integrating CG soldiers, interactive smoke, ground simulations with exploding dirt and crushed soldiers into the scene. The compositing team faced significant challenges with extensive rotoscoping and dimming the sunlight to achieve the desired effect. 
In such a technically complex sequence, the greatest challenge is ensuring the emotional depth isnt lost amidst the technical demands. The story of the riders and their dragons, their conflicts and the impact on them was always a central focus. For instance, when Meleys succumbs to her demise at the end of the sequence, we worked closely with Dadi to perfect her performance, paying meticulous attention to the timing of her eye blinks and the subtle details as Vhagars jaws close around her. The intricate skin simulations and the depiction of blood channeling between her scales were the finishing touches, enhancing the emotional weight of the scene.Bluescreen resembling a staircase allows for proper beach interaction between rider and dragon.Digital doubles were necessary to get the proper scope of the crowds.Devastation was digitally augmented to the pristine landscape.Adding further complexity to the aerial battles was having to simulate dragon fire. The rapid movements and high speeds of the dragons made it difficult to ensure that the fire behaved physically accurately, Martin remarks. The simulations needed to align perfectly with the real flamethrower elements captured by the special effects team. Additionally, we had to account for sparks when the fire hit the opposing dragon and integrate smoke to enhance the visual complexity. We opted for simulated wounds rather than pre-sculpted ones to allow for last-minute animation adjustments. Using Houdini, we simulated dragon claws ripping through the skin, causing inner fat and flesh to push out while blood gushed from the newly created wounds. For the flying scenes, we used digital wisps and clouds to enhance movement and parallax against the distant ground or ocean. Instead of traditional cloud renderings from Maya or Houdini, we rendered volumetric clouds and skies in Unreal Engine. This approach enabled us to make extremely fast adjustments to speed, position and lighting. Dadi Einarsson and HBO VFX Producer Thomas Horton and the rest of the HBO production team fostered a collaborative environment. Their deep understanding of our complex workflows and their appreciation for the dedication of everyone at Pixomondo made the process not just enjoyable, but a wonderful ride or better said, dance.
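One technical detail in Thompson's account is that previs motion had to be dialed back to fit the buck rig's mechanical limits before it could drive the hardware. The sketch below is a hypothetical simplification of that kind of step, not Pixomondo's actual solver; the axis names, the limit values and the scale-then-clamp strategy are assumptions made only to illustrate the idea.

```python
# Illustrative, symmetric travel and speed limits for a motion-base axis.
RIG_LIMITS = {
    "pitch": {"range": (-25.0, 25.0), "max_deg_per_frame": 1.5},
    "roll":  {"range": (-30.0, 30.0), "max_deg_per_frame": 2.0},
}

def fit_to_rig(samples, axis):
    """samples: list of per-frame angles (degrees) from the previs solve."""
    if not samples:
        return []
    limits = RIG_LIMITS[axis]
    lo, hi = limits["range"]
    max_step = limits["max_deg_per_frame"]

    # 1) Uniformly scale the whole curve down if it exceeds the travel range,
    #    so the shape of the previs motion is preserved.
    peak = max(abs(s) for s in samples) or 1.0
    scale = min(1.0, min(abs(lo), hi) / peak)
    scaled = [s * scale for s in samples]

    # 2) Limit frame-to-frame velocity ("dial back the speeds") so the motion
    #    base can physically follow the command stream.
    fitted = [scaled[0]]
    for target in scaled[1:]:
        step = max(-max_step, min(max_step, target - fitted[-1]))
        fitted.append(fitted[-1] + step)
    return fitted
```

Scaling first and clamping velocity second is one plausible way to keep the timing of the previs performance intact while respecting what the rig can physically do, which is the trade-off Thompson describes.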
  • TAKASHI YAMAZAKI ACHIEVES KAIJU-SIZE SUCCESS
    www.vfxvoice.com
    By TREVOR HOGGImages courtesy of Takashi Yamazaki, except where noted.Takashi Yamazaki and Stanley Kubrick are the only directors to have ever won an Oscar for Best Visual Effects.When people think about Japanese cinema, Akira Kurosawa and Hayao Miyazaki often get mentioned, but that is not the entire picture as renowned talent has emerged from younger generations, such as Hirokazu Kore-eda, Mamoru Hosoda, Makoto Shinkai and Takashi Miike. Another name to add to the list is Takashi Yamazaki, who accomplished a feat only achieved by Stanley Kubrick when he became only the second director to win an Academy Award for Best Visual Effects, and in the process reinvigorated a legendary kaiju [giant monster] franchise with Godzilla Minus One. What impressed him most was not being handed the golden statue but getting the opportunity to brush shoulders with his childhood idol. Receiving the Academy Award for Best Visual Effects was a great honor, but meeting Steven Spielberg at the Nominees Luncheon was perhaps an even more exciting moment, Yamazaki admits. It was a chance encounter with the God I had longed for since childhood.Previously, Yamazaki had established himself by adapting mangas, such as Parasyte and Always: Sunset on Third Street, with the sequel of the latter foreshadowing his feature film involvement with the King of Monsters, as he appears in an imagery scene. That scene was a short one, but it was just about as much as we could do with our technology and computing power we had. At that time, it was impossible to complete the visual effects for a two-hour Godzilla film with our capabilities. As time went by, we were able to process information that was incomparable to that time in terms of technology and computing power we had, so I thought I could finally create the Godzilla I envisioned and started this project. It was a good decision to wait until this happened and make the Godzilla I envisioned.Like the kaiju, manga are a cultural phenomenon. The best way to succeed as a creator in Japan is to become a manga artist. Therefore, the best talent is concentrated in manga. Furthermore, the ones who survive in the very tough competition are the ones who become known to the most people. There is no reason why the stories told by those at the top of the giant pyramid should not be interesting. Adapting a comic book into a film potentially requires the characters to be the comic book itself, which is difficult, Yamazaki says.To help define Godzillas look, Yamazaki and the animator spent time developing Godzillas walk in Godzilla Minus One. (Image courtesy of Toho Company)The science fiction genre is interesting in that it can create things that do not exist in this world. I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child.Takashi Yamazaki, Director,Godzilla Minus OneGrowing up in Matsumoto, Japan, Yamazaki had a childhood fascination with insects and crafts. I was surrounded by nature, so I collected insects and lizards and observed them. I was also a child who preferred drawing paper to toys and would request 100 sheets of drawing paper as a Christmas present. Neither of his parents had much to do with the arts. My father was good at drawing, and I remember that when I asked him to do something, he would do his best to draw me Ultraman or some other character. 
A cinematic turning point was getting the opportunity to watch the sci-fi classic by Steven Spielberg, Close Encounters of the Third Kind. What was shocking was the scene where the giant mothership flips over. With the visual effects before this, it took some effort to believe that it is real, but this was the first moment when I had the illusion that it is real.Four-year-old Takashi Yamazaki stands in front of Matsumoto Castle with his family.Takashi Yamazaki started off as a model maker for Shirogumi in 1986.Godzilla destroyed the Tokyo shopping district of Ginza. (Image courtesy of Toho Company)In 2009, Takashi Yamazaki directed the action romance drama Ballad: Na mo naki koi no uta, where a young boy prays for courage to a great Kawakami oak tree and finds himself transported to feudal Japan.A major reason that Godzilla Minus One won the Oscar for Best Visual Effects is that there were both practical and digital effects.Yamazaki became part of the Japanese film industry while studying film at Asagaya College of Art and Design. When I was at art school, many expos were being held in Japan, and Shirogumi, which was skilled in creating unique visuals, was producing visuals for many of the pavilions, Yamazaki explains. There was a part-time job offer for this, and I was able to join Shirogumi as a result of participating in it. Visual effects were led by TV commercials, which had a relatively large budget to work with. We were also trying to introduce the techniques we had tried in TV commercials into film. Around the time I made my debut as a director, CG became more readily available. At that time, it was very difficult to scan live-action parts in theatrical quality, so we even built a scanner in-house that was converted from an optical printer. The pathway to becoming a director began when there was a call for pitches within Shirogumi leading to the production of Juvenile [2000], which revolves around a tween having an extraterrestrial encounter. The president of the company showed the idea I submitted there to Producer Shuji Abe, who was the president of another company; he liked it and worked hard on it, leading to my debut film.Science fiction goes beyond spectacle. The science fiction genre is interesting in that it can create things that do not exist in this world, Yamazaki observes. I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child. With science fiction comes the need to digitally create what does not exist in reality. I decided to become a director because I wanted to make films with the type of visual effects I wanted to make in the first place. When I made my debut as a visual effects director, most Japanese films didnt have spaceships or robots in them. I think that having three jobs at the same time is economical because I can judge things quickly and write scripts with the final image in my mind, so there is no loss of time.Yamazaki has directed 20 feature films. You never know what will be a hit, so when I have an original story, I only base it on whether it excites me or not. Making a film means you have to live with the original story for a number of years, so if its not a good match, it becomes hard to get on with it. I simply ask for good actors to join the cast. I am basically a person who wants to do everything myself. 
When it comes to the staff, I try to ask for people who are at least more skilled than me, people who have talent that I can respect.In Japan, Godzilla represents both God and Monster, so Takashi Yamazaki wanted its movement to feel almost divine or God-like in Godzilla Minus One. (Image courtesy of Toho Company)International markets are rarely taken into consideration when approving film budgets in Japan. This is because for a long time it was said that Japanese films could not go mainstream even if they were released overseas, and that was probably true, Yamazaki states. It was a great surprise that Godzilla Minus One was seen by so many people overseas, and to put it bluntly, it was a joyful experience that opened up new possibilities for future film production in Japan. Hopefully, the budget will reflect that element. I guess well just have to build up our track record and prove that pouring big budgets into it is not a bad option. Stories scripted and directed by Yamazaki have ranged from siblings trying to learn about their grandfather who died as a kamikaze pilot in World War II in The Fighter Pilot, to contributing to the Space Battleship Yamato franchise where an interstellar crew attempt to locate a device to make a devastated Earth inhabitable again, to a forbidden book that can grant any wish but at the cost of a life-threatening ordeal in Ghost Book. The growing popularity of video games has not altered the essence of storytelling. Interesting stories are interesting in any media, and the core of stories that can be developed in various media continues to be influenced by stories that have been around for a long time.Back in the early digital age when Takashi Yamazaki was learning how to create visual effects.At the age of 10, Takashi Yamazaki ventures to downtown Matsumoto with his sister Satsuki.An extremely complex shot to design, create and execute is found in Godzilla Minus One, where a kamikaze pilot has to overcome survivor guilt in order to protect those he loves and Japan from the rampaging title character. The sea battle between Shinsei Maru, the mine disposal ship, and Godzilla was difficult because we had to combine a live-action small boat with CG waves and a giant Godzilla, Yamazaki reveals. The boat in the foreground is live-action, so it was a very time-consuming job to build the waves at a level that would blend in with it. Im glad it worked out.When asked what are the essential traits of a successful director and what has allowed him to have a long career, he responds, What it takes to be a successful film director is to keep everything interesting all the time, but I am not sure about the career. It would be bad if a film failed, so I think its easier to prolong my life if I get the next project off the ground before the next film is released. Yamazaki is appreciative of his good fortune. Thanks to the many people around the world who liked Godzilla Minus One. Godzilla Minus One has received many wonderful awards. I will continue to make films, treasuring the memories of the days I created with you all. Thank you very much. Arigato.The sea battle between the mine disposal ship Shinsei Maru and Godzilla was difficult because CG waves and Godzilla had to be integrated with the practical vessel.
  • VFX IN ASIA: BOOM TIME
    www.vfxvoice.com
    By CHRIS McGOWANMysterious creatures fall from space, prey on humans and use them as hosts in Parasyte: The Grey. The Netflix live-action sci-fi/horror series is based on the Hitoshi Iwaaki manga Parasyte. Dexter Studios provided the VFX. (Image courtesy of Dexter Studios and Netflix)The Asian VFX industry is experiencing a meteoric rise driven by a confluence of powerful forces, says Merzin Tavaria, Co-Founder and President, Global Production and Operations for DNEG. The region possesses a vast pool of highly skilled and technically adept VFX artists, a critical foundation for producing top-tier visual effects.Jay Seung Jaegal, VFX Supervisor for Seoul-based Dexter Studios, adds, I believe that the Asian region will become a new core base for the content and VFX industries in the future. As Asian VFX studios increasingly participate in global projects, their presence is expanding. Although they have already proven significant competitiveness and potential, I think there is still immense room for growth.Asia is playing an evolving role in shaping the global VFX ecosystem. Key regions and cities driving the growth of the Asian VFX industry include India, South Korea, Japan, China, Taiwan and Singapore, with Bangkok and Vietnam beginning to gain traction. Homegrown VFX studios like Dexter are on the rise, and multinational firms with VFX branches in Asia include DNEG, BOT VFX, Framestore, ILM, Digital Domain, Rotomaker Studios, Mackevision (Accenture Song), The Third Floor, Tau Films, Method Studios, MPC and Outpost VFX.South Korea has risen to become one of the most important Asian VFX hubs, and the trajectory of Dexter, founded in 2012, is one of the most impressive in South Korea. Jaegal says, As of the first half of 2024, the company has grown into a group with six subsidiaries connected. Dexters headquarters alone employs about 330 people, including around 200 VFX artists. Currently, Dexter Studios is active as a comprehensive content studio with an all-in-one system covering content planning, development, production, post-production, DI and sound. We are also expanding our business areas to new technology fields such as immersive content, AR/VR/XR and the metaverse.South Koreas Gulliver Studios handled the VFX for the Emmy-winning suspense/horror/survivalist series Squid Game. Season 2 is scheduled for December release. (Image courtesy of Gulliver Studios and Netflix)Along the way, Dexter has provided visual effects for several noteworthy films, including Parasite (2019), which captured four Academy Awards, including Best Picture. Jaegal comments, Parasite is a significant film in Korean cinema history as it was the first film to win an Academy Award. It marked a pivotal moment that showcased the excellence and prestige of Korean films to the world. Notably, Parasite is famous for its invisible VFX. Many people think that little VFX was used, but in reality, much of it was created after filming, including the two-story house of Mr. Park, the driving scenes and the neighborhood where Ki-Taeks family lives. Our company designed the VFX and DI [Digital Intermediate], and our subsidiary, Livetone, handled the sound, making us an all-rounder in post-production.Dexter also provided VFX for Space Sweepers (2021), which holds a special meaning as a Korean-style SF [sci-fi] film, Jaegal explains. It successfully [put together] a unique story involving space, the spaceship Victory and robots, which had not been commonly attempted in Korea at that time. 
We also handled all three post-production parts of this film. I think it redefined the standards for the space/SF genre that can be produced in Korea. Based on this experience, we [went on] to handle KSF-VFX for Netflixs JUNG_E, the Alienoid series, and The Moon. Recently, Dexter has worked on Knights of the Zodiac [with DNEG], YuYu Hakusho with Scanline VFX and Japans Digital Frontier, Gyeongseong Creature for Netflix and Parasyte: The Grey.A volcano erupts on the China/North Korea border and a special team tries to save lives and prevent further eruptions in Ashfall. (Image courtesy of CJ Entertainment and Netflix)Parasite, directed by Bong Joon-ho, won South Koreas first Academy Award for Best Picture in 2020. Dexter Studios provided the VFX for the class-conscious black comedy, most of the VFX invisible. (Image courtesy of Dexter Studios and CJ Entertainment)Gulliver Studios handled the VFX for Squid Game, winner of the 2022 Primetime Emmy Award for Outstanding Special Visual Effects in a Single Episode, which was among the six total Emmys garnered by the series. Squid Game VFX Supervisor Cheong Jai-hoon notes, After Squid Game was released on Netflix, it was gratifying and meaningful to see that viewers worldwide loved it, especially considering that they couldnt tell which parts were created with VFX.Gulliver Studios is a VFX company (also called C-Jes Gulliver Studios) established in 2019 in Goyang by C-Jes Studio. The latter manages actors, singers, and K-pop artists, and is involved in the planning and production of movies, dramas, and musicals, notes the firm. At the end of 2022, Gulliver Studios and C-Jes Studio merged to become a comprehensive content group that extends its scope from planning and producing theatrical films and OTT [Over-The-Top] content to post-production VFX.The first attempt to put Koreans on the moon ends in disaster, and a second try leaves one astronaut alone and stranded in space in The Moon, a survival drama about South Koreas manned flights. (Image courtesy of Dexter Studios and CJ ENM Studios)Looking at the growth of VFX in South Korea, Jai-hoon explains, Around 2015, there was a notable increase in the production of large-scale fantasy and action films within China, yet there werent many proficient VFX companies in China at the time. As a result, the majority of VFX work during that period was handled by Korean companies. As Korean VFX companies gained significant experience through working on various Chinese films, it led to substantial growth in the Korean VFX industry.As the volume of work in Korea increased exponentially, Korean VFX companies established satellite companies in countries like Vietnam and China, where labor costs were lower, and they also outsourced a significant portion of their work to India and Ukraine. As a result, the VFX industry in Asia experienced growth during this period, Jai-hoon remarks. By the late 2010s, the Chinese film industry faced a slowdown, which also halted the growth of the Korean VFX market. However, in the 2020s, the production of Asian content by platforms like Netflix and Disney+ revitalized the industry. Successes such as Squid Game and [prior to that] Bong Joon-hos Parasite [also] energized the global OTT production scene in Asia.DNEGs Indian crews contributed VFX to Godzilla x Kong: The New Empire. (Image courtesy of DNEG and Warner Bros. 
Pictures) Jai-hoon adds, Recently, there have been talks about Netflix increasing its investment in Korean content production and, following Disney+, even Amazon Prime is outsourcing a lot of work to Korean VFX companies. This signifies that the level of Korean VFX has already been recognized worldwide. Additionally, some global VFX companies like Scanline, Eyeline Studios and The Mill have recently entered the Korean market, gradually increasing their investment in Korean artists' potential. As a result, existing Korean VFX companies are building pipelines according to Hollywood's VFX pipeline and standard production processes, different from the Korean system. Also, Korean artists with experience from abroad are gradually returning to Korean VFX companies. Westworld VFX in Goyang, Korea, was established in 2019 and has about 200 employees. Westworld handled the VFX for the Netflix sci-fi series The Silent Sea, the first large-scale project in Korea to use virtual production and LED walls. Asked about Asia's VFX growth, Managing Director Koni Jung responds, It's difficult to say exactly, but the growth of young artists and the entry of global OTT platforms into Asia seem to be factors driving growth. [And] as Korean films and series achieve global success, an increasing number of overseas projects are being filmed and post-produced in Korea. Honestly, isn't it because it costs less than the North American market? Wooyoung Kim, Director of Global Business at Seoul-based VA Corporation, comments, As the investment of OTT platforms in the Asian market expanded during the pandemic, the budget for content rose significantly, and many content projects [were] planned that [could] expand expression in a technical direction. This led to successful outcomes for VFX companies in each country, allowing them to showcase the technical skills that they may not have had in their home markets. VA successfully launched a Netflix series called Goodbye Earth, participated in Netflix Japan's project YuYu Hakusho in 2023 and is working on the movie New Generation War: Reawakened Man. In India, DNEG has teams of thousands of talented artists spread across 10 locations (including Chennai, Bangalore, Mumbai and Chandigarh), encompassing both DNEG and ReDefine studios, according to DNEG's Tavaria. This strategic network allows for seamless collaboration with our Western counterparts on every DNEG and ReDefine film and episodic VFX and animation project. We're incredibly proud of the vital role that India plays in DNEG's global success. Tavaria continues, Our talented Indian teams play a pivotal role in all our top-tier international projects, from feature films to episodic television series. Just to name a few, our Indian teams have recently brought their magic to Dune: Part Two, Furiosa: A Mad Max Saga and The Garfield Movie, among others, showcasing their versatility across genres. Their expertise has also been instrumental in projects like Oppenheimer, NYAD, Masters of the Air, Ghostbusters: Frozen Empire, Godzilla x Kong: The New Empire, Borderlands and many others. Tavaria notes, Many Asian governments are actively nurturing the industry's growth. Take India's AVGC Promotion Task Force, for example. This initiative recognizes the significant contribution VFX makes to the economy and aims to propel India further onto the global stage. 
By establishing a national curriculum focused on VFX skills development, the Task Force paves the way for India to produce even more world-class content. Larger Asian studios are staying ahead of the curve by rapidly embracing cutting-edge technologies. This ensures their VFX offerings and capabilities remain at the forefront of the global landscape. Tavaria says, This confluence of a skilled workforce and a commitment to technological innovation has solidified Asia's position as a major player in the ever-evolving world of VFX. About DNEG, Tavaria comments, We're continually working hard to refine our global pipeline to open the door to a new era of creative collaboration across our locations. This allows our Western studios and Asian teams to work seamlessly together to push the boundaries of what's possible in VFX. Streaming has been another factor in the Asian VFX rise. Tavaria explains, The rise of streaming, along with a flourishing domestic film market, has fueled a surge in high-quality content, presenting a thrilling opportunity for the Asian VFX industry. This explosion of content demands ever more exceptional visual effects for Asian audiences that are hungry for stories that reflect their own cultures and aesthetics. The Masters of the Air wartime miniseries included VFX by DNEG's Indian visual artists. (Image courtesy of DNEG, Amblin Television and Apple Studios) Extraordinary Attorney Woo follows an autistic lawyer with a photographic memory and a great love of whales. Westworld VFX contributed effects to the Netflix series. (Image courtesy of Westworld VFX and Netflix) South Korea's Dexter Studios co-produced and helmed the VFX for Jung_E, a near-future post-apocalyptic story about a warring faction on a desolate Earth that attempts to clone the brain of a legendary warrior to develop an AI mercenary and stop a civil war. (Image courtesy of Dexter Studios and Netflix) Space Sweepers was South Korea's first big-scale sci-fi film, proving the country could handle the genre. (Image courtesy of Dexter Studios and Netflix) DNEG's Indian teams worked on VFX for Ghostbusters: Frozen Empire. (Image courtesy of DNEG and Columbia Pictures/Sony) Westworld VFX contributed effects to the Netflix series Black Knight, a South Korean television series based on a webtoon, a digital comic book read on smartphones in Korea. (Image courtesy of Westworld VFX and Netflix) BOT VFX has four locations in India (Chennai, Coimbatore, Pune and Hyderabad) and one in Atlanta. Our total team size is 800, says BOT VFX CEO and Founder Hitesh Shah. The firm has been working on many high-profile projects, including Kingdom of the Planet of the Apes, Fallout, 3 Body Problem, Shōgun, Monarch: Legacy of Monsters, Knuckles, Civil War, The Boys 4, Furiosa: A Mad Max Saga and Bad Boys: Ride or Die. The size of the talent pool has been growing significantly in India thanks to nearly two decades of RPM [roto, paint and matchmove] work that motivated many new entrants to join the industry. According to Shah, What was a pool of several hundred artists in 2005 is in the tens of thousands today. Also, there is now a large base of training institutions that continually feed new talent into the ecosystem. From the large pool of talent, a portion has had the skills and the ambition to fill highly specialized and technical roles required to build full-service facilities. About the move to provide full-service VFX for Western clients, Shah comments, That shift is coming from three segments. 
First, those facilities that have historically provided full-service VFX to the Indian domestic market are turning part of their attention to Western productions. Second, independent facilities that have historically been point-services providers (for example, RPM: roto, paint and matchmove) are shifting towards full-service VFX. Finally, even large multinational VFX companies that have set up a footprint in India initially for point-services support [are] leveraging more Indian talent towards the full VFX value chain. Shah states, For India specifically, the growth of the VFX industry is driven by the strong value proposition it offers to Western productions in the form of three compelling components: a strong cost advantage, a large talent pool and a broadly English-speaking talent pool that has an affinity with Western content. He adds, Despite strong tax incentives in other global regions and trends toward rising talent costs in India, the overall cost advantage of India is still compelling. It seems most Western productions implicitly, or sometimes explicitly, expect their VFX vendors to bake the lower costs of getting RPM work done in India into their bids. Finally, the affinity to Western content and English has had a subtle but notable impact on VFX growth in India. Many young artists are bi-cultural and equally motivated to work on Western content as on Indian domestic films. There is a swifter cross-pollination and travel between India-based artists and team members in Canada, the U.S., the U.K. and Australia. In addition, VFX has gained prominence in Indian content, both streaming and theatrical. The availability and affordability of VFX in the content creators' toolbox have opened up whole new genres and the ability to tell epic Indian tales that were out of budget reach previously, Shah says. Keep in mind that India is not just one monolithic content market, but multiple mature regional markets with their own vibrant histories of storytelling, all of which have taken a fondness towards what VFX can enable. The fact that the Indian VFX market is well poised to serve both Western content and the expanding Indian domestic content gives it a firm footing in the global VFX ecosystem. Peter Riel is Owner and CEO of Basecamp VFX, a small studio founded in 1996 and based in Kuala Lumpur, Malaysia. He argues that it's important to understand how the SEA (Southeast Asian) market works. Riel says, Each nation here is quite sovereign both in terms of language and culture. While it's easy to quickly take a glance and see the various countries' similarities, it's a mistake to think they all share the same cultural sentiments that perhaps the Western world does to films made in the U.S. As an example, one would expect Malaysian movies to be popular in Indonesia and vice-versa, due to their similar language and culture, but that's far from the case. I do still think there is tremendous value to be found in SEA VFX. The artists here are extremely dedicated and are used to working fast and efficiently. Kraken Kollective CEO Dayne Cowan has over 30 years of experience in the VFX business, has worked with DNEG and other major VFX firms, managed VHQ Media's feature film divisions in Singapore, Kuala Lumpur and Beijing, and worked for Amazon Studios South East Asia as the Post Production Lead for Local Original Movies. Earlier this year, Cowan founded a new VFX firm in Singapore. Kraken Kollective is a next-generation post-production and visual effects management company, he says. 
It leverages the cost and skill benefits of Asia for foreign productions that would like to have the work done here but are unfamiliar with business environments, cultures and capabilities locally. Asia is a massive diverse region with so many countries. Working here may appear straightforward, but there are many unseen challenges that we help to navigate. The skill of the talent [here] continues to grow and develop, almost at an exponential rate. When combined with technology advancements like generative AI and the sheer size of the talent pool here, it represents a serious value add.Cowan comments, Parts of Asia have long been known for handling the entry-level work like roto, matchmove, paint work. As new technology changes the shape of things, I am seeing smaller companies emerge with stronger, specialized skill sets. I think people forget that nearly 4.5 billion people live in Asia, and with a rapidly developing talent base, it will play a huge role going forward in production and post-production. Broadly speaking, I believe we are looking at a boom time for Asian VFX.Dexter Studios was in charge of VFX for Ashfall, a 2019 South Korean disaster film. Dexter co-produced and distributed with CJ ENM Studios.Dexter Studios handled the VFX for Space Sweepers, a 2021 South Korean space western film regarded as the first Korean space blockbuster.Dexter Studios supplied VFX fuel for The Moon, a 2023 South Korean space survival drama film written, co-produced and directed by Kim Yong-hwa. (Image courtesy of Dexter Studios and CJ ENM Studios)
  • ARTISTIC ALCHEMY: THE PERSONAL CREATIONS OF VFX TALENT
    www.vfxvoice.com
    By TREVOR HOGGBanana Slug Vase (Image courtesy of Liz Bernard) In essence, an alchemist can transform matter into something else, which, oddly enough, describes the creative process where pieces of paper, canvas, clay or a blank computer screen are turned into works of art by combining and applying different materials guided by human imagination. In the world of visual effects, digital toolsets reign supreme, but that does not mean that traditional mediums of oil paint, pottery or watercolor have been tossed to the wayside. Outside of office hours, private collections are being assembled that, in some cases, have entered into the public consciousness through art exhibitions and published children's books. To showcase the sheer breadth of artistic ingenuity, seven individuals have been curated, each of whom demonstrates a unique perspective and talent, which we have the privilege to share with you. Liz Bernard, Senior Animation Supervisor, Digital Domain. Art has been a part of the life of Liz Bernard ever since her graphic designer parents placed an X-Acto knife in her hands as a child. The creative inclinations have culminated in a career that has seen her animate the Alien Queen in Ender's Game, video game characters in Free Guy and a lawyer with gamma-ray issues for She-Hulk: Attorney at Law. A major source of inspiration is a deep love for nature, which Bernard draws upon when producing each piece of ceramic, either through the art of wheel throwing or utilizing flaming trash cans. Parker Ridge (Image courtesy of Zoe Cranley) I took a day off because there is a workshop that only happens a couple of times per year at a local community arts center where you can do an alternative firing called Raku, which originated in Japan, Bernard explains. The idea is that you fire things in a kiln. While they're still yellow hot, you open the kiln up, reach in with tongs and quickly take your pottery over to a prepared nest of newspaper situated in a sandpit; it instantly catches on fire, and you up-end a miniature metal trash can, which has even more combustibles, over your piece to create a reduction atmosphere. You get these crazy metallic reds and coppers, beautiful colors that are hard to achieve with other firing techniques. It's an unpredictable, chaotic, elemental experience. I find that my animation background influences me heavily because I'm always wanting to find an interesting pose for something, Bernard notes. You can do a straight-on profile of an eagle or find something that has more personality to it. I love finding those personalities in animals, and I try to put that into my work. There is a lot of experimentation. One of my favorite things to do right now is called Sgraffito, where I formed a piece of clay into a bowl, painted the entire interior surface in black and scraped away the lighter parts. What I've been doing with these particular pieces is begin with a footprint of a local animal, like a heron, and then use the negative space to start drawing in random shapes. A different aspect of the brain gets creatively stimulated. The reason I like this so much is because it's so tactile and real. The images we make in the computer, you can't interact with using your hands. This is a nice counterpoint to what I do daily. 
Visit: www.lizupclose.com Venetian Caprice (Image courtesy of Andrew Whitehurst) Balduin Owl (Image courtesy of Sigri de Vries) Black Cats (Image courtesy of Sigri de Vries) Aragon at Christmas (Image courtesy of Andrew Whitehurst) Zoe Cranley, Head of CG, beloFX. Major franchises such as Jason Bourne, MonsterVerse, The Hunger Games, Wonder Woman and Mission: Impossible appear on the CV of Zoe Cranley, who has transitioned from being a digital artist to a CG supervisor to a more managerial role. Throughout all of this, the passion for oil painting has remained and led to an exhibition at the Seymour Art Gallery in Vancouver showcasing landscapes transformed into geometric shapes and blurred lines. It's being in them, Cranley observes. You can paint or draw anything you want. I used to do a lot of still life and flowers, which look pretty, but they don't mean anything to you. Landscapes are so epic, and generally most of the paintings I've done, I've been there, so I'm drawn back to them and can remember that exact moment. Being in Vancouver, beautiful landscapes are abundant wherever you go. Unlike visual effects, the goal is not to achieve photorealism. When you look at a picture that is real, I don't have that desire to keep looking at it because you go, Oh, yeah. That's what it looks like. I love it when people recognize not instantly what it is, but then have an attachment. I feel like I've done what I had set out to do, which is to capture the essence of that place in an abstract way. The Faun (Image courtesy of Mariana Gorbea) I've been using oils for at least 20 years and won't go back to anything, states Cranley, who is not a fan of digital painting. There is something so magical about putting a paintbrush to a canvas. I like that it takes so long to dry and is so malleable for so long. You can do so many different things to it based on the stage of drying. Also, I like the science of the various solvents that you can use. So much of it is the fundamentals of design, color, negative space and composition. Generally, the meaning to me is what makes a nice picture. The quality of the work and brushstrokes have improved. I've gotten a lot more critical and precise. The edges are neater, and I have learned to varnish properly. I have refined the process. A lot of people have said that I've gotten more abstract. Last year, I learned how to digitize everything, which was a whole process in itself. Visit: https://www.zoecranley.art Sigri de Vries, Compositor, Important Looking Pirates. There is no shortage of high-profile projects to work on, whether it be Shōgun, Avatar: The Last Airbender, Ahsoka or Foundation, but this has not stopped Sigri de Vries from embarking on a journey to discover her medium of choice. Along the way, she was hired to create the cover and 12 illustrations for the children's book Balduin, die Entdecker-Fledermaus by Bianca Engel. I was expecting more kickbacks and having to re-do things and such; instead, I was given a lot of freedom with how I wanted to do the illustrations and what parts of the story I wanted to paint, de Vries states. I started with a few sketches of the various characters, and once I got the green light on those, the rest of the illustrations were smooth sailing. Ethereal Cathedral (Image courtesy of Marc Simonetti) Pink Kits (Image courtesy of Zoe Cranley) Experimentation is the only constant. I always start with a sketch, de Vries remarks. I erase the sketch so you can almost not see the lines and then do the watercolor and a pen on top. 
I found that to be what I like aesthetically, but I'm still at the beginning of this journey where I'm experimenting a lot and looking at YouTube videos for inspiration and techniques. I follow a number of artists on the Internet and want to do what they do. I want to try everything. I've done watercolor, clay sculpting, digital art, acrylic and ink. It's my hobby, so I'm just having fun! Initially, the plan was to learn digital painting to do concept art. I did a lot of landscapes and played around with compositions. I also did a lot of matte paintings at work, but matte paintings are more photo collaging than painting. As my journey progressed, I got interested in characters and creating them in a cute illustrative style. Phil Tippett Portrait (Image courtesy of Adam Howard) Deathly Silence (Image courtesy of Mariana Gorbea) When I finally had enough money to buy an iPad, I switched from Photoshop to Procreate, de Vries states. Since then, I've been painting so much more. Procreate is so easy and intuitive, and I can paint and draw directly on the screen, which I love. What a lot of artists do is paint with an actual brush on paper, scan that and use it as a texture for a brush in Procreate. My next big project is a scanner/printer so I can do that stuff as well because it sounds fun to make your own brushes. Visit: https://www.artstation.com/sigri Mariana Gorbea, Senior Modeller, Scanline VFX. Modeling creatures and characters is something that Mariana Gorbea does on a daily basis for Scanline on projects such as Game of Thrones, X-Men: Dark Phoenix or Terminator: Dark Fate, but that all occurs within the digital realm. This expertise has also been taken into the physical world, where clay is literally shaped and transformed into figments of her imagination. I started with ZBrush and then moved onto clay, Gorbea states. The biggest difference is that you have to be mindful of what you're doing with clay because if you mess up, those hours cannot be taken back. Lessons have been learned from working with clay. It has made me observe more of the whole picture, to be more careful with details, composition and how a sculpture looks from all angles; that has helped me to make better sculptures in ZBrush. The tools I use with clay, I try to replicate in ZBrush and vice versa. Gorbea is attracted to crafting fantastical creatures. Mexican culture is fascinated with death, and some artists can turn dark things into something beautiful. I'm drawn to that, and that's why I try to sculpt creatures and characters. Designs are simplified for clay. Building armatures is the hardest and trickiest part with clay. It has to be able to stand. You have to be familiar with skeletons. For example, if I'm making a horse, I'm looking at horse anatomy, how the bones are built and proportions. I build the armature first because if that is not done properly, it's not going to work. Three types of clay are utilized: oil-based, polymer and water-based. All of them are quite different, so I have to think about how I'm going to make a structure and base for it, Gorbea remarks. Water-based clay dries quickly, and I use it to make bigger sculptures that have fewer details. With polymer or oil-based clay, you get to spend more time with it and put in more detail; I use them for smaller sculptures. The sculptures are usually made of several pieces, and I create layers of detail. Depending on the size, a sculpture can take five to 10 hours. The hardest part of making a sculpture is to give it personality and convey emotion. 
If the face, especially the eyes, don't work, then the sculpture is not going to work. Visit: https://www.instagram.com/margo_sculptures/ Adam Howard, Visual Effects Supervisor. Interwoven with the space exploration universe envisioned by Gene Roddenberry, Adam Howard has earned Primetime Emmys for Star Trek: Voyager and Star Trek: The Next Generation as well as nominations for Star Trek: Deep Space Nine and Star Trek: Enterprise. However, his artistic eye has gone beyond the realm of Federation and Klingon starships as he paints with light to produce character studies of friends, colleagues and celebrities. Aftermath (Image courtesy of Marc Simonetti) The human face is a never-ending source of wonderful detail and surprise, Howard explains. Based on a photograph, I start with a detailed pencil outline that determines the overall shape of the face. Within that outline, I also mark out areas for shadow and highlights. I paint masks for each major area: face, eyes, ears, neck, hair, beards and clothing. Once each area has a clean mask, then I start the actual painting. First come base colors and areas of shadow and highlight, followed by middle-ground detail and eventually finer detail. I paint in digital oils because I love being able to blend my paint to help give subtle form to each area. I also love the fact that by painting on my iPad, I can paint anywhere. I am not restricted to a physical studio or materials. Sleeping Beauty (Image courtesy of Marc Simonetti) Buying Pane Cunzato in Trapani (Image courtesy of Andrew Whitehurst) Tidal Raku Vase (Image courtesy of Liz Bernard) Slip Trailed Box (Image courtesy of Liz Bernard) Howard begins each portrait by painting the eyes. Eyes truly are the window to the soul, and I try to capture the real essence of each subject by painting the fine detail and shape of eyes. Sometimes, it can be a really tiny detail like a single point of highlight on an eyelid that makes the person feel real. I love those moments when the face pops off the page at me as the person I am painting. Depending on the portrait, I sometimes work in additional detail over the final painting from the original pencil outline. This can assist in deepening lines around the eyes and accentuating hair detail. I used to do colored pencil and ink portraits on a plain white background. The backgrounds have become more detailed. This plays a big part in portraits, like my paintings of Ve Neill and Steven Spielberg, where so many films they have created are represented in the background. Sometimes, the backgrounds take longer to paint than the person. Visit: www.adamhoward.art Marc Simonetti, Art Director, DNEG. Initially trained as an engineer, Marc Simonetti decided to become a concept artist and has made contributions to Valerian and the City of a Thousand Planets, Aladdin and Transformers: Rise of the Beasts. He has also illustrated Discworld by Terry Pratchett, Royal Assassin (The Farseer Trilogy) by Robin Hobb and The Name of the Wind by Patrick Rothfuss. When I started my career, the only job available was for book covers in sci-fi or fantasy, Simonetti notes. I grew up with that trend. Maybe I would have had a completely different style if I had tried fine art first. But that's life. Sometimes, I start with watercolors or pastels, but that is rare because we have to be fast, Simonetti remarks. The only thing that I try to do all of the time is to change my process because I need to have fresh options. If I stick to something then my picture will always look the same. 
At the same time, its trying to be as honest as possible. Most of the time, I start with pencil and paper because its the easiest one. Once the composition is set in my mind, there is an app, Substance 3D Modeler, that allows you to sculpt in VR, which is a lot like ZBrush. I use my Meta Quest headset to scout everything. I can put lighting on the model and find different cameras. I also can create a library by sculpting a tower or window that are used later on. Once I have the model, I can use Octane, Cinema 4D, Blender or Unreal Engine. Then I render and paint it in Procreate or Photoshop.Sketches are conceptualized without references. I want to be as free as possible to set up a good composition, Simonetti states. However, when I need to fill the composition with elements, I try to have lots of references whether its architecture or anatomy. Everything has to be grounded. Even when Im making an alien, it has to be believable. Same thing with architecture. I want people to connect with it. If you dont have any reference for the scale, it takes people out of the painting. Lighting is critical. When Im using 3D, its a huge help. Im trying so many different lighting scenarios to fit the correct mood and to be as impactful as possible. Visit: https://art.marcsimonetti.com/Oyster Shell Bowl: Eagle Talon (Image courtesy of Liz Bernard)Andrew Whitehurst, VFX Supervisor, ILMGiven that Andrew Whitehurst studied Fine Arts before becoming an Oscar-winner for Ex Machina, his belief that music, pictures, lunch and ball sports are the greatest achievements of humanity is not entirely surprising. The enjoyment of studying faces and drawing caricatures has come in handy. If I know that were doing a digital face for someone, literally the first thing that I will do is type an actors name + caricatures and search the Internet, Whitehurst reveals. If there are loads of good caricatures then its going to be an easier job because something is capturable about their likeness. If there arent that many good caricatures then its going to be much harder. There arent many good caricatures of Harrison Ford, and it was hard.Al Pacino Portrait (Image courtesy of Adam Howard)Dragon (Image courtesy of Sigri de Vries)Ringwraith (Image courtesy of Mariana Gorbea)There is an interplay between the way that I paint and what I understand about the world, which I have gleaned from doing visual effects for a long time, Whitehurst notes. Im always trying to make something psychologically interesting. I love abstract art, but Im not good at doing it. I started doing a lot of landscape paintings, and I discovered what painting is to me; its a way for me to meditatively spend time somewhere I find special or engaging in some way; and to have the opportunity to think about it, enjoy it, and try to capture something of it, but in a reflective way.If Im going on location or holiday, I have a sketchbook with me, Whitehurst remarks. I will do black-and-white pen-and-ink drawings. Some of them I will scan and add color in Procreate later if I feel like it. The drawings tend to be a more immediate reaction to a place and have more of a comic book style because that is generally how I draw. I like to exaggerate and use a little bit of imagination. The paintings consist of casein on wooden panels. 
Casein has the advantage over gouache because when its properly dry, it doesnt reactivate as easily, so you can paint over the top of it, and its slightly creamier in texture, so its a little bit like oil paint but is water soluble and dries quickly. I would paint in oil but for the fact I cant have my house stinking of turpentine!Contact: @andrewrjw on Cara and Instagram.
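For readers who want to try the 3D blocking stage Simonetti describes above (assembling a scene, then auditioning many lighting scenarios before committing to a paint-over), that loop can be sketched in a few lines of Blender's Python API. This is an illustrative sketch only, not a description of Simonetti's actual setup: he also works in Octane, Cinema 4D and Unreal Engine, and the mood presets, light values and output paths below are invented for the example. It assumes an open .blend file that already contains a camera and a subject near the world origin.

```python
# Hypothetical Blender Python sketch: render the same scene under a handful of
# lighting "moods" so the variations can be compared before any paint-over.
# Assumes a saved .blend file with a camera and a subject near the origin;
# the preset names, light values and output folder are invented for this example.
import math
import bpy

MOOD_PRESETS = {
    "warm_dusk": {"color": (1.0, 0.6, 0.35), "energy": 800.0, "height": 3.0},
    "cold_moon": {"color": (0.55, 0.65, 1.0), "energy": 250.0, "height": 8.0},
    "hard_noon": {"color": (1.0, 1.0, 0.95), "energy": 1500.0, "height": 12.0},
}

# An empty at the origin gives every key light something to aim at.
target = bpy.data.objects.new("light_target", None)
bpy.context.collection.objects.link(target)

def add_key_light(name, color, energy, height, distance=8.0):
    """Create one area light on a rough three-quarter angle, aimed at the target."""
    light_data = bpy.data.lights.new(name=name, type="AREA")
    light_data.color = color
    light_data.energy = energy
    light_obj = bpy.data.objects.new(name, light_data)
    bpy.context.collection.objects.link(light_obj)
    light_obj.location = (distance * math.cos(math.radians(45)),
                          -distance * math.sin(math.radians(45)),
                          height)
    track = light_obj.constraints.new(type="TRACK_TO")
    track.target = target
    track.track_axis = "TRACK_NEGATIVE_Z"  # lights emit along their -Z axis
    track.up_axis = "UP_Y"
    return light_obj

for mood, preset in MOOD_PRESETS.items():
    light = add_key_light(f"key_{mood}", preset["color"], preset["energy"], preset["height"])
    # "//" makes the output path relative to the .blend file.
    bpy.context.scene.render.filepath = f"//lighting_tests/{mood}.png"
    bpy.ops.render.render(write_still=True)
    # Remove this preset's light so the next one starts from a clean slate.
    bpy.data.objects.remove(light, do_unlink=True)
```

Each preset writes its own still into a lighting_tests folder next to the .blend file, so the moods can be compared side by side before a final paint-over in Procreate or Photoshop.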
  • CATCHING A RIDE ON THE VFX VENDOR-GO-ROUND
    www.vfxvoice.com
    By TREVOR HOGGThe foundation for shows such as Vikings: Valhalla are previous collaborations that enabled visual effects supervisors and producers to deliver shots within budget and on schedule. (Image courtesy of Netflix)Compared to the 1990s when only a few visual effects companies were capable of doing film-quality work, the number of options has exploded around the world. It is fueled by the ability to achieve photorealism and the growing acceptance of CGI as a filmmaking tool. As a consequence, the success or failure of a project is often dependent upon hiring the correct group of supervisors, producers, digital artists and technicians either through a single or several vendors who aim to achieve the same goal. In some ways, the vendor selection process has remained the same, but in other areas it has become sophisticated, reflecting the maturity of the visual effects industry as it travels further down the pathway once traveled by the crafts of cinematography, editing, production design and costume design to become entrenched members of the entertainment establishment.Vendor connections begin with the production companies, studio visual effects executives or visual effects producers and supervisors. On the studio side, we break down a script; we are typically the first ones, and we tend to do this before a director is hired, states Kathy Chasen-Hay, Senior Vice President of Visual Effects at Skydance Productions. We work closely with a line producer to discuss shoot methodology, then well send the breakdown out to four or five trusted visual effects companies. We pick these vendors based on their specialties, shot count and the size of the budget. Finding and hiring vendors is a group effort. The VFX studio executives work closely with the visual effects teams when picking vendors. Since studio visual effects executives work with tons of vendors, we know and trust certain vendors. Did that vendor deliver on time? Was the work stellar? Did we get change-ordered to death? Relationships are key, and several VFX supervisors have built relationships with vendor supervisors, and its important to support these relationships; after all, they are the ones in the trenches, day after day. Agents are typically not involved. Relationships are built on past projects. Successful vendors have someone at the top who communicates with studios, production companies and the visual effects producers. We trust these folks as we have worked with them on prior projects. Its all about previous projects.Established relationships are favored given the difficult nature of delivering finished shots within budget and on time. Depending on the type of work required in the visual effects breakdown, the visual effects production team would work together with their production company and/or studio to start understanding how many vendors may be needed and which ones have the capacity and capabilities to handle that type of work in the given timeframe, explains Jason Sperling, Creative Director/VFX Supervisor at Digikore Studios. This can help narrow the list of possible vendor candidates significantly, and at that point, visual effects production teams begin the specific task of reviewing vendor demos and sample bidding numbers and expressing the creative and logistical expectations. 
If individual artists are needed for an in-house visual effects production team, they begin to assemble and reach out to their known available freelance crew or other resource lists.The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project, states Neishaw Ali, CEO and Executive Producer at Spin VFX. While sometimes studio preferences might dictate the choice, more commonly, the decision-making is often led by the VFX supervisor and the VFX producer. These key figures play crucial roles due to their expertise and their understanding of the projects technical and artistic requirements. The visual effects supervisor is primarily responsible for ensuring that all visual effects are seamlessly integrated into the live-action footage and align with the directors vision. Meanwhile, the visual effects producer manages the budget, scheduling and logistics of the visual effects work, coordinating between the studio and the creative teams. Their collaboration is essential in choosing the right visual effects team[s] that can deliver high-quality results within the constraints of the projects timeline and budget.Scale and budget have an impact on the audition and hiring process. For independent films, I found theres more flexibility, while the big studio productions may have predetermined criteria or preferences, notes Julian Parry, VFX Supervisor and VFX Producer. Producers and directors typically seek out talent based on their track record and previous work. Artists or visual effects houses with impressive portfolios are usually approached for potential collaborations. Its not uncommon when vetting a visual effects vendor that the artists are promoted in the pitching. Breakdown reels, showcasing previous work and expertise play a significant role in the hiring process. Producers and directors look for visual effects houses or artists whose style and capabilities match the projects needs and offer a detailed insight into their experience in specific disciplines, such as creating monsters, which can be crucial for achieving desired visual results.Producers and directors look for vendors and artists whose style and capabilities match the projects needs, like for The Wheel of Time. (Image courtesy of Prime Video)Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process for shows like Fallout. (Image courtesy of Prime Video)The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project such as Asteroid Hunters.(Image courtesy of IMAX and Spin VFX)Scale and budget have an impact on the audition and hiring process for vendors on projects like The Witcher. (Image courtesy of Netflix)Another part of the vendor equation for films like Dungeons & Dragons: Honor Among Thieves are in-house visual effects teams that can consist of a designated vendor or a group of handpicked digital artists. (Image courtesy of Paramount Pictures)Generally, for VFX Producer Tyler Cordova, one to three major vendors are brought on board during the majority of prep and shoot to develop assets for films like Dungeons & Dragons: Honor Among Thieves. (Image courtesy of Paramount Pictures)Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process. 
Can the vendor commit to delivering a specific number of shots by a set date, and do they have the necessary resources to handle the project? notes Pieter Luypaert, VFX Producer at Dream Machine FX. Competitive pricing is important, as multiple vendors are bidding for the same work. The vendor's location also plays a role, as tax incentives can significantly impact cost. Breakdowns are a big part of the bidding process, as they provide the vendors with all the essential information needed to provide a first bid and propose a methodology. Does the vendor believe they can achieve a certain effect with a 2D solution? The chosen methodology can drive the cost and schedule. Lastly, pre-existing work relationships, mutual connections and shared history are important. Due to the interconnected nature of the visual effects industry, personal connections can ultimately be the deciding factor. Multiple vendors are often used to mitigate risks. The main vendor typically handles the majority of the work, while the studio's visual effects production team oversees the coordination among the different vendors. This is becoming more common as the vendor providing virtual production services needs to collaborate closely with other vendors using their assets. In many ways, the career trajectories of individuals determine future studio and vendor collaborations. I was a compositor by trade and knew a lot of the people at Pixomondo who went on to form their own companies such as Crafty Apes, states Jason Zimmerman, Supervising Producer and VFX Supervisor at CBS. Me bouncing around working at Zoic Studios and Eden FX, you meet a lot of people along the way and collect those people with whom you resonate. I've been fortunate to meet a lot of awesome people along the way who have either started a company or gone to a company. To me, it's all about the artists, having been one myself. I keep track of all my favorite people, and they have all moved around and done great work at different places. Not everything is about past relationships. If someone has great visuals then you're going to try them out, regardless. Having a reel that has a good variety is important because you know that they can do more than one type of shot or effect or discipline. And how does it look to your eye? Do you agree with the integration and lighting? All of those shots were driven by a supervisor, studio, timelines and budget. You take it for what it is, and every decision made was not down to only one person, because there are a lot of people who go into making a visual effects shot work. Setting up a visual effects company has become more economical. The technology is at a point where if you're an independent artist, you can buy the software and render it on the cloud, notes Jonathan Bronfman, CEO at MARZ. You don't need infrastructure. But it has been that way for a while. It's quite homogeneous. Everyone is using the same tech stack. We have artists who have worked at Wētā FX and vice versa. What is that differentiator? Which is why we ended up developing the AI. That's true differentiation, if you can nail the tech. Outside of the AI that we're developing, we're very much a creature/character shop. We still do environments because creatures and characters need to live in environments. There are other companies like Zoic Studios which are television-focused. But if you go to Wētā FX or ILM, they do everything. Everything stems from reliability. Word of mouth is the result of doing a good job and executing at a high level. 
You have to produce good quality on time and budget. If you can do those things then it spreads. Certain stakeholders have to be impressed. You have the visual effects supervisor, visual effects producer, production company and studio. If you have all three levels of stakeholders, that is ideal. But ultimately, it is the visual effects supervisor who gets the final say.Conversing with potential vendors actually commences before the studio assembles a visual effects team. I will get a look at the scripts early, know what type of work it is, and I can reach out to my counterparts at some of those vendors, explains Todd Isroelit, Senior Vice President, Visual Effects Production at 20th Century Studios. Id say, We have project X, which has a creature/character or water effects simulation. Here is the rough schedule that were looking at. Its important to plant a flag in some of these vendors so your project is on their radar as theyre talking to all of the other studios and filmmakers about other things that might be happening in a similar timeframe and looking for similar resources. As we start to identify the team or the candidates for the team, well look at what projects theyve done and what relationships they have. Sometimes, well look at actually creating a package scenario where we are talking to a vendor and vendor supervisor. The proliferation of visual effects has led to more agent representation. In the early days, all of the visual effects supervisors were tied to facilities like ILM, Digital Domain and Sony. There wasnt this big freelance pool. As the industry grew and people started moving around, it became this emerging piece of the business that gave the supervisor a head of department status that fits into that below-the-line approach to filmmaking where you are looking at DPs and costume designers. Visual effects supervisors started having a bigger stake and voice in how the projects were coming together. Thats when I saw people getting agents started to evolve, even to the point where big below-the-line talent agencies who represent DPs, editors and costume designers started realizing the same thing. Agent representation is not as significant for the vendors as a point of contact for the studios. Executive producers or business development executives at the vendors; those are the relationships that we have, Isroelit says.Rather than hire agents, vendors tend to have a top executive communicating with production companies and studios to work on series such as Foundation. (Image courtesy of Skydance Television and AppleTV+)Having a reel that has a good variety is important because it demonstrates the ability to do more than one type of shot, effect or discipline when attempting to work on series such as Star Trek: Discovery. (Image courtesy of Paramount+)Conversations with potential vendors actually commence before the studio assembles a visual effects team, reveals 20th Century Studios Todd Isroelit, who worked on Prometheus. (Image courtesy of Twentieth Century Fox)Another part of the vendor equation are in-house visual effects teams that can consist of a designated vendor or a group of hand-picked digital artists. In my experience, an in-house team usually comes in closer to the end of post-production to do easier, mostly non-CG shots, remarks VFX Producer Tyler Cordova. Typically, opticals, re-times, split screens and simple paint-outs, things of that nature. Its important because its a cost-effective solution to have a small team do simpler shots after the edit has settled. 
Ive hired in-house artists on past shows through contacts Ive worked with for years and years. In-house artists will suggest other artists theyve worked with as well. There are some legendary in-house artists that a lot of visual effects producers know about Johnny Wilson, John Stewart, looking at you! though some studios and producers prefer going to a vendor instead of using in-house artists to give some accountability to a company performing efficiently, rather than relying on individual artists to manage themselves; it depends. In-house teams are rarer these days since COVID-19 hit, and a lot of productions seem to be hiring smaller in-house-type vendors rather than individual artists so they can do the work securely and efficiently while working remotely.