20 YEARS ON: REVISITING THE MATRIX RELOADED AND REVOLUTIONS, AND OTHER 21ST CENTURY VFX CLASSICS

By TREVOR HOGG

A true mark of innovation is when one can look back two decades later and still be impressed by what was achieved given the technology, tools and resources available at the time. Say what you will about The Matrix Reloaded (2003) and The Matrix Revolutions (2003), but it is amazing how many top-level visual effects professionals emerged from them, such as John "DJ" DesJardin (Man of Steel) and Dan Glass (The Tree of Life). Adding to the complexity of The Matrix Reloaded and The Matrix Revolutions were the logistics of having both shot at the same time in sequential order and released six months apart.

While John Gaeta (The Matrix) was the Overall Visual Effects Supervisor, the real and virtual worlds were divided respectively between DesJardin and Glass. "I was always fascinated by the real world as a subject because of this notion it had become a dystopian nightmare," DesJardin notes. "The idea that we were going to go to Zion and see all of the aspects of it, not just where they live but the machines that keep it running and the big temples. Then the fetus fields and right down to Machine City. Those are my favorite things. The battle was a nail-biter to get that done. I recently came across a shooting assessment that I made and delivered to the producers for how we were going to shoot the guys waging the battle in the APUs [Armoured Personnel Units]. It was a big motion-control effort. But I will say, when I learned the story of the film, one of my favorite moments, and one I couldn't wait to get a handle on, was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. It's a great idea, and I love the way it came out."

"The street scene took a while because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there!"
Dan Glass, Visual Effects Supervisor, The Matrix Reloaded & Revolutions

The Matrix Reloaded & Revolutions

In the Oracle's kitchen set shooting for The Matrix Reloaded and Revolutions, Digital Effects Producer Diana Giorgiutti takes chrome and gray-ball notes with Visual Effects Supervisor Kim Libreri. Visual Effects Supervisor Dan Glass is off to the side left with James McTeigue (1st AD) and Bill Pope (DP). (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)

Visual Effects Supervisor John "DJ" DesJardin was responsible for executing the Sentinel fight sequence in The Matrix Revolutions, which involved full-sized APU units being on set at Fox stages in Sydney. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)

Crater action from The Matrix Revolutions, with Keanu Reeves lining up to shoot accompanied by mud and rain. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)

Visual Effects Supervisor Dan Glass on set with the late Diana Giorgiutti shooting chrome and gray balls for The Matrix Reloaded and Revolutions. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)
A notable shot of the Agent Smith mannequins from the Super Burly Brawl scene in The Matrix Revolutions, where the many Smiths watch Agent Smith fight Neo in the rain. (Image courtesy of Diana Giorgiutti and Warner Bros. Pictures)

Cutting-edge digital human technology was utilized to create the Burly Brawl and Super Burly Brawl, involving the massive onslaught of Agent Smith clones. "I watched the Burly Brawl fairly recently, and the reason why it holds up is the split-screen work is all photography, and as you get into the more virtual work, for its time it was ambitious and pulled off some incredible things," Glass remarks. "The Super Burly Brawl took the longest to shoot. There was a side thing where the Wachowskis didn't want just rain. The raindrops had to be oversized. The special effects team was trying to figure out how to make this extra-wet, blobby rain. The street scene took a while because we had 50 doubles for Agent Smith wearing printed masks; along with them we built mannequins from the cast of Hugo Weaving. The doubles were in the background, and in front of them were two mannequins that they could move left and right. When Hugo brought his kids on set, they were slightly horrified! There was 151 of dad there! And a lot of rain. It was grueling for us and, I imagine, as well for the actors." The digital doubles of Agent Smith were not simply carbon copies. "You always try to bring some level of individuality so it feels more credible. The advantage of working in the Matrix for those movies was it was about a simulation, so it gave us some leeway," Glass adds.

Spider-Man 2

Sam Raimi rehearses a scene with Alfred Molina during the shooting of Spider-Man 2. (Image courtesy of Columbia Pictures)

A signature fight in Spider-Man 2 occurs onboard a speeding train, which showcases the lethal might of Doc Ock. (Image courtesy of Columbia Pictures)

A technological and creative breakthrough for Spider-Man 2 was the fidelity achieved in producing a digital double of Alfred Molina as Doc Ock. (Image courtesy of Columbia Pictures)

Each one of the tentacles of Doc Ock was given a distinct personality, with the circular light in the center conveying a sentient quality harkening back to HAL in 2001: A Space Odyssey. (Image courtesy of Columbia Pictures)

It was important to give the digital double of Spider-Man the correct inertia and ground the camerawork to make the performance believable. (Image courtesy of Columbia Pictures)

"When I learned the story of the film, one of my favorite moments, and one I couldn't wait to get a handle on, was when Neo and Trinity fly up above the clouds to get rid of the Sentinels that are clinging to the ship, and you get to see the sun for the first time in the real world. It's a great idea, and I love the way it came out."
John "DJ" DesJardin, Visual Effects Supervisor, The Matrix Reloaded & Revolutions

Set in New York City, Eternal Sunshine of the Spotless Mind (2004) explores what happens when a couple breaks up and goes through a medical procedure to get rid of their memories of each other. "This was the best script I've ever read in my life," states Louis Morin, who was at the time a Visual Effects Supervisor for Buzz Image Group and made suggestions about ways of erasing memories, such as having abstractions melt and disappear. "The producer said that Michel Gondry (Be Kind Rewind) didn't want any visual effects supervisor on set. It was to be a free camera style of filmmaking and no lights, like Breathless by Jean-Luc Godard. Camera tracking was hellish."
"There was the 'Pan from Hell,' which is exactly the Breathless shot, and from the peculiar mind of a director who decided to flip the image so that the actor was walking into a flipped image of himself. We then had to marry the two together with tracking, morphing, and put in a telephone pole to help us out. The camera goes four times like that." As Joel Barish (Jim Carrey) keeps going back and forth on the street, the details in the imagery begin to fade away. "We had to redo the whole store in CG to be able to erase everything step by step," Morin adds.

Eternal Sunshine of the Spotless Mind

CG chopsticks were added to make the shot transition seamless for the sofa bed scene in Eternal Sunshine of the Spotless Mind. (Image courtesy of Louis Morin and Universal Pictures)

To visually depict memories being erased, subtle details were removed, such as a leg belonging to Clementine Kruczynski (Kate Winslet). (Image courtesy of Louis Morin and Universal Pictures)

The "Pan from Hell" in Eternal Sunshine of the Spotless Mind required seamlessly transitioning back and forth from footage that was flipped 180 degrees. (Image courtesy of Louis Morin and Universal Pictures)

Then there was the preceding moment featuring a missing leg belonging to Clementine Kruczynski (Kate Winslet) and a falling car. "Michel wanted to have the first moment indicating that the memory of Clementine is being erased, and he said, 'Remove a leg,'" Morin recalls. "I said to him, 'Nobody is going to notice that.' We did it and nobody was seeing it. Also, Clementine didn't turn her head at the right time, so we Frankenstein'd the shot by taking the head from a longer take, which worked well, but nobody was noticing the leg again! Somebody suggested that we could have a car fall down from the sky. Everybody thought it was ridiculous, but Michel said, 'Let's do it.' We had to do that entire background and car in CG. At the end, everybody liked the idea, and it was powerful." There were also subtle digital adjustments taking place. Joel falls off of the sofa bed and reverses back into another shot of him on the sofa bed eating Chinese food with chopsticks. "But the chopsticks weren't working, so we had to make them CG." A major visual effect was the collapsing house. "At first Michel was talking about doing some optical iteration of the image. It wasn't looking great. Then Michel asked, 'Can we have a chimney collapsing?' Upon seeing the test, he went, 'Wow. Can we have the house collapsing?' The house became entirely CG and was destroyed by using rigid body dynamics," Morin says.

"When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it."
John Dykstra, Production Visual Effects Supervisor, Spider-Man 2

Departing from the normal routine of recruiting directors from within, Pixar collaborated with Brad Bird (The Iron Giant) to produce a superhero family adventure. In the process of making The Incredibles (2004), Visual Effects Supervisor Rick Sayre had to work out what he calls "open problems," such as simulating the long, straight hair of Violet Parr, which was a key part of her character. "Violet is a teenage girl and her power is mostly defensive," Sayre notes. "She puts up a shield or turns invisible because she wants to disappear. You will often see her with one eye. She is hiding behind her hair. That was important to Brad."
The existing hair simulation system had to be overhauled to allow for interaction. Explains Sayre, "One of our tricks before was to randomly connect some sets of hairs to other hairs with invisible springs; that would allow a coif to retain its volume, but that technique doesn't work with long hair, because either the hair flattens out if there are no springs, or it looks like cement. We ended up embedding the simulation hairs inside of a volume, which was how they were able to couple their motions and collision responses to each other in a way that still isn't as computationally expensive as every hair looking at every other hair. It's as if they're embedded in a block of invisible goo that is modulating these responses. We also used that block of invisible goo to infer some information that we used for lighting, shading and shadowing."

The house in Eternal Sunshine of the Spotless Mind became entirely CG and was destroyed by using rigid body dynamics. (Image courtesy of Louis Morin and Universal Pictures)

A comedic sequence occurs when fashion designer Edna Mode demonstrates to Helen Parr how indestructible her superhero suits are by putting them through a series of extreme tests, such as a flame thrower. "It's funny you mention that because it has an almost live-action approach," Sayre remarks. "We hadn't done a big effects film. What's happening in the simulation chamber is done by a different team with a different set of techniques, even a different renderer, than Helen and Edna sitting on the other side of the glass. They are essentially on a set looking through the window at greenscreen where nothing is happening, pretending to react to all of this stuff that is done later and comped together. There was no way that we could have done all of that at the same time in the same system. The thing that caused our team the most headaches were the super suits, which were tight-fitting and caused simulation stability and collision fidelity issues. Because Edna is so amazing, her super suits have special visual properties. They're shiny and had these surface characteristics where we would see these rendering artifacts coming from the guts of how Catmull-Clark subdivision surfaces got rendered in the RenderMan of the day. At some point, we were using a different kind of subdivision surface, or a Loop subdivision, and then reprojecting it. The super suits that Edna doesn't make, like Syndrome's, were easier to deal with because they're more like regular cloth."

Being able to make the protagonist crawl walls and swing through the air from buildings in a believable manner, and giving the antagonist mechanical tentacles that have a mind of their own, were a couple of the many challenges John Dykstra faced as the Production Visual Effects Designer on Spider-Man 2 (2004). "Sam Raimi (A Simple Plan) wanted to use as many practical elements as he could, so we pursued Doc Ock that way," Dykstra remarks. "It is tough for puppets to defy physics because they are in the real world and want to work in real time. We went through and prevised the entire sequence, and did computer-generated imagery for those shots where we felt puppeteering was impractical. The arms of [Doc Ock] were a digital endeavor from the get-go. The art department worked with us in terms of the design, and we worked with the vendor to figure out the animation look in regards to the speed and mass and how the arms worked. We were defying gravity a lot in Spider-Man 2." Realism was built into the CG camerawork.
"When I talked to the people who were animating the shots of Spider-Man, I told them to imagine that he had his own cameraman, and the cameraman has to travel the same way as Spider-Man. As a result, you get a much more human or fallible version of camera operation that lends reality to it."

The Incredibles

A progression illustrating what dinner is like for the Parr family in The Incredibles. (Images courtesy of Disney/Pixar)

The superpowers in The Incredibles are an extension of the character traits. (Image courtesy of Disney/Pixar)

Concept art of the Parr home dining room. (Image courtesy of Disney/Pixar)

Concept art by Don Shank exploring a major action sequence that occurs during the third act of The Incredibles. (Image courtesy of Disney/Pixar)

Dealing with the long, straight hair, which was essential to the character of Violet, was a major technological hurdle to overcome for The Incredibles. (Image courtesy of Disney/Pixar)

The Incredibles was the first time for Pixar that the principal cast consisted entirely of stylized human characters. (Image courtesy of Disney/Pixar)

An extremely hard shot was the tight closeup of Doc Ock falling. "Trust me, that was torn from the artists' hands by the time it was put into the film!" Dykstra laughs. "The idea was to have a moment where we actually featured a CGI character with emotional content, and the challenge was to do it in a way that you would be convinced that it was real, especially when you're doing something with a real person. I suppose Alfred Molina could have done it, but I don't imagine he could have been underwater for so long!" CG skin is always tricky. Things like pores and inconsistencies in surface reflectivity often contribute to the complex and somewhat visually noisy thing that is human flesh. In theory, Spider-Man is an ideal CG character because the material of the suit has a smooth matte finish and no hair or fur has to be simulated. When there is an absence of natural phenomena, you end up questioning the verisimilitude of what you're looking at. It was important to improve upon the specular nature of the suit, and the way it wrinkled created variations in the texture of the surface of the body while it was in motion. Spider-Man 2 occurred during a transitional period from analog to digital solutions. Dykstra states, "One of the things that we had to work on in that era was including world noise. We had to take the perfection of the computer-generated model and haul it back into the realm of the real world. Stuff like film grain and how it was reacting. Was it out of or in focus? We had to study that to figure out how to apply it to the shots, because it's the filter through which you see the world."
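Dykstra's "own cameraman" idea is, at heart, a procedural one: take the mathematically perfect CG camera path and contaminate it with the drift and lag of a human operator. As a purely illustrative sketch (plain NumPy with made-up parameter values, not anything from the film's actual pipeline), the idea might look like this:

```python
# Hypothetical sketch of a "fallible operator" camera: an ideal path
# plus low-pass-filtered jitter and a slight tracking lag. All names
# and values here are illustrative.
import numpy as np

def humanize_camera(ideal_path, jitter_amp=0.02, lag_frames=3, seed=7):
    """ideal_path: (frames, 3) array of camera positions, one per frame."""
    rng = np.random.default_rng(seed)
    frames = len(ideal_path)

    # Per-frame noise, smoothed with a moving average so the shake reads
    # as hand-held drift rather than high-frequency buzz.
    noise = rng.normal(0.0, jitter_amp, size=(frames, 3))
    kernel = np.ones(9) / 9.0
    drift = np.column_stack(
        [np.convolve(noise[:, axis], kernel, mode="same") for axis in range(3)]
    )

    # A human operator trails the subject slightly: delay the ideal path
    # by a few frames, padding the start with the first position.
    lagged = np.vstack([np.repeat(ideal_path[:1], lag_frames, axis=0),
                        ideal_path[:-lag_frames]])
    return lagged + drift

# Example: a 120-frame straight dolly move, humanized.
path = np.linspace([0.0, 1.8, 0.0], [0.0, 1.8, 24.0], 120)
shaky_path = humanize_camera(path)
```

The same filtered-noise idea extends to rotation and focal length, which is where the grain-and-focus study Dykstra describes as "world noise" would come in.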
ACHIEVING MAXIMUM ALTITUDE WITH THE VISUAL EFFECTS FOR EDGE OF SPACE

By TREVOR HOGG

Images courtesy of Jean de Meuron and VFX Los Angeles.

While the likes of Chuck Yeager, Neil Armstrong and John Glenn looked to soar humanity to new heights, Jean de Meuron took his fascination with cinema to commemorate their aerial accomplishments with the short film Edge of Space, which was a winner at the Oscar- and BAFTA-qualifying LA Shorts International Film Festival 2024 (Jury Special Mention). Running 18 minutes, the story revolves around United States Air Force test pilots being sent on a suborbital mission with the hypersonic rocket-powered X-15, paving the way for America to land on the Moon before the Soviet Union.

"If you look at a project like Edge of Space, you think, 'Oh, my god, this shot with the spaceship is going to be the coolest and the hardest one.' Yeah, but the last shots to be approved were the ones with the visor because there's no place to hide. You've got to make sure that skin looks good. You've got not to distract from the performance. Everyone can sense what a reflection looks like on a curved piece of glass, so actually those were the hardest ones to get right."
Charlie Joslain, Senior Visual Effects Supervisor

An actual X-15 was used for exterior shots and scanned to create a digital version for when the hypersonic aircraft takes flight.

"The X-15 was able to penetrate the Kármán line, which, officially per NASA, is the edge of space, where you go 330,000 feet up in the air," states producer/director/writer de Meuron. Many X-15 test pilots received astronaut wings.

A blueprint for the production was The Creator by Gareth Edwards. "The Creator was cost-effective, but it also gave a sense of real scale and scope. Denis Villeneuve or Gareth Edwards said, 'If you make most of the frame real, the added visual effects blend in naturally.' Charlie Joslain [Senior Visual Effects Supervisor], Izzy Traub [Visual Effects Supervisor] and I storyboarded everything in pre-production, so on set we knew exactly what we wanted filmed. We also scanned multiple assets and locations during different times of the day that were then built in 3D."

Something that would easily go unnoticed is the addition of 3D tombstones created by VFX Los Angeles. "As often with films of this caliber, in terms of 100-plus visual effects, a lot of it is invisible effects," Joslain notes. "You expect all of the contrails and airplanes, but there is a ton of clean-up: little lights in the background, and traffic in the desert, to create an isolated place, since the airbase was supposed to be secret. Jean wanted some tombstones, and they looked good, but they were placed in a way that was too narrow and didn't quite give the gravitas and scale of the sacrifice of American pilots for that cause. We did some research to make sure that we found the right kind of tombstones, recreated multiple CGI ones, and ended up with not quite Arlington National Cemetery but something similar in the desert."

The framed picture of American President John F. Kennedy was incorporated into the set decoration to authentically recreate 1961.

Crafting the contrails of the X-15 was made even more complicated by the aircraft essentially being a hypersonic rocket. "That little gap between the back of the airplane and the contrail," Joslain explains. "Imagine that multiplied by x amount. Would you be able to see the X-15 up in the sky, when it's 45 feet long and 70,000 feet up in the air?
You probably wouldn't be able to see it, but if you don't show it, what exactly is our character looking at up in the sky? We had to find that balance of what it should have looked like and how do we represent it so it's engaging for the audience? Through the use of plates versus recreating similar plates, then doing a lot of calculation, work and optical engineering as to what zoom lens would create what effect, that's how we created the best of both worlds. It looks historically and scientifically accurate, but it's telling a story, is still engaging, and the plane feels like it's there."

"That little gap between the back of the airplane and the contrail: Imagine that multiplied by x amount. Would you be able to see the X-15 up in the sky, when it's 45 feet long and 70,000 feet up in the air? You probably wouldn't be able to see it, but if you don't show it, what exactly is our character looking at up in the sky? We had to find that balance of what it should have looked like and how do we represent it so it's engaging for the audience?"
Charlie Joslain, Senior Visual Effects Supervisor

Channeling the cinematography of Days of Heaven, Jean de Meuron opted to shoot during the magic hour.

Clouds became aerial landmarks indicating size, scale and speed. "At one point, Glen Ford [Chad Michael Collins] penetrates [the Kármán line], and there is this massive, beautiful shot set against the sun," de Meuron recalls. "It's backlit and silhouetted, but then we wanted to give a sense of scale. This is still earthbound, but the minute he penetrates, we go to space, where we don't have clouds. The clouds helped us give a sense of scale and depth as well as layers and nuances with light, shadows, and a little underexposure in the foreground. We played heavily into those cloud formations." Even more important than scale is the sense of speed. "We've seen a million films and sci-fi movies, and again, the X-15 is supposed to fly Mach 5 or 6," Joslain notes. "When clouds of that scale start drifting past so fast, that helps to portray the sense of speed of the aircraft."

Only digital versions of the Huey helicopter and B-52 are shown flying.

Visor reflections are essential to have but hellish to pull off in a believable manner. "Anything to do with the visor or helmet is a mixture," Joslain reveals. "Roughly half of the scenes have the visor on, where we had to erase reflections from outside of the cockpits and therefore recreate the performance, repaint skin or add the twinkle in the eye. The opposite was true of the few shots that we got with the visor off, where we had to recreate a CG visor and then repaint reflections from the cockpit and Moon on that visor. That's more or less how this whole thing was tied together. No shot was untouched. If you look at a project like Edge of Space, you think, 'Oh, my god, this shot with the spaceship is going to be the coolest and the hardest one.' Yeah, but the last shots to be approved were the ones with the visor because there's no place to hide. You've got to make sure that skin looks good. You've got not to distract from the performance. Everyone can sense what a reflection looks like on a curved piece of glass, so actually those were the hardest ones to get right."

Outer space was based on ISS footage. "I would text pictures and references from astronauts in space, either from the ISS, Mercury, Apollo or Gemini, when they filmed and took pictures in outer space," de Meuron states. "It's interesting because gradually the tones and shades of blue [change]. Charlie and I would look at that.
You can see from the ISS how the blue gradually transitions into a dark black and then becomes pitch black." Discoveries could be shared at any moment. Joslain recalls a funny anecdote that tells you a lot about de Meuron's dedication over the last two years: "You would get a bunch of texts at 4 p.m., and I would go, 'I know this is Jean and he's found something!' But most of the time this would actually take place at 3 a.m., and you're like, 'Jean, not now!'"

The logos on the X-15 had to be digitally altered to make them period-accurate.

"We want to respect [director] Jean's [de Meuron] vision because that's what you want to achieve, but at the same time, damaging the perfection of the whole thing is how you achieve true perfection. My favorite shot is fully CG-made. It's the X-15 taking off, and it's that super-long-lens-like 5000mm view of the X-15. There is enough shake in the camera and zoom play with the lens going on to add that sense there is an actual human being filming."
Charlie Joslain, Senior Visual Effects Supervisor

For exterior shots, a real X-15 was photographed and scanned. "Any sort of motion, such as the gears turning, was CG; even the front wheels, because they weren't quite right," Joslain remarks. "As far as the texture, the real X-15 was used as a reference, but a lot of the logos had to be painted out, recreated and redesigned to match the historical plane, as opposed to the current NASA museum piece that it is." The cockpit was sealed off, so it was recreated in the studio by Production Designer Myra Barrera, with the visual effects team producing a digital version as well. "We had LED panels and lights, and when you see the astronaut, it's frontal," de Meuron states. "I didn't want the actor's profile because First Man had already done that. I wanted to do my own interpretation. I wanted it to be tight and claustrophobic, in a real closeup or extreme closeup, so we see every nuance of his performance, and maybe how he twitches or is sweating."

One of the toughest elements to create and integrate were the contrails being generated by what is essentially a rocket.

Capturing the aerial establisher of the landing strip was a drone. "That was a real shot," explains Traub. "As the camera keeps going, you see someone working on the plane; that was the same person doing the motion capture performances of everybody! Normally for a project, you can go in and purchase model packs and use them. We had to model everything from scratch because there wasn't anything that we could find, for the most part, that fit the historical references. One thing that is interesting is we actually replaced the X-15 in that particular shot because it gave us more control." One cannot have an airport landing strip without a control tower. "We obviously didn't have a lot of photos of the Edwards Air Force Base in the 1960s," Joslain states. "An important part of an airbase is going to be the control tower. We had an overview photograph of the base at the time. Assuming the picture and information were correct, we knew which month and year this was taken, and we did a bit of reverse engineering to figure out, according to the length of the shadow, how high the control tower was going to be. When we put the control tower in the shot, it was too small, so we had to make it bigger!"

A cool color palette was adopted for the outer space shots to make the cosmic environment feel colder.

The landing shot of the X-15 was extremely difficult.
"The drone was more or less a continuous speed, but obviously an airplane landing and slowing down is not a continuous speed," Joslain remarks. "But how do you create that? We had to find the right balance of what would be an accurate speed for the X-15 to slow down and grind to a halt. But matching that stopping moment with the twist of the pan of the camera, then having the jeep and vehicles enter, that was a complicated one to figure out." Contributing to the believability were lens aberrations. "We were messing a little bit with the focus here and there," Joslain states. "Adding a little grain there. Adding a little bit of a deep camera shake and vibration. We want to respect Jean's vision because that's what you want to achieve, but at the same time, damaging the perfection of the whole thing is how you achieve true perfection. My favorite shot is fully CG-made. It's the X-15 taking off, and it's that super-long-lens-like 5000mm view of the X-15. There is enough shake in the camera and zoom play with the lens going on to add that sense there is an actual human being filming."

A mixture of shots was done with the visor up and down, with the reflections added and removed as needed.

The X-15 does not actually take off but is attached to and released from a B-52.

Drone photography was essential for the aerial shots of Edwards Air Force Base, with the buildings, vehicles and individuals digitally recreated.

Cloud formations assisted with conveying the proper size, scale and speed of the aircraft.

Shots such as the B-52 releasing the X-15 were treated as if a camera operator was capturing the moment with a long lens.

Particle simulations had to be produced on the ground and in the air. "The stuff on the ground was the hardest, for sure," Traub notes. "We have this sequence where the X-15 lands; there is a touchdown where the back of the plane basically slaps the ground, and there is an explosion of dust that goes up in the air. Then we see underneath the plane. Basically, the tracks are ripping up the ground as it's coming to a halt. The X-15 pushes through a whole bunch of dust. In that same shot, you have this helicopter moving down and landing. The particle simulations become a lot more complicated because you're locked to lighting that is on the ground, so your lighting has got to be atmospherically correct. The shadows have to cast with the particle simulations that we're doing in Houdini, versus in the air. A lot of the particle simulations were atmospheric."

CG tombstones were added to create a setting that had the gravitas of Arlington National Cemetery.

One of the unique images features the death of a colleague reflected in the sunglasses worn by Glen Ford. "The reflection of the explosion in the sunglasses was one of those cases where we did an absurd amount of reverse-engineering," Joslain explains, "about the scale/size of the X-15 contrail, the amount of curvature the piece of glass would have applied to it, and how it should have all looked to be 100% accurate, versus what it needed to look like to be emotionally impactful as well as aesthetically pleasing." Unreal Engine became a major tool for Edge of Space. "One of the big things that we were dabbling with a little bit was Unreal Engine, but Unreal Engine became a key part of the pipeline when it came to all of the CG shots," Traub states. "The reason for that was simply because of real-time rendering, the ability to tweak the lighting, quickly change the camera and output multiple versions.
Especially when the deadline was coming up, it enabled us to move at a speed that was a lot better. One thing that we had never done before was integrating Houdini simulations with Unreal Engine. Both of those paired up nicely, and by the time we had all of our renders, you could composite everything together in After Effects or Nuke. We got fairly adept with the Unreal Engine pipeline, specifically for cinematic filmmaking, and it was a great experience. We'll continue to use Unreal Engine for the rest of our projects, most likely. It's an amazing tool."
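Traub doesn't detail the studio's actual scripts, but the general shape of one step in such a handoff can be sketched. The hypothetical example below assumes a dust simulation has already been cached out of Houdini as an Alembic file, and uses Unreal's built-in Python editor scripting (the `unreal` module) to batch-import it into a project; the file path and content-browser folder are illustrative only, and the code must run inside the Unreal Editor with Python scripting enabled:

```python
# Hypothetical Houdini-to-Unreal handoff: import a simulation cache
# (already exported from Houdini as Alembic) into an Unreal project.
# Runs in the Unreal Editor's Python console; paths are illustrative.
import unreal

def import_sim_cache(abc_path, dest_folder="/Game/FX/Sims"):
    task = unreal.AssetImportTask()
    task.filename = abc_path             # .abc written out by a Houdini ROP
    task.destination_path = dest_folder  # content-browser destination
    task.automated = True                # suppress interactive import dialogs
    task.save = True                     # save the imported asset to disk
    task.replace_existing = True         # re-runs update the cache in place

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return list(task.imported_object_paths)

print(import_sim_cache("C:/caches/x15_dust_v012.abc"))
```

Re-importing in place is what makes the iterate-render-composite loop Traub describes fast: updated Houdini caches flow into the same Unreal assets, and the real-time renders then move on to After Effects or Nuke for the comp.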
WHAT'S OLD IS NEW AGAIN IN THE MADCAP WORLD OF BEETLEJUICE BEETLEJUICE

By TREVOR HOGG

Images courtesy of Warner Bros. Entertainment Inc.

Starting off as a motion-control cameraman on Batman, Angus Bickerton has gone on to work with Tim Burton two more times, overseeing the digital augmentation for Dark Shadows and Beetlejuice Beetlejuice, the long-awaited sequel to the cult classic where a mischievous ghost causes havoc for those alive and dead. "It might be a combination of knowing me a little bit," reflects Angus Bickerton, Production VFX Supervisor on Beetlejuice Beetlejuice. "Tim was wanting to put a project together and was keen to do it fairly quickly on a moderate budget. I like doing things practically wherever possible because I come from that background initially. Back when we were doing Batman, we didn't do anything digital or CGI."

Tim Burton was insistent that the Shrinkers have animatronic shrunken heads and performers underneath them.

Technological advances allowed for more visual sophistication. "If you take the sandworms in the Titan desert, we wanted the lovely warm feel of the original stop-motion," Bickerton explains. "Mackinnon & Saunders did actually stop-motion animate all of the sandworm shots, but One of Us composited the landscape. The landscape was made up of a digital matte painting and some CG noodle rocks, which are curly bedrocks. But then the simulations of the sandworm diving into the dunes on Titan, that's where it got interesting. We played backwards and forwards on that one with Tim, because in the original movie they probably put some sand on a tray and hit it from underneath, and they got splashes of sand that were used optically as elements for wherever the sandworm dived into the dunes. For us, we did it as a CG simulation. We showed Tim a variety, from trying to mimic a miniature element to trying to get some scale to it. We had to find an in-between level that was about right. We wanted to improve the look but didn't want to get too sophisticated."

"[I]f you look at the Shrinkers, which are animatronic, they have a stagey design. The Beetlejuice universe allows you a little bit of freedom of not being held to being completely photoreal. It did affect our thinking when we were approaching the sequences: how could we do it practically and augmented rather than just resort to CG? I wouldn't call that a huge challenge but a great joy."
Angus Bickerton, Production VFX Supervisor

Michael Keaton and Tim Burton reunite for the madcap world of Beetlejuice Beetlejuice, which aimed to capture the spirit of the original movie.

One of Us worked with Framestore, BUF, Red VFX and Goldcrest VFX to create 1,200 visual effects shots. Framestore and One of Us were the two main vendors, with the Influencers sequence, in which people literally get sucked into their smartphones, being the most complex. "That was a late idea," Bickerton reveals. "There is a moment in the original film when a couple have distorted their faces, which was all done with replacement animation and individually sculpted, remodeled faces. We wanted to evoke the feeling of the original, but we went for CG. I tried to stop-motion animate myself and warp myself to try being sucked into a phone. But Tim wanted more stress, like a wind tunnel effect on the faces, and to have them look really frightened. Basically, we shot plates with and without them. We shot every character individually.
We had to then divide the characters up into different levels, so if they were a hero character closer to camera, we scanned that particular performer in detail. We got them to go through the actions of pretending to be scared and sucked into the phone. We had a level of detail two for mid-ground characters, then a level of detail three for low-resolution background. Right at the end, Tim said, 'It would be nice to see the hair twitching.' When you see the hair twitching, those were separately shot elements captured against bluescreen in our office."

"Tim [Burton] said, 'No. I'm going to do [the Shrinkers] all practically.' We never changed the performance at all. The genius is that it's a combination of a physical performer and a puppeteer off-camera with a radio control unit getting those minimal movements. The Shrinkers are Keaton-esque, and by Keaton I mean Buster Keaton, as they do everything with a blank look in their eyes."
Angus Bickerton, Production VFX Supervisor

Tim Burton and Michael Keaton discuss the finale, which takes place in a church.

Animatronic shrunken heads were placed on top of performers for the Shrinkers. "They had a yellow suit on, a white shirt, and a thin area on the top of the chest that lines up with where their eyes are," Bickerton states. "They did have limited vision, but it's a credit to Tim. He said, 'No. I'm going to do it all practically.' We never changed the performance at all. The genius is that it's a combination of a physical performer and a puppeteer off-camera with a radio control unit getting those minimal movements. The Shrinkers are Keaton-esque, and by Keaton I mean Buster Keaton, as they do everything with a blank look in their eyes." A new character named Delores staples herself back together. "That's an interesting mix. Tim was keen to get the majority of it in camera, so we did a lot of video-matics and blocking. Credit should go to Neal Scanlan and his team of puppeteers. When we did that sequence where Delores puts herself together, we had Monica Bellucci there and three other performance artists providing legs and arms, the detached limbs. We did almost a black velvet theater where we blocked out the motion. Of course, they can't join a limb on, so we would get the elements as close as we could simply, then there is a fair degree of digital augmentation. If you wanted to see a stump end, we obviously had to tack on the end of a limb, arm or leg."

Lydia Deetz (Winona Ryder) has gone on to become the host of a horror show called Ghost House.

Recreated for the sequel from the original plans was the miniature representation of Winter River.

A theatrical aesthetic rather than photorealism was the goal for Beetlejuice Beetlejuice.

A running gag for Wolf Jackson (Willem Dafoe) is that he is always given a cup of hot coffee, which was a combination of digital and practical steam captured in Angus Bickerton's kitchen.

A lot of work went into the prosthetic makeup that reflected how the deceased died.

Monica Bellucci portrays the ex-wife of Beetlejuice, Delores, who literally staples herself back together again.

"I tried to stop-motion animate myself and warp myself to try being sucked into a phone. But [director] Tim [Burton] wanted more stress, like a wind tunnel effect on the faces, and to have them look really frightened. We got them to go through the actions of pretending to be scared and sucked into the phone. Right at the end, Tim said, 'It would be nice to see the hair twitching.'
When you see the hair twitching, those were separately shot elements captured against bluescreen in our office."
Angus Bickerton, Production VFX Supervisor

Recreated using the original plans was the detailed model of the town of Winter River, from which Beetlejuice emerges once again. "We pulled the model apart and produced smoke and had under-lighting, but then we wanted to get a collapsing edge, which we would have never gotten at a model of 1/58th scale, so that's CG augmented edges," remarks Bickerton. "When it came to Michael Keaton actually emerging from the model, we had backed ourselves into a hole because we had built this model town into this replica attic, and we were an 1/8 of a foot off on the deck. We raised the set by six feet. That allowed Michael to pull the model apart and cheat the camera angles so we could create a bigger gap and have Michael Keaton sitting on a camera dolly. We cranked him up with a camera dolly. There were two parts. Part one, he is sitting on the camera dolly so you get his head emerging. Part two, we put him on a standing platform. When you are behind him, the model is actually beyond him, and we're shooting across his shoulder to make it look like he's in the miniature." The world-building revolved around plate photography. "Because we aimed to get a lot of it in-camera, we were matching to what we shot. That train station to the great beyond was a big set, and we had to do minor set extensions and extensions to the train. We had an immigration hall, again a great set built by the art department, and we had good concept work, so we knew how to extend it in the same style of the cinematography," Bickerton explains.

Outside of one shot where a Shrinker named Bob was enhanced with digital sweat, the rest of the performance was captured practically.

The cinematography and lighting were as off-kilter as the story itself.

Bickerton was excited about joining the project. "When I had my earliest Zoom call with Tim, straightaway he said, 'I would like to do some stuff stop-motion.' Neal Scanlan joined early on for a lot of the prosthetics and animatronics. With all due respect to Neal and his team, who were brilliant, if you look at the Shrinkers, which are animatronic, they have a stagey design. The Beetlejuice universe allows you a little bit of freedom of not being held to being completely photoreal." Overriding everything was the desire to retain the spirit of the original movie wherever possible. "It did affect our thinking when we were approaching the sequences: how could we do it practically and augmented rather than just resort to CG? I wouldn't call that a huge challenge but a great joy."
SFX/VFX VETERANS OF ALIENS REUNITE TO STRIKE FRESH TERROR INTO ALIEN: ROMULUS

By JENNIFER CHAMPAGNE

Images courtesy of 20th Century Studios, except where noted.

The Alien franchise has always been about pushing boundaries. From Ridley Scott's original 1979 Alien to James Cameron's Aliens in 1986, the series captivated audiences with its terrifying mix of storytelling and technical mastery. The practical magic of animatronics, puppetry and miniatures gave the extraterrestrial nightmare its unforgettable texture, creating a visceral experience that lingers in the minds of fans. Now, decades later, Alien: Romulus takes the franchise into a bold new era, merging those tactile roots with state-of-the-art digital techniques to craft a visual spectacle that feels both nostalgic and forward-thinking.

Uruguayan filmmaker Fede Álvarez, a self-proclaimed Alien superfan, helmed this latest chapter with a clear mission: to honor the franchise's iconic feel through practical effects while leveraging the latest digital applications. "What we wanted was the same visceral, real experience from the original films," Álvarez explains. "But we also wanted to embrace where technology is today. It's about respecting the legacy while pushing it forward." His commitment to practical effects was unwavering, bringing back veteran craftsmen from Aliens to oversee creature design and animatronics while seamlessly integrating cutting-edge CG. This hybrid approach sets a new benchmark for the franchise, balancing authenticity with innovation to deliver a fresh yet nostalgic cinematic experience.

Legacy Effects Supervisor/Animatronic Puppeteer Shane Mahan at work. Alien: Romulus was filmed entirely in Budapest, Hungary, primarily at Origo Studios. (Image courtesy of Legacy Effects and 20th Century Studios)

The team behind Romulus exemplifies the fusion of tradition and innovation. Alec Gillis, who supervised the chestburster effects with his team at Amalgamated Dynamics, Inc. (ADI), and Legacy Effects Supervisors Shane Mahan and Lindsay MacGowan, who oversaw the adult Xenomorphs, brought decades of expertise and passion to the project. Both teams, veterans of previous Alien films, approached the project as a homecoming, a chance to revisit and evolve the iconic creatures they helped bring to life. "The tools we have now let us do things we could only dream of back in the '80s," Gillis notes. "But the goal was always to make it feel real, to make it visceral."

"Matching [Director of Photography] Galo [Olivares's] lighting style was one of the most rewarding and demanding parts of the project. We had to ensure the digital creations didn't just blend into the live-action shots but felt like they were lit by the same haunting glow."
Eric Barba, Production VFX Supervisor

Despite its $80 million budget, Romulus boasts the visual scale and ambition of a much larger production. Filmed entirely in Budapest, Hungary, primarily at Origo Studios, the team maximized resources with state-of-the-art facilities and an expert local crew. Their craftsmanship, enhanced by digital techniques, ensured the film retained its tangible, gritty texture while pushing the boundaries of what's possible in modern effects.
This was further elevated by Galo Olivares's cinematography, which embraced rich contrasts and shadowy depth, a visual palette that amplified the horror and suspense elements while integrating the palpable realism of the practical effects with the eerie atmosphere of the alien environments.

Director Fede Álvarez wanted the same visceral experience from the original Alien films, but he also wanted to embrace today's technology. (Image courtesy of Legacy Effects and 20th Century Studios. Photo: Murray Close)

Production VFX Supervisor Eric Barba conveyed how Olivares's cinematography provided a nice and contrasty visual palette that amplified the horror and suspense elements of the film. The intricate use of backlighting and rich contrasts not only added depth to the alien environments but also created an oppressive sense of menace that underscored the film's tone. However, Olivares's technique of working on the edge of exposure, where shadows were allowed to fall, presented unique challenges for the visual effects team. "Matching Galo's lighting style was one of the most rewarding and demanding parts of the project," Barba explains. "We had to ensure the digital creations didn't just blend into the live-action shots but felt like they were lit by the same haunting glow." This attention to detail contributed to the film's cohesive visual narrative, making the cinematography a central element of its atmospheric appeal.

From left: Legacy Effects Supervisor/Animatronic Puppeteer Shane Mahan and director Fede Álvarez. A lot of work went into maintaining a balance between practical and digital effects. (Image courtesy of Legacy Effects and 20th Century Studios)

From a director's standpoint, Álvarez doesn't favor storyboards carved in stone or excessive previs. "I don't want to be locked into something too rigid before we start shooting," Álvarez states. "I like to leave room for creativity and discovery as we go." However, Álvarez also recognizes that a film like Alien: Romulus, with its intricate blend of practical and digital effects, requires a well-organized previs process. The team still relied heavily on tools like 3ds Max for specific needs, particularly for Álvarez to set up the camera and lighting angles. "We did a lot of previs to ensure everything worked in the digital realm before we started shooting," Barba remarks. "But the goal was always to keep that initial vision open to adjustments and surprises once we were on set."

"A tremendous amount of work went into keeping the balance of practical versus digital effects. What made [director] Fede [Álvarez] really happy was that we didn't take that cheaper, faster way out. We embraced what he had shot."
Daniel Macarin, Visual Effects Supervisor, Wētā FX

For the VFX team, led by Barba, 3ds Max was invaluable for tasks like Álvarez's camera placement. It was a crucial tool in helping them plan out specific effects sequences. "The software allowed us to plan and explore shot compositions and digital environments before we ever set foot on set," Barba continues. "It was like a virtual blueprint that guided the execution of practical effects and digital enhancements, ensuring everything would mesh seamlessly."
While Álvarez's preference for flexibility over rigid planning encouraged creative spontaneity on set, the team's meticulous previs work provided a strong technical framework that guaranteed even the most complex visual effects sequences were grounded and cohesive.

Legacy Effects' Shane Mahan and Lindsay MacGowan, both of whom contributed to Aliens, brought their expertise to the subadult and adult Xenomorphs. Alec Gillis returned for Romulus to supervise the chestburster effects with his team at Amalgamated Dynamics, Inc. (ADI). (Images courtesy of Legacy Effects and 20th Century Studios)

The chestburster scene, one of the franchise's most iconic moments, was reimagined in Romulus to showcase the fusion of old and new techniques. Gillis and his team began with digital sculpting in ZBrush, crafting an intricately detailed model that could be adjusted and refined before being 3D-printed at various scales. This allowed for a level of precision and efficiency that would have been unimaginable in earlier films. Once printed, the models were brought to life with animatronics, featuring translucent silicone skin, injected black fluids for shifting textures, and servo mechanisms for lifelike articulation. Álvarez wanted the sequence to feel raw and unsettling, echoing the primal, slow-motion terror of a natural birth. Gillis's team delivered with a puppet that could drool, snarl and twist with terrifying authenticity, subtly enhanced by digital touches to perfect its movements.

Modern 3D scanning techniques were used to refine the designs of the Xenomorph while preserving the franchise's signature bio-mechanical aesthetic.

The shift from traditional clay sculpting to digital sculpting and 3D printing represents a pivotal evolution in the creation process for Romulus. "We decided that we were going to do this not as a traditional clay sculpture but as a 3D digital sculpt," Gillis explains. "Once we had that form, it allowed our mechanical department to start designing in 3D, ensuring everything aligned perfectly." The transition from clay to digital not only helped to streamline production but also established consistency between the large-scale and one-to-one models, allowing practical and digital elements to integrate smoothly. By moving away from clay, the team gained the ability to iterate quickly while maintaining the physical authenticity of the original Alien model.

"We knew that animatronics could give us the tactile realism we needed, but combining it with subtle digital enhancements gave the creature a fluidity that made it truly come alive."
Lindsay MacGowan, Special Effects Supervisor, Legacy Effects

Miniatures played a central role in grounding the film's massive set pieces. Producer Camille Balsamo-Gillis collaborated with New Deal Studios Co-founder Ian Hunter, a miniature effects veteran, to bring these elements to life. Using 3D scans and designs from Production Designer Naaman Marshall, the team built models that aligned with the film's visual aesthetic. "This level of precision allowed us to create miniatures that not only looked incredible but also matched perfectly with the digital environments," Balsamo-Gillis says. One of the most striking applications of their work was the Corbelin, a ship that evolved from initial 2D designs into a richly detailed 3D model that took on character in the process. The Corbelin and the Echo Probe were crafted to honor the original Alien production design while adding fresh elements unique to Romulus.
A standout sequence involved the Corbelin's dramatic crash into the Romulus, where the miniature's realism brought weight and texture to the scene. Once built, the miniatures were scanned, digitized and integrated into the film with the help of Wētā FX and ILM. Other effects vendors on the film included Image Engine, Fin Design + Effects, Wētā Workshop, Wylie Co., Metaphysic, Pro Machina, Atomic Arts and Tippett Studio.

Wētā FX's Daniel Macarin and his team used Maya, Houdini and proprietary applications to create fluid, organic movements for the CG creatures, making sure they felt as alive as their practical counterparts.

The blend of practical and digital elements was central to the film's identity. Barba, a veteran of TRON: Legacy and Terminator: Dark Fate, notes the importance of balance. "It's not about replacing practical effects with digital ones," he explains. "It's about enhancing what's already there and making sure everything feels cohesive." His team used Houdini and Maya to create environments and enhance creature movements, working closely with Wētā FX and ILM. Wētā FX's expertise in creature animation added a sense of organic fluidity to the aliens' movements, while ILM contributed the grander large-scale effects, including breathtaking space sequences and explosive action set pieces.

Facehugger facelift. The shift from traditional clay sculpting to digital sculpting and 3D printing for the Romulus Xenomorphs represents a pivotal evolution in the creature creation process for the franchise. (Photo: Murray Close)

The integration of practical and digital effects extended to the work of Daniel Macarin, Visual Effects Supervisor at Wētā FX, whose contributions were instrumental in bringing the Xenomorphs and offspring to life with the required dramatic level of realism. "A tremendous amount of work went into keeping the balance of practical versus digital effects. What made Fede really happy was that we didn't take that cheaper, faster way out. We embraced what he had shot," Macarin states. Using Maya, Houdini and proprietary applications from Wētā, Macarin and his team created fluid, organic movements for the CG creatures, making sure that every twitch and snarl felt as terrifyingly alive as their practical counterparts. Macarin's team worked closely with the practical effects department, utilizing 3D scans of the animatronics to match textures and scale, and to fully integrate the CG creatures into the live-action.

A notable challenge arose when Álvarez requested lighting effects in a highly practical set. "Fede had an idea to make the cargo hold feel more alive by fluctuating all the lights," Macarin explains. "This meant turning practical effects into digital recreations, which was enormous, but it elevated the scene." This adaptability, coupled with close collaboration with other departments, enabled Wētā's team to create immersive visuals that pushed the boundaries on Romulus.

The Corbelin and Echo Probe ships were crafted to honor the original Alien production design while adding fresh elements unique to Romulus.

The collaboration extended across disciplines, with every department contributing to the film's vision. Stunt Coordinator Mark Franklin Hanson played a pivotal role in creating the film's dynamic action sequences, including zero-gravity stunts that required precise coordination with the VFX team. Hanson's meticulous planning ensured that the actors' movements felt natural and believable, even in digitally augmented environments. "Our goal was always to make the action feel grounded," Hanson says.
"Even when we were dealing with aliens and zero gravity, it had to feel real."

"The beauty of using 3D scanning is that it allows us to capture the detail and scale of our practical builds and translate them perfectly into the digital realm. Nothing gets lost in that transition."
Shane Mahan, Special Effects Supervisor/Animatronic Puppeteer, Legacy Effects

Known for his work in the horror genre (Don't Breathe, Evil Dead), Álvarez approached practical effects with the same care and attention he gave his actors. During the chestburster scenes, he directed the animatronics as if they were performers, guiding their breathing, snarling and other movements to achieve the desired emotional impact. His use of 3ds Max for previsualization allowed the team to plan complex sequences while leaving room for on-set spontaneity. "He's incredibly collaborative," Gillis says about Álvarez. "He knows what he wants but is always open to the magic that can happen in the moment."

ILM contributed to the grander large-scale effects, including breathtaking space sequences and explosive action set pieces.

The film's narrative and visual design were deeply informed by its rich legacy. Mahan and MacGowan, both of whom contributed to Aliens, brought their expertise to the adult Xenomorphs, applying modern 3D scanning techniques to refine their designs while preserving the franchise's signature bio-mechanical aesthetic. Mahan explains, "The beauty of using 3D scanning is that it allows us to capture the detail and scale of our practical builds and translate them perfectly into the digital realm. Nothing gets lost in that transition." The creatures' intricate details, every fold, texture and sinew, were preserved, blending physical models that were digitally designed and 3D-printed with advanced digital counterparts. One of the film's most striking moments, a Xenomorph emerging from a cocoon, spotlighted this total attention to detail. MacGowan reflects, "We knew that animatronics could give us the tactile realism we needed, but combining it with subtle digital enhancements gave the creature a fluidity that made it truly come alive."

Álvarez wanted the CG shots of the void of space in Romulus to be darker, scarier and emptier than in other space films. To achieve the feeling of distant depth, the 3D model was mostly backlit to accentuate the detail, and the planet and rings silhouetted the ship and space station.

The film's climactic sequences brought together every aspect of the team's expertise. A hero shot featuring an alien bursting out of its cocoon required months of planning, with previs helping the team map out camera angles, creature movements and digital environments. The practical puppet, created by Mahan's team, featured articulated limbs and cable mechanisms that allowed it to interact with its surroundings. Digital enhancements added fluidity and realism to the creature's movements, creating a terrifying moment. Building on the intensity of the cocoon scene, the birthing sequence involving Robert Bobroczkyi, a towering Romanian basketball player, delivered a visual and emotional gut punch that tapped the team's collective skills. Bobroczkyi's unique stature and elongated proportions created an immediate sense of unease, embodying the alien-human hybrid's eerie otherworldliness.
Mahan, MacGowan and the Legacy Effects team crafted a custom suit for the actor that featured integrated animatronics, which allowed for unnervingly precise articulation, from the aliens subtle twitches to the grotesque unfurling of its limbs.Barba described how his team worked closely with the practical effects department to enhance the aliens unsettling emergence. The real magic came from layering digital enhancements over the incredible work Robert and the Legacy Effects team brought to the scene, Barba reveals. The digital touches were minimal but deliberate, amplifying the aliens unnatural movements and adding a glistening, almost wet texture to the suit that made it feel alive. The scene also relied heavily on lighting and camera placement to maximize its visual discomfort. lvarez wanted lighting that accentuated the aliens grisly features while casting Bobroczkyis imposing frame in looming shadows. Combined with tight camera angles, the effect was a claustrophobic, almost voyeuristic experience that immersed audiences in the horror of the transformation. Roberts performance was so striking that we knew we didnt want to overshadow it with too much digital work, Mahan states. The key was to let the practical suit and his physicality shine, while the digital elements served as the finishing touch. The result was a sequence that encapsulated Romuluss hybrid approach to effects and stands as one of the films most memorable moments, capturing both the raw terror and visual ambition that define the Alien franchise.***Legacy Effects worked closely with director Fede lvarez to create the Xenomorph puppets and suits, Cocoon, Rook animatronic puppet and design/makeup effects of the Offspring. To watch a fascinating behind-the-scenes pictorial and video journey through Legacys development of the creatures for Alien: Romulus, click here: https://www.legacyefx.com/alien-romulus0 Comments 0 Shares 88 Views
-
DANCING TO THE ANIMATED BEAT OF PIECE BY PIECE
By TREVOR HOGG
Images courtesy of Focus Features.

Producer/director/writer Morgan Neville has become known for his musician profiles, whether highlighting backup singers in 20 Feet from Stardom or sitting down with a legendary rock guitarist in Keith Richards: Under the Influence. What is different with Piece by Piece is that the life and career of Pharrell Williams is depicted not through the traditional documentary means of talking heads and archival footage but instead as a LEGO animated feature. Two versions were made of the documentary, with the more traditional approach used as a rough template for its animated counterpart.

Director/writer Morgan Neville describes his LEGO persona as "the perfect pasty, disheveled, bespectacled documentary filmmaker that I am!"

"It had some original and archive footage, music videos, other movie clips and some drawings to get the story in place as much as we could before we took it into animation," Neville states. "I remember it being complicated. If you're basing something on a photo, do you need to license it? There is stuff like the Oprah footage, which we licensed and then basically animated one-to-one with what the original scene is, if you look at it side by side with his performance on that show. That is maybe 25% to 30% of the film." Licensing footage was worth the cost. "Making an animated film on a smaller budget, those sequences were gifts because even though they were going to cost as much as something we had made up, we didn't have the time to make it up," remarks Animation Director Howard Baker. "We still storyboarded it so that the animation studios could start breaking the scene down. If it was live-action, their heads would have exploded!"

Pivotal in visually translating Williams' creative process was his condition called chromesthesia, or sound-to-color synesthesia, where sounds appear as colors. "Synesthesia was the one thing that unlocks this fantasy gear and was perfect for animation," Neville believes. "That scene worked so well that when we showed it to people they said, 'I want to see that movie.'" Given a visual representation are the catchy musical beats that are the foundation of Williams' songs. "I had three people who I worked with quite closely throughout the show, and we felt that the film needed a LEGO hook," Baker recalls. "We always had big bowls of LEGO and LEGO toys in our story room and were playing around with them. We started making these things saying, 'This is like this or that sound.' We made a whole bunch of them, and our producers in India created them in CG. We sent them over to Pharrell's company, i am Other, and they had ideas and reasons why things didn't work. Each beat ended up having a specific personality."

When Gwen Stefani talked to director/writer Neville, she was already animated and quite over the top, so using her interview as actual dialogue felt natural.

Some of the dialogue for Snoop Dogg came from a podcast he did with Pharrell Williams.

Interviews rather than scripted dialogue drive the narrative. "Whenever I was doing interviews, I would ask, 'What did the room look like? What did you and they say?'" Neville explains. "In the case of Pharrell's grandmother, she is gone, so he got his aunt to do the voice, but we didn't script anything. Even the banter between Pharrell and Snoop Dogg came from a podcast they had done together, and suddenly you can transform that anywhere, like backstage at the concert. There was that playfulness of being in moments more rather than just narrating something with pictures. We could time travel in a way through the film, which you can't do in documentaries but easily can do in animation."

Rather than making everything perfect, which is possible in animation, an effort was made to incorporate mistakes that would appear in a live-action documentary.

Some of the interviewees were a natural fit, while others required editorial assistance when it came to timing. "When N.O.R.E. and Gwen Stefani were talking to Morgan, they were already animated and quite over the top, so using their interview as actual dialogue felt natural," Baker notes. "But then Teddy Riley didn't seem quite as natural; his acting felt like it might turn out to be stiff; however, it ends up becoming the character." The director is part of the cast. "My mini-me is the perfect pasty, disheveled, bespectacled documentary filmmaker that I am!" Morgan laughs. "My hair has gotten much whiter than it was at the time." Everyone had opinions about the design of their LEGO persona to varying degrees. "Because Pharrell was such a main character, it took a long time for us to get there," Baker states. "Missy Elliott was involved in the design of her character, which is one of the most successful because she was in there pointing out things to do to make her feel comfortable. No Doubt was easy about their caricatures; they saw one version, gave some notes, and that was that."

A conscious decision was made not to have a production designer developing a unifying look. "I wanted all the different designers to bring what they thought it should look like into the picture so all of these places end up having a natural personality that was different," remarks Baker, who was based at Pure Imagination Studios. Set and environmental designs were divided between Tongal and Zebu Animation Studios. Zebu Animation Studios and CDW Studios each did one-third of the animation, and the remainder was completed by animators hired by Pure Imagination Studios, who were already proficient in the LEGO stop-motion animation style. "There were definitely studios we knew were better at certain things," Neville observes. "CDW Studios was good at water, and the water effects are amazing. There were definitely animators who were good at base animation, and we gave them the close-up scenes."

Daft Punk was very particular about the design of their LEGO personas.

Real footage was shot of Pharrell Williams returning to his hometown of Virginia Beach, Virginia, which in turn inspired the animation for the scene.

Piece by Piece may be best described as creative nonfiction.

The scene that served as a proof of concept was when high school student Pharrell Williams listens to "I Wish" by Stevie Wonder for the first time on a ghetto blaster.

A point of reference for the camera style and lensing was Moonlight.

Unlike animation, where everything is created from scratch, documentaries need to adapt to real settings and circumstances. "We did a lot of things that were common in documentaries that are uncommon in animation, like montaging through space and time," Neville states. "My sense is that the big LEGO movies have some giant incredible anchor sets that they live in a lot, and we were constantly skipping through space and time from location to location. The number of sets we had would dwarf what you would normally find in a LEGO movie. In a way, it wasn't about building all of these amazing castles. We just needed to capture the essence of different places and times. Teddy's studio, Virginia Beach and New York each have a feel."

Five years was spent making Piece by Piece, which is not uncommon for an animated feature.

Animation allowed for a deeper exploration of the characters, such as Justin Timberlake and Pharrell Williams.

The aspect ratio was 2.39:1. "The film that I talked about the most in terms of the look was Moonlight because I love the cinematography, and it has this warm, funky anamorphic look," Neville explains. "There was a unifying lens look throughout the whole film even though at times we break into these archival sections that actually have a different look. Some of the archival sections we exported from finished 4K animation onto VHS and re-imported from VHS. I don't know if that has ever been done before! In terms of framing, we did a lot of tights and mediums because that felt cinematic."

Early on, a topic of conversation was the limitations of LEGO animation. "A big one was dance," Neville states. "LEGO figures don't bend, and there is a lot of movement in the film. We had a lot of discussions about, 'How do we represent dance as much as we can?' Howard has a dance background and has played with this. Cracking that was a major thing for us." A 24-hour-long video of Pharrell Williams' hit "Happy" came in handy. "We watched that for many hours looking for a lot of dance references to put all over the film," Baker reveals. "I'm a big believer in if you can draw it, you can probably animate it. We would draw it out and show it to the animators, and sooner or later they would give us a version of it that felt right."

Getting the visual treatment were the catchy beats composed by Pharrell Williams. Here is an example of that in a sequence going from storyboard, layout, animation, ambience, lighting to final.

Piece by Piece expands the boundaries of documentary filmmaking. "For a long time, documentaries were seen as just journalism with pictures," Neville notes. "There have been people over the decades who have pushed that, such as Errol Morris, Werner Herzog and Wim Wenders. The audience is ready and hungry for it. When I get asked, 'Is your film a documentary?' I say, 'It's creative nonfiction.' A documentary comes with a rulebook, and Piece by Piece is deeply faithful and truthful, but is it pure journalism? No. However, that's not what I tried to do. It's cinema first. That kind of liberation is good for filmmakers and audiences."

The experience has been career-altering. "It's to Pharrell's credit for making this animated because animation is an emotional metaphor which allows us not to be factual but at the same time make it believable," Baker remarks. "You can have a singing mermaid and no one questions it. Making a story about a real person's life, then making it as emotionally visual as we get to do in animation has opened my eyes to being able to get deeper into characters and letting them tell the story. There came a point where I realized that what we do with animation is so visual and what they do now in documentary and live-action is so character-driven that I was able to bring those two things together in a way that is unique and mind-bogglingly eye-opening."
-
JELLYFISH BALANCED BOMBED CITIES, SMOKE AND VFX TO CAPTURE THE STORY OF LEE
By OLIVER WEBB
Images courtesy of Jellyfish Pictures and Sky UK Ltd, except where noted.

Released in September for a limited theatrical window in the U.S. and U.K., where it was recently nominated for British Independent Film Awards (BIFA) for Best Effects and Cinematography, Lee stars Kate Winslet as World War II photojournalist Lee Miller and explores the story behind Miller's rise to fame as a fashion model turned war correspondent for Vogue magazine. Adapted from the 1985 biography The Lives of Lee Miller, written by Miller's son Antony Penrose, Lee is a harrowing portrayal of one of America's most acclaimed photographers.

Production VFX Supervisor Glen McGuigan and Jellyfish Pictures VFX Supervisor Ingo Putze led the team from pre-production to post. "VFX Supervisor Glen McGuigan approached Jellyfish Pictures back in December 2022 to partner on the movie Lee," Putze says. "Glen knew of Jellyfish through Jon Park, who is one of our stellar VFX supervisors. Jon was already busy with another project, so I was pleased that Lee ended up on my desk."

Kate Winslet is WWII photographer Lee Miller in Lee. The VFX requirements on Lee encompassed a diverse range of tasks, including set extensions and digital matte paintings to help enhance Director of Photography Pawel Edelman's live-action photography.

When it came to initial conversations about the look of the film, Jellyfish had a very short window between locked edit, shot turnover and delivery. "We decided to do concept art and paint over plates of almost all the key scenes to support the creative vision of the director, Ellen Kuras," Putze explains. "It was very helpful to have that common visual language to execute the workload to make the delivery in the given timeframe. For the look, we found a language that kept the story focused on the foreground, balanced with bombed cities, smoke and VFX in the background without distracting the audience."

Kate Winslet and Andy Samberg in Lee. Part of the visual effects work consisted of transforming modern landscapes into WWII-era London and Germany.

Due to the fast turnaround and nature of the project, Jellyfish held remote reviews almost daily to finesse the work creatively while director Ellen Kuras was in New York. "This meant we could work the shots up without impacting the schedule," Putze notes. "Producer Kate Solomon and Post-Production Supervisor Portia Napier were instrumental in ensuring we had maximum access to Ellen and were included in the grading process to make the final detailing efficient. Once everything was in a good place creatively, we hosted 4K screening room reviews in our theater for sign-off. It was a true collaboration in every sense of the word and made for the smoothest possible execution."

Lee Miller's original photographs served as the main source of reference and inspiration for Jellyfish. Having access to Miller's private archives proved to be invaluable in order to accurately depict her experiences. Translating Miller's black-and-white analog still photographs into live-action moving imagery while preserving their original impact proved to be a challenging task for the VFX team. "Since the film was focused on Lee's photography, the VFX needed to capture both the tone and style of her photos," Putze remarks. "Lee's war photography is black-and-white, grainy, high contrast, with a handheld camera, somewhat impossible to marry with the smooth, perfectly-captured pictures by [Director of Photography] Pawel Edelman, but we found a golden middle to integrate them into the look of the VFX scenes."

Lee Miller's original photographs served as the main source of reference and inspiration for Jellyfish. (Photo courtesy of Elevation Pictures)

Jellyfish were tasked to match the bombed-out church shot where Miller took her most famous photo. "We managed to find out how this building looked before the bombing through old postcards; today it is a modern building on the same spot," Putze explains. "Miller's photo 'You Will Not Lunch in Charlotte Street,' which is featured in one shot, was actually Goodge Street. A lot of photo material was taken from similar buildings and architecture to recreate London in the 1940s. Miller witnessed the world's first use of napalm bombing. As a base, we colorized the original black-and-white photo, then replaced it with real reference and added simulated smoke explosions, animated water, etc."

Translating Miller's black-and-white analog still photographs into live-action moving imagery while preserving their original impact proved to be a challenging task for the Jellyfish VFX team.

The close-up of the bombed church in London was one of the most challenging shots. "The blueprint for the shot was established when Lee takes a photo of the church. We needed to be true to her photo and also match the live photography, which was Kate Winslet on bluescreen with a pile of bricks. Extensive research went into it and getting the right texture elements for the DMP; in execution it was a 2.5D projection to simulate the camera move. It was certainly a lot of work for 15 frames in focus."

The VFX requirements on Lee encompassed a diverse range of tasks, including set extensions and digital matte paintings to help enhance Edelman's live-action photography. "There is a scene where Lee is at an airfield where we needed to add CG planes in the background, then landing on a field airport in France and adding a sea of army tents with animated planes on the runway; adding injuries, wounds and burns to a soldier; adding fire to a burning building; simulating propaganda leaflets raining from the sky; hotel scenes and Miller's home, which were shot on bluescreen and needed BG replacement; [as well as] adding explosions, ground impacts and smoke to the combat scenes," Putze details.

CG planes were added into the background for the airfield sequence, along with a sea of army tents in the foreground. The VFX needed to capture both the tone and style of Lee Miller's photos.

The Jellyfish team was extremely moved and emotionally challenged when researching original photos of the Dachau KZ camp where Miller witnessed atrocities by the Nazis. "To be authentic, we needed to add dead bodies and compose them into the live action," Putze says. "Understandably, some artists found it hard to work on this scene, which of course we respected, so we had to assign the tasks and handle the feedback incredibly sensitively." Feedback for this sequence was handled delicately while still achieving the level of realism that was required to honor Miller's documentation of war and the brutality of its visuals.

Jellyfish were also responsible for creating bombed cities and incorporating explosions.

Jellyfish added explosions, ground impacts and smoke to the combat scenes.

When it came to initial conversations about the look of the film, Jellyfish had a short window between locked edit, shot turnover and delivery.

Authentically recreating Lee's journalistic endeavors of WWII was a delicate balancing act for Jellyfish, which delivered 180 VFX shots for the film.

The visual effects work also consisted of transforming modern landscapes into WWII-era London and Germany, creating CG war planes, enhancing wounds, adding injuries, extending crowds, adding CG fire and water hose elements to the burning Vogue building, replacing windows using bluescreen, creating bombed cities and incorporating explosions. Jellyfish delivered 180 visual effects shots for the film. "At Jellyfish Pictures, we have a strong environment and DMP team," Putze notes. "A lot of layers and cards came from that department and were projected in a 2.5D setup in compositing. This made the show much lighter in 3D tasks." (A minimal sketch of such a projection setup appears at the end of this article.)

Authentically recreating Lee's journalistic endeavors of WWII was a delicate balancing act for Jellyfish. Comments Putze, "I'm incredibly proud of the team and what we achieved. Usually, VFX projects are faster, bigger, stronger, creating VFX you often have never seen before. Lee, in contrast, is a true artistic film with a strong emotional message. It shows the importance of reporting the atrocities of war from the first female war photographer. We really enjoyed calming down the amplitude of VFX to not overtake the story. I was proud to recommend this film I worked on to my family and friends. My personal background is art direction and matte paintings, which were used heavily in the execution of Lee, so it was close to my heart."
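For readers curious what the 2.5D setup Putze describes looks like in practice, here is a minimal sketch of a camera-projection rig built with Nuke's Python API. It is a generic illustration of the technique only, not Jellyfish's actual script; the file path, node names and transform values are invented for the example.

```python
# Minimal 2.5D projection sketch in Nuke Python (illustrative only).
# Assumptions: a hypothetical DMP layer on disk and invented transform values.
import nuke

# The matte-painting layer supplied by the environment/DMP department.
plate = nuke.nodes.Read(file="dmp_church_layer.####.exr")

proj_cam = nuke.nodes.Camera2(name="ProjectionCam")  # matches the still's vantage point
render_cam = nuke.nodes.Camera2(name="RenderCam")    # carries the short camera move

# Project the painting through the static projection camera...
project = nuke.nodes.Project3D()
project.setInput(0, plate)
project.setInput(1, proj_cam)

# ...onto a card standing in for the geometry, offset in depth for parallax.
card = nuke.nodes.Card2()
card.setInput(0, project)
card["translate"].setValue([0.0, 0.0, -50.0])

# Render the card through the moving camera; several cards at different
# depths give the layered parallax of a 2.5D environment.
render = nuke.nodes.ScanlineRender()
render.setInput(1, card)        # geometry/scene input
render.setInput(2, render_cam)  # camera input
```

Stacking a handful of such cards at different depths is what lets a compositing team sell a short move, like the 15 frames on the church shot, without a full 3D build.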
-
ILM GOES INTO ALTERNATE MODE FOR TRANSFORMERS ONE
By TREVOR HOGG
Images courtesy of Paramount Pictures.

After returning to feature animation with Ultraman: Rising, ILM released its second offering, this one drawn from a franchise whose live-action films the VFX company had become synonymous with. Transformers One was directed by Josh Cooley, who set the story back when sworn enemies Optimus Prime and Megatron were inseparable, mischievous buddies called Orion Pax and D-16.

"One of the good things about ILM is that we have many different varieties of visual effects shows, so our pipeline and assets need to be versatile," notes Animation Supervisor Stephen King, who divided and conquered the animation with fellow supervisors Rob Coleman and Kim Ooi. "The pipeline itself didn't change too much from how we handle our visual effects shows. Where there is a difference is, rather than working on a shot for this or that sequence, on an animated feature everything moves forward together within a sequence. There is a lot more time and structure built into these animated features, which is nice."

Lead Character Designer Amy Beth Christenson explores color options for Airachnid.

King had previously worked on Transformers: Revenge of the Fallen, Transformers: Dark of the Moon, Transformers: Age of Extinction and Transformers: The Last Knight, as well as Transformers: The Ride 3D. "Not only did we use and tweak some of the specific tools that were developed for the live-action films to help them transform, we also learned that because these are big, bulky characters, sometimes it's hard to get clean silhouettes for action and fighting," King states. "It meant that we had to stress strong dynamic poses so the audience can read [the action], and there's not that confusion because the action happens quickly. It was good because the director, Josh Cooley, was into us suggesting things like, 'You would get a stronger pose if we do this kick instead of a punch here.' It was a great collaborative effort."

Rather than follow the norm of transforming into a vehicle, Alpha Trion takes the shape of a lion.

Storyboards were created, and the initial layout was provided by Christopher Batty at Paramount Animation, who served as the cinematographer on the project. "We always had something that helped to ground us, but it is freeing," King remarks. "Michael Bay is a fantastic cinematographer and does cool camera moves, but you're always trying to put them into the plate, so you're cheating the camera. Whereas with this one, you're doing the action as it would be done on set. You adjust the camera to the action." Unlike Michael Bay, who is known for two-second shots, Cooley had an opposite approach. "Josh was keen to do long shots. There is a shot on the rooftop that is over a minute long. He said, 'Why do things with multiple cameras, takes and cuts when I can let the audience get in and breathe what's going on?' Sometimes, we had multiple animators working on the same shot. One animator can focus on one specific action and another can work on other bits that are happening around it. We tried to bring it up to a certain level, show it to the director, and if we're heading in the right direction then we'd add some polish and refinement and keep going," King explains.

Character Designer Evan Whitefield conceptualizes B-127 as a miner.

The toys, comic books, live-action movies and the animated series were all considered points of reference. "We started to visually unpack everything and discussed what we wanted our proportions to be and what is the style of the movie," states Lead Character Designer Amy Beth Christenson. "Josh and Jason William Scheier [Production Designer] wanted to lean into Art Deco and J.C. Leyendecker. Josh wanted the faces to emote, so we did some studies of panel work going back to the 1980s cartoon, which had much more metal skin and worked much better to get the facial expressions out. It always defaulted back to the 1980s cartoon, which was nice because from the get-go this was going to be an animated movie with its own style. Being such an uber fan, I also wanted to make sure it was distinctly recognizable, pulling in the main parts while still doing something new."

Characters had to fit into the overarching style but still have distinct elements in their own right. "I made sure that the negative space inside the helmet was distinctive for every single character, and they had a shape language," Christenson remarks. "It was nice going back and forth between the helmet and the face, making sure that they felt of one piece. B-127 has a lot more round shapes than anybody else because he's friendlier and funnier." The character design reflects the arc of the cast members. "Josh brought up the fact that a lot of the times when you're growing up with your friends, you'll dress and act the same. We started almost with the same silhouette for Orion Pax and D-16. But if you look at Megatron, the toy from the 1980s, he is square. We went from round shapes for D-16, and I began adding in more squares and repeating triangle angles because that's a much more aggressive shape. That's the best example where we changed the shape language but tried to keep the silhouette so you could recognize D-16 as a miner with a cog as Megatron," Christenson explains.

Amy Beth Christenson imagines Bumblebee with a cog and his alternate mode.

Adding further complexity to the character designs was the fact that Transformers are given that name for a reason. "Being that it was going to be such a clean style for this film, it immediately became clear that they were going to have to transform onscreen without cheating," Christenson states. "Beyond that, there are three versions of Orion Pax and Megatron and two versions of most of the other characters. We had to solve the transformations on the concept art side to make it easier for everybody down the line. Once we had that narrowed down, I immediately started building everything in 3D as a concept art model and kept it in the same file. I would have the robot and the alt mode, and I would make sure that I was instancing the piece over so that the chest is going to be the same scale and piece as the front of the big rig." Concept art models were keyframe-animated to demonstrate the transformations. "We had to work out everybody at the same time to make sure that Bumblebee didn't suddenly transform bigger than a big rig, like Megatron's tank. For Elita and Bumblebee, we had to keep them hollower on the inside so when transforming into a bike or car they took up less space, because there were these cavities where things could go in tighter. Whereas Orion Pax and D-16 were solid on the inside and have parts that can expand, which allowed them to get bigger than they should," Christenson explains.

A facial study of Megatron.

Being given the ability to transform alters the mindset of the characters, which was incorporated into the animation. "As D-16 becomes Megatron, he gets bigger and heavier, and you can feel the weight," King remarks. "Orion Pax is a dreamer, has a carefree bounce to his step and doesn't have the weight of responsibility. But as Orion Pax transforms, he is forced into that leadership role, and we changed his pose. Orion Pax stands more upright and his shoulders are back more. Not only does he physically change but also his persona." Developing an emotional connection with the audience meant having the characters emote in a realistic manner. "Josh still wanted them to feel like robots, so we don't have a cheekbone that our skin rises over. The eyes move more like camera lenses and apertures. We didn't make them blink because those were saved for more emotional beats."

For each shot, an animator would capture reference footage of themselves performing the required action. "It's my favorite style because it helps the animators get into the characters and is definitely where the industry has moved to get that realism and detail that audiences want to see now," King reveals. "Because we didn't want to go down the wrong path for the long shots, we would often show Josh our reference before presenting him with the animation blocking." Directing the eye of the viewer was accomplished through lighting and color. "It's a colorful, saturated movie, but for each shot [Production Designer] Jason William Scheier directed the eye [of the viewer] while subtly darkening or desaturating the background so that our characters would pop out without the audience noticing."

A variety of facial expressions are examined for Orion Pax.

Among the most distinct characters is Sentinel Prime's enforcer, Airachnid. "She was a lot of fun to animate because we had her walking on her legs and doing big acrobatic flips into her transformation," King explains. "On top of that, she has all of these sets of eyes on the side, which is a big story point, giving them the spider quality of always looking around independently. You feel the importance of it exactly when she locks all of them onto something." Whereas most of the Transformers become a vehicle, Alpha Trion takes on the form of a lion with a unicorn horn and electric tail. "Because he's older and from this previous generation of Transformers, the same amount of energy wasn't needed. We played Alpha Trion smaller and subtler to make him feel wise." An ancient alien race is the archenemy of the Transformers. "The Quintesson boss has these floaty tentacles, so we've got this nice organic movement underneath her, which is a nice juxtaposition to the robotic movements," King notes.

Considerable effort went into making sure the negative space inside the helmet was distinctive for every single character and that the helmet and face felt of one piece.

A sketch of Orion Pax in his alternate mode.

Transformations happen with background characters as they go about their daily lives. "There was a lot of world-building in a movie like this," King states. "We did anything to make it feel like a real organic world. We did a couple of iterations with the fan and getting the timing right. It's supposed to be this big fan that turns on and pushes a lot of air, so it can't move too quickly, otherwise it won't have that weight, but at the same time, it had to be fast enough to be dangerous. We had it go fast on the initial startup, then it slowly decreases over time." Fun was had with the vehicles. Describes King, "D-16's tank has some nice details such as independent treads and legs so he can maneuver around uneven surfaces. Elita is a tri-bike with independent suspension, so when she's banking around corners, we tested and pushed how far she could lean over and get cool, dynamic poses, especially when beating up all of the Trackers in the middle of the movie."

Serving as the main antagonist is Sentinel Prime, voiced by Jon Hamm.

Taking part in the Iacon Race is a female racer and her alternate mode.

Leading the ancient alien race known as the Quintessons is Quintus Prime.

Mapping out the evolution of Orion Pax into Optimus Prime.

Exploring how the scale changes going from D-16 to Megatron.

Malleable metal skin was given to their faces to enable the characters to emote better.

Wreaking havoc as the enforcer for Sentinel Prime is Airachnid, who has the ability to transform into a helicopter.

B-127/Bumblebee gets excited about being able to generate knife hands once a cog gets inserted into his body.

Orion Pax was given a carefree demeanor.

The eyes were treated as camera lenses to allow characters like D-16/Megatron to emote while remaining robotic.

The primary visual reference for the movie was the 1980s animated series.

Reaching the surface of Cybertron, B-127, Elita, Orion Pax and D-16 encounter Soundwave, Starscream and Shockwave.

Harkening back to the wild craziness of the pod race in Star Wars: Episode I - The Phantom Menace and the car race in Ready Player One is the Iacon race that Orion Pax and D-16 hijack despite their inability to transform. "From the first script read, that was the sequence ILM keyed on because we knew it was going to take all of our skills and departments," King reveals. "We have characters racing and interacting, then we add the complexity of this road that has to come out, transform in front of them, move and have its own life. It would have been very difficult to do in live-action, so that is one of the great things about doing an animated feature."

Watch the transformation videos featuring key character designs and models in Transformers: A Design Case Study from ILM. Click here: https://www.ilm.com/art-department/transformers-one-case-study/
-
HOW TERRITORY STUDIO DEVELOPED THE GRAPHIC LANGUAGE FOR ATLAS
By TREVOR HOGG
Images courtesy of Territory Studio and Netflix.

Given that computer technology has become an integral component of everyday life, whether at work or home, user interfaces and screen graphics can easily be taken for granted, but a lot of thought goes into designing them, as reflected in the work Territory Studio did for the Netflix film Atlas. The sci-fi adventure, starring Jennifer Lopez and directed by Brad Peyton, required 300 screens, HUDs (Heads-Up Displays) for Arc robots, and a way of visualizing the voice and emotional state of an AI entity known as Smith. These tasks were guided by Production Designer Barry Chusid and Visual Effects Supervisor Lindy DeQuattro.

Territory Studio spent more than two years on the project. "I always say that our storytelling with the universal language of design is what helps directors sometimes escape an entire scene because we can show a map with an A, a B and a line, and the whole audience quickly gets that these guys are going from A to B," states Marti Romances, Co-Founder and Creative Director of Territory Studio. "We look at the script and figure out what can be done practically. We understand how DPs love to get the lens and the light that comes from certain screens, but in the case of Atlas, there were a lot of things that were impossible to shoot in-camera, like holograms, which happen in a visual effects pipeline. The reason why we were on Atlas for so long was that we were prepping the graphics to be played back on set and working with the art department to develop the language. As shooting was wrapping, we were already doing concepts for the post-production visual effects."

Inspiring the graphic design for the International Coalition of Nations were government and military-industrial-complex institutions like the CIA.

Having art directors and production designers act as a bridge between production and post-production ensures that there is no miscommunication, as they understand the story behind each set, prop and graphic. "There are projects where we did the on-set graphics and someone else did the post and vice versa," Romances notes. "The nice thing about a project which uses the same company for both is it's a more cohesive visual language. There were a number of efficiencies. For example, we didn't need to create a lot of the design language and icons from scratch because of our on-set library." Territory Studio thrives working for Hollywood and Silicon Valley. "We believe that there will be different influences here and there from one industry to the other. General design principles apply no matter the industry. What is the best user experience for the viewer who is being entertained, and how can we make it so that a broader audience can understand? But we also do that when we're attacking an electric vehicle. You may have never had an electric vehicle, and we want to make sure you feel comfortable," Romances says.

HUDs are a familiar high-tech element in cinema, in particular those associated with Iron Man and Pacific Rim. Despite this recognition, Territory Studio starts from scratch every single time. "We are not beginning with a template," Romances notes. "Rather, we always treat it as something to be novel and more towards what the director wants. In this case the director wanted to aim for simplicity, and Pacific Rim was the opposite. Pacific Rim was like a glass hologram, whereas with Atlas, imagine you have lasers sketching a hologram. If you approach it from that angle and really listen to what the director wants, you are always going to end up with something new." The directive was for militaristic AR in 2070. "We were passing all of our toolkits and animated material to five different vendors, and the comment from Lindy was, 'If this graphic is here, I need to see the light source.' We were trying to ground it in a more realistic way, rather than having something floating, not knowing where it's anchored. Props to Lindy for keeping an eye on making sure that everyone understood this is how it should be portrayed in the frame."

The graphic for the voice of the Smith AI has a core surrounded by several transparent wrapping spheres, which gives it a simple and sophisticated visual aesthetic.

Partnering with counter-terrorism analyst Atlas Shepherd (Jennifer Lopez) is an AI known as Smith (voiced by Gregory James Cohan) that serves as the operating system for the mech robot she is piloting. "When the AI was getting angry or witty, we almost had different modes but not necessarily a big change in the design," Romances remarks. "It was more like, 'Let's play with color and the animation, and the reaction of the audio wave that is connected to the voice. How do we make sure that comes across as something that is too aggressive?' When the movie Her came out, someone said to me, 'The future doesn't have any UI.' That will never work. Humans are visceral animals. We need an anchor point. That's why Alexa has a little light that tells you if it's thinking or talking. We approached it in the same way, and Brad wanted exactly that. The brief was, 'What if we imagine an augmented reality version of the visual that we get for Siri on our iPhones that floats around you?' We did an immense number of design iterations and ended up with something that is simple yet could be mobile and could be projected from every corner of that HUD. We needed to give it some character and a connection to the vocal performance. I would have loved to do a PR stunt where we question why the Oscars are not considering Smith for Best Supporting Actor! It was a special element in the film."

Driving the graphic design process was the need to clearly illustrate story points and move the narrative forward.

Rather than rely on the typical audio waveform, which is line-oriented and has peaks and valleys much like a mountain range, a different approach was adopted to showcase the voice of Smith. "It's the same amount of distortion that you would imagine in a two-dimensional line, but we applied it to a few wrapping transparent spheres that are surrounding the core," Romances explains. "It was not only how it vibrates but how much it retracts or expands depending on tone. We added some particle elements so it isn't as simple as a few spheres on top of each other, combined with the lighting, nice compositing, animation and color. We came up with enough variables to create a real character, which was the endgame."

The Centurion Tech is used by humanoid AI terrorist Harlan Shepherd (Simu Liu) and his agent Casca Decius (Abraham Popoola). "It resembled the different blocks you get when looking at the fragmentation of a memory disk. Brad wanted Centurion Tech to be something that you could not understand. It was monochromatic, and we wanted to pair it with the same technology that Atlas has when different holograms appear in front of her. The only time we get to see a visualization of the voice from Centurion Tech is when Casca jams the signal on the Smith UI. Because Smith's system is being hacked, the visualization had to sit somewhere in between, as it was being pushed and forced to send the message."

Shifting back and forth between consumer products and film and television projects has enabled Territory Studio to apply lessons learned from both experiences to their design methodology.

Topographical projection with the light source indicated to ground the technology.

An interesting challenge was the International Coalition of Nations (ICN) Interrogation Room, which took inspiration from the CIA. "There were certain times when the director started asking for crazy things in the interrogation moment where Atlas comes out and deploys an entire system of holograms that appear next to her as she tries to extract the coordinates of Harlan's base from Casca," Romances states. "This is the type of storytelling that without those graphics would be difficult for the audience to understand what is happening."

All of the work done during shooting was extremely helpful for the lead actress. "We could show Jennifer Lopez that we are clearing out all of this space. So, when we're shooting towards the exterior of this cockpit, make sure she remembers that in this area she will have a voice talking to her, and in that area she is going to have a menu. We created all of these rules for the HUD prior to starting post-production so Jennifer could have a nice visual cue on where that will be sitting in front of her, how big and far away it will be. That was critical for her."

Innovation was required in order to be able to share 300 assets with MPC, Scanline VFX, ILM, Cantina Creative and Lola VFX. "One of the best solutions that we had for that is we created a Nuke toolkit, with the biggest one being for the Arc HUD," Romances remarks. "The nice thing about this toolkit was that in one file you have all of the different modes of the HUD, similar to drive modes in a car. When you're in combat mode, the HUD became more militaristic and combat-ready to aim and shoot. When you were in a normal mode, or what we called the initiation module, that was a completely different setup. All of these setups are versions of the same HUD with different linework, icons and colors. You just need to comp this once, then go to our script and change it to combat mode, rather than having to import a different set of graphics and EXRs again. Also, we were looking at things that have a lot of parallaxes. All of the EXRs were multi-layered and coded into the Nuke script, which means you don't have to create different depth layers. You create that once. We could update the EXR layer without having to re-engage the whole graphic system again in the final comp. This is where you find efficiencies, especially when working with other vendors." (A schematic sketch of this kind of mode switching appears at the end of this article.)

To assist Jennifer Lopez in believably interacting with the Smith AI, an Arc HUD guideline was put together so she had a sense of the layout, position and function of each graphic.

The Smith AI voice graphic was treated as a floating augmented reality version of the Siri icon found on the iPhone.

The Centurion Tech was not meant to be comprehensible, as it was created by AI and not humans.

Among the elements used to give the Smith AI a personality were variants in color, and the core expanding and retracting.

In order to be able to share 300 assets with MPC, Scanline VFX, ILM, Cantina Creative and Lola VFX, a Nuke toolkit was created by Territory Studio that streamlined the process.

Every project starts from scratch for Territory Studio, as the focus is on capturing the vision of the director.

One of the important aspects of graphics is orientating the audience to where the various characters are situated in relation to each other.

The Arc HUD has various modes, with one of them being for combat.

Around 120 on-set screens were created that were all animated, including the coffee machines at the ICN. "One interesting part that we never experienced before is that there were a few scenes where Brad wanted to take our on-set graphics and augment them in post," Romances notes. "This was a beautiful moment for us because we're now being asked to extract those on-set 3D graphics, which is not usually the case. There is a moment where Atlas is synchronizing to a different level with Smith and sees all of the graphics that we had produced for on-set floating towards her. And there's another shot where she's in the medical bay and is trying to call Smith to save her. To have the director say, 'You did this, all of this language. Can you now bring it into post?' That was a special moment for us because it's like seeing the manual for Smith. It was cool. Sadly, there is a lot of cool stuff that never made the cut, like a floating visor for the Centurion. That's another thing I would highlight. Sometimes, there is so much more that we do that never gets to see the light of day, but we had lots of fun doing it."
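The mode-switching efficiency Romances describes can be pictured with a small Nuke Python sketch: one multi-layered EXR carries every HUD state, and a shared script re-points the comp at a different layer instead of importing new graphics. The layer names, file path and node graph below are invented for illustration; Territory Studio's actual toolkit is proprietary and far more elaborate.

```python
# Illustrative sketch of a one-file, many-modes HUD comp (not Territory's code).
import nuke

# Hypothetical layer names assumed to exist inside the master EXR.
HUD_MODES = {
    "initiation": "hud_initiation",
    "combat": "hud_combat",
}

# One multi-layered EXR holding every HUD mode, depth layers included.
hud = nuke.nodes.Read(file="arc_hud_master.####.exr")

# A Shuffle node selects which layer feeds the rest of the comp.
mode_select = nuke.nodes.Shuffle(name="HUD_ModeSelect")
mode_select.setInput(0, hud)

def set_hud_mode(mode):
    """Swap linework, icons and colors by re-pointing at another EXR layer."""
    mode_select["in"].setValue(HUD_MODES[mode])

set_hud_mode("combat")  # one call instead of re-importing a new set of EXRs
```

The appeal of this arrangement for a multi-vendor show is that downstream nodes never change: a vendor comps the HUD once, and switching modes or receiving an updated master EXR leaves the rest of the graph untouched.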
-
WWW.VFXVOICE.COMSTAGING THE FINAL SHOOT-OUT AND GETTING FLIPPED OVER A CAR FOR WOLFSBy CHRIS McGOWANImages courtesy of AppleTV+.After having worked together on two MCU Spider-Man movies Homecoming and Far From Home and having had a great time doing so, writer-director Jon Watts and VFX Supervisor Janek Sirrs teamed again on Wolfs, an action comedy in the fixer category that is light years away from the MCU. Regarding Watts, Sirrs recalls, I think I helped with his baptism of fire into the Marvel VFX world, so perhaps he felt he owed me one. Either way, I think we both have a similar dark sense of humor and an appreciation of the absurd. He probably thought I was a good fit for Wolfs given its black comedy undertones.Previously, Sirrs shared an Oscar for The Matrix as Associate Visual Effects Supervisor and shared nominations for Iron Man 2 and The Avengers as Visual Effects Supervisor. For The Matrix, I originally came on board to help out with a show that was already up and running, but the role dramatically expanded once I arrived on site, Sirrs says. On both of those Marvel shows I was the Overall Production Supervisor, [arriving] during early development stages, running all the way through until the last shots were delivered in post.Brad Pitt and George Clooney in the freezing cold of New York. Despite shooting in the middle of winter in New York, it only snowed about 30 minutes total, so, for consistency, nearly every shot with snow in the final movie has digital snow in the air and/or on the ground.Because we werent allowed to film on the real [Brooklyn] bridge itself, we recreated select limited portions a construction site at the base, a gangplank running through the arches, the catwalk above a cross street, the ladder that leads from the catwalk to the top and the roadway up top as partial set pieces surrounded by bluescreen that were then digitally extended. All these partial sets were built in the overflow parking lot at Six Flags Magic Mountain amusement park just north of Los Angeles [in Valencia].Janek Sirrs, Visual Effects SupervisorIn the Apple TV+ film Wolfs, two lone-wolf, super-efficient rival fixers (George Clooney and Brad Pitt) are called in separately to clean up the seemingly accidental death of a twenty-something man (the Kid) in a very high-end hotel room. The panicking guest is Margaret, a Manhattan district attorney (Amy Ryan) who fears a huge scandal and has contacted an unnamed fixer, identified later only as Margarets man (Clooney), to cover things up. Meanwhile, Pam (the owner of the hotel, voiced by Frances McDormand) has viewed everything through hidden cameras and called in another fixer, referred to as Pams man (Pitt), to tidy up.The film relied solely on storyboards, many of them mocked up by director Jon Watts using a 3D storyboarding program called Frameforge.Most reluctantly, the two professionals are compelled to join forces despite their egos and bickering. (Clooney and Pitt previously worked together on the Oceans trilogy.) Complicating matters further, the Kid (Austin Abrams) was carrying a backpack full of drugs that belongs to the Albanian mafia. Plus, the Kid wasnt actually dead; he was merely overdosing. After waking up, he escapes, clad only in his underwear, and Jack and Nick must chase him down freezing New York City streets. 
To clean up a mess that just keeps growing, Jack and Nick must also deal with various unexpected factors such as when they get sidetracked by a Slavic kolo dance at a wedding and are recognized by Dimitri, a Croatian mobster (Zlatko Buric). The cast also includes June (Poorna Jagannathan) and the Kids dad, Richard Kind.In addition to writer/director Watts, the production included Larkin Seiple (Cinematography), Jade Healy (Production Design), Andrew Max Cahn (Supervising Art Director), and Conrad V. Brink Jr. and Elia P. Popov (Special Effects Supervisors). Plan B Entertainment and Smokehouse Pictures co-produced with Apple, which has signed Watts to direct, write and produce a Wolfs sequel.Clooney and Pitt talk to blood-covered (Amy Ryan). Interiors were mostly a stage build shot at the Ace Mission Studios in downtown Los Angeles.We had originally thought that the car-flipping moment would be the trickiest, requiring more extensive digi-double work than we ultimately needed. What saved us from uncanny valley hell was deciding early on that wed never need to show the entire stunt in one unbroken moment and could split it into several smaller, more manageable chunks. Added into the mix was Austin [Abrams] game-for-anything attitude, which meant that we never had to deal with any sort of digital face replacements.Janek Sirrs, Visual Effects SupervisorThe primary vendors that handled the bigger scenes were Framestore and beloFX. Framestore essentially did the major stuff in the first half of the movie, while beloFX focused on the latter half. Capital T took care of all the other miscellaneous smaller FX across the whole picture. Rodeo FX was also brought in to specifically handle the final diner scene, Sirrs explains. No previs was done on show at all. Instead, we went all primitive and relied solely on good old-fashioned storyboards. Director Jon Watts actually mocked up many of his own boards himself, using a 3D storyboarding program called Frameforge.Pitt and Clooney threaten the Kid (Austin Adams on the bed) to find out the origin of the stolen drugs in his backpack. In Wolfs, two lone-wolf rival fixers (Clooney and Pitt) are called in separately to clean up the seemingly accidental death of the Kid in an upscale hotel.Exteriors, except for the final warehouse shoot-out and the Brooklyn Bridge partial set work, were filmed on location in New York, which meant a series of long, cold winter nights running around Chinatown and parts of Queens and Brooklyn, Sirrs recalls. The warehouse exteriors were filmed in the thankfully much warmer warehouse district just east of downtown Los Angeles. And the ensuing warehouse interior action was shot in one of those warehouses. Pretty much every other interior was a stage build shot at the Ace Mission Studios in downtown Los Angeles.Bluescreen was used for the Brooklyn Bridge portion of the chase sequence. Sirrs explains, Because we werent allowed to film on the real bridge itself, we recreated select limited portions a construction site at the base, a gangplank running through the arches, the catwalk above a cross street, the ladder that leads from the catwalk to the top and the roadway up top as partial set pieces surrounded by bluescreen that were then digitally extended. All these partial sets were built in the overflow parking lot at Six Flags Magic Mountain amusement park just north of Los Angeles [in Valencia]. This meant we could have real cars driving through the upper roadway set. 
Sirrs adds, "The obvious big VFX scenes were the chase through Chinatown and the immediately following Brooklyn Bridge action. But the shoot-out outside the warehouse is pretty much 100% VFX-enhanced in terms of the weather conditions. We did utilize LED projection for the various car interior driving scenes, but only as simple flat screens projecting process plates, not any sort of 3D volume."

Pitt and Clooney threaten each other in the wedding party scene, where they are recognized by the Croatian mobster Dimitri (Zlatko Buric), surrounded by cameras, lighting and sound equipment in a behind-the-scenes shot.

The Brooklyn Bridge action and surrounding driving scenes were probably the most complicated scenes of the film, logistically. "Stunt driving in New York is very restrictive these days, with essentially all vehicles having to follow the normal flow of traffic and not exceed posted speed limits. Taken together, that makes constructing a high-speed chase scene a little bit challenging. The final result relied upon a combination of practical stunt driving and digital vehicles for the faster, more extreme moments," Sirrs explains.

"Stunt driving in New York is very restrictive these days, with essentially all vehicles having to follow the normal flow of traffic and not exceed posted speed limits. Taken together, that makes constructing a high-speed chase scene a little bit challenging. The final result relied upon a combination of practical stunt driving and digital vehicles for the faster, more extreme moments."
Janek Sirrs, Visual Effects Supervisor

Snow, both falling and settled on the ground, was an invisible effects challenge. "Despite shooting in the middle of winter in New York, it only snowed for real for about 30 minutes total," Sirrs notes. "Shooting permits only allowed SPFX-crushed ice to be put down on sidewalks, not on the streets themselves, and wind conditions in the street made using SPFX foam snow towers erratic at best. So, we really didn't capture the desired weather look in-camera. In the final movie, nearly every shot that you can see snow in has a healthy dose of digital falling snow and/or digital snow ground cover. Snow accumulation also had to track across the entire picture, starting from clear to a more winter wonderland look by the time we reach the climactic shoot-out."

The Kid (Austin Abrams) executes a flawless flip over a moving car, with the help of one digi-double shot, bluescreen, a buck and wires, and Abrams' game stunt work.

Director Jon Watts consults with Pitt and Clooney, who previously worked together on the Ocean's trilogy. Exteriors, except for the final warehouse shoot-out and the Brooklyn Bridge partial set work, were filmed on location in New York. The final shoot-out and warehouse exteriors were shot in Los Angeles.

Behind the scenes on Wolfs with writer/director Jon Watts, a veteran director of films in the MCU, including three Spider-Man films.

The scene of the Kid leaping and flipping over a car was a particular challenge and involved VFX and Austin Abrams doing his own stunt work. "With the exception of one digi-double shot, everything is ultra-high-speed composites of Austin on a bluescreen stage being hit with a soft foam, blue, car-shaped buck and flipped on wires, combined with plates of the real car driving slowly through the Chinatown street location," Sirrs describes.
"On stage, we could light Austin brightly enough to shoot at roughly 1,000 fps, but on location we were limited by how much light we could physically cast onto the street, and so we had to fake slow motion by shooting at 24 fps and driving very slowly. Adding slow-motion digital falling snow atop everything was the final step in selling the frame-rate trickery. We had originally thought that the car-flipping moment would be the trickiest, requiring more extensive digi-double work than we ultimately needed. What saved us from uncanny valley hell was deciding early on that we'd never need to show the entire stunt in one unbroken moment and could split it into several smaller, more manageable chunks. Added into the mix was Austin's game-for-anything attitude, which meant that we never had to deal with any sort of digital face replacements."

The most satisfying VFX scene for Sirrs was the Kid flipping over a car, for sheer comic value. Abrams endured a great deal to make the scene work. Sirrs comments, "If it wasn't bad enough that we forced Austin to run around the freezing Chinatown streets at night completely exposed to the elements, we then added insult to injury by running him down with a car."
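The frame-rate trickery Sirrs describes comes down to simple arithmetic: footage captured at 1,000 fps and played back at 24 fps slows motion by a factor of roughly 42, so to get the same on-screen look while actually recording at 24 fps, the car itself has to move about 42 times slower than it should appear. A minimal sketch of that calculation (the speeds are illustrative, not production figures):

```python
# Fake slow motion: how slowly must the car drive so that 24 fps
# location footage matches the look of 1,000 fps stage footage?
# (Illustrative numbers only, not actual production figures.)

CAPTURE_FPS_STAGE = 1000   # high-speed capture on the bluescreen stage
PLAYBACK_FPS = 24          # standard playback rate
CAPTURE_FPS_LOCATION = 24  # what the crew could actually shoot on the street

# Footage captured at 1,000 fps and played at 24 fps slows motion down:
slowdown = CAPTURE_FPS_STAGE / PLAYBACK_FPS          # ~41.7x slower

# To get the same on-screen motion while capturing at 24 fps,
# the subject itself must move slower by that same factor.
apparent_speed_mph = 25.0                            # how fast the car should *look*
required_real_speed_mph = apparent_speed_mph / slowdown

print(f"Slowdown factor: {slowdown:.1f}x")
print(f"Drive the car at ~{required_real_speed_mph:.2f} mph "
      f"to mimic {apparent_speed_mph} mph in slow motion")
```

At that crawl, almost nothing else in frame moves convincingly, which is why the slow-motion digital falling snow was the final ingredient needed to sell the effect.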
-
WWW.VFXVOICE.COM
CONQUERING THE STREAMING HIGH SEAS WITH VIKINGS: VALHALLA SEASON 3
By TREVOR HOGG
Images courtesy of MPC and Netflix.

Setting sail on the third season of the spin-off series Vikings: Valhalla was MPC, which acted as the sole vendor, supported by Take 5 Productions' in-house team. MPC facilities in Toronto, Mumbai and Bangalore provided 500 visual effects shots for eight episodes, with the main focus on creating CG environments that existed in the 11th century, in particular Syracuse, Constantinople, Greenland, Poland, Kattegat and Winchester Cathedral. Other significant challenges involved a collapsing cliff and a magic trick played upon villagers who, attempting to burn Freydis Eriksdotter (Frida Gustavsson) at the stake, instead find their own homes set ablaze.

"[MPC Ocean tool] gets you 80% of the way there. For the shots where it wasn't working, we started figuring out, 'Do we tie all of these patches of the ocean? Do we create a new ocean? Do we need to add more detail or foam to help blend this into our environment?' Having done water for 1,400 shots over the last two seasons, it was great to use a new tool for that and see what worked and didn't work. They had to fix and retool some things for us, if not by request, just by us breaking and using it for what they didn't think it was for."
Ben Mossman, VFX Supervisor, MPC

One of the major new environments that had to be created for Season 3 was Constantinople.

"That was a big story point in Episode 308, where everybody thinks Freydis is a witch, and she starts lighting things on fire in the town, but in reality it's the other guys who set them up," states Ben Mossman, VFX Supervisor at MPC. "We did put a flame bar in front of her, but it had to be four feet out for safety so none of the actual wood caught fire. There is a fake façade of the little pile that she's attached to that we put out further. They had a flame bar that would go up when it's supposed to happen, so we actually had real fire to cue it, but there wasn't much that could be done practically right around her. For the roof earlier in the sequence, we did light it on fire, but we had to add more [roof] and replace some of it. There was some practical smoke on set; however, the rest of it we had to add in for them to be able to run away and be enveloped by it. It was about getting the timing down and rehearsing when things are supposed to happen, like, 'Now, you can't see them. Now they're running away.' Because it was fire, there wasn't much else we could do! It's also fire during the daytime too, so it's not like you get a big glow of interactive light. It's there in all of its brightness."

"It took us quite a bit to get there in terms of the timing of the animation, look, what elements do we want to see, how much of this cliff do we want to collapse, and how big are the pieces [of rock falling from the cliff onto the Viking ships]? We have water splashing the rigging and sails, and Vikings being ejected. There was a version where there were definitely more guys getting obliterated, and we dialed it back. One shot we did was full CG, but everything else had to tie back in with practical water."
Ben Mossman, VFX Supervisor, MPC

Most of the shot designing was determined by storyboards, with previs being done for Episode 303 to figure out what the digital extensions for Constantinople would look like with the plates captured in Croatia, and where in the city to shoot.
"We took a model of Dubrovnik and did some fly-throughs in Unreal Engine to mock up what the extensions would be like, or add an army to see how that filled the space," Mossman remarks. "One of our on-set guys, Adam Stewart, did a lot of our scanning and also had a background in animation and modeling. Occasionally, we would ask him to help us out with some previs and scouting. We were able to get an early scan of Croatia that we had purchased, so when we got there we were already able to drop cameras into a physically accurate location, which was cool. That was helpful because even while in Croatia scouting, we were able to go back at the end of the day, adjust our cameras to what the directors were wanting to see, and show them a new version of some of their shots to see how they were feeling about it."

The physical set for Kattegat is the same one from the original Vikings series on the History Channel, which took place 150 years earlier. Mossman notes, "There were things that were legacy that were brought over, but otherwise it was a new showrunner, stories and worlds, so we had a lot of room to depart from that."

Modern-day boats had to be replaced with period-accurate vessels.

No drastic design changes were made to the Viking boats over the course of the three seasons, as the passage of time is measured in years, not decades. "We were able to move around a lot of the boats between the various Viking factions; however, there were different shields and flags," Mossman adds. "Kattegat grows over the seasons, so we expanded the town, and the same with London, to give it little updates that people might notice. Characters are traveling all over the world, especially in Season 3 where they are in Italy, Turkey and Poland. There was always a big crop of environments that would come up that would change every season that we would have to figure out how to do and how extensive it was going to be."

Syracuse and Constantinople were the two big new environments and were seen at a hero level. Mossman explains, "Syracuse and Constantinople were historical cities that existed, so you wanted to try to figure out what they actually looked like a thousand years ago and still make it work for what we needed for the story. We looked at a lot of fortresses that were built in Spain, Turkey and the Middle East that had the same architecture as what would have been in Syracuse at the time. Constantinople was a huge city, and pieces of it are still in Istanbul today. There were maps and lots of architectural examples from that time, which the Byzantines either took from Rome or created themselves."

Bluescreen assisted in getting the proper scope for environments and the correct number of soldiers.

Construction methodologies from the period were factored into the digital assets. "In general, it's not being machined or being created by advanced construction techniques," Mossman remarks. "It's all being done by hand with ropes and pulleys. That goes with any of the environments that we built, where you want this feel that these are trained people who are good at what they do, but there is unevenness in the bricks and not 100% precision in how everything fits together." The Syracuse and Constantinople assets were handled by MPC Bangalore. "We have an amazing environments team there that we got to work with, and they were insanely fast! Work was shared all across the sites." A huge water component had to be accommodated by the pipeline.
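One standard way to hide the seam between adjacent simulated water patches, the kind of decision Mossman weighs below, is to cross-fade their height fields across an overlap region with an ease curve. The numpy sketch that follows is purely illustrative; it is not how MPC's proprietary Ocean tool works internally:

```python
import numpy as np

def smoothstep(t: np.ndarray) -> np.ndarray:
    """Hermite ease curve: 0 -> 1 with zero slope at both ends."""
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def blend_ocean_patches(height_a: np.ndarray,
                        height_b: np.ndarray,
                        overlap_start: int,
                        overlap_end: int) -> np.ndarray:
    """Cross-fade two ocean height fields along the x axis so the
    seam between simulation patches disappears. Outside the overlap,
    each patch is used unmodified."""
    assert height_a.shape == height_b.shape
    h, w = height_a.shape
    x = np.arange(w, dtype=np.float64)
    # 0 before the overlap, easing up to 1 after it
    weight = smoothstep((x - overlap_start) / max(overlap_end - overlap_start, 1))
    weight = np.broadcast_to(weight, (h, w))
    return (1.0 - weight) * height_a + weight * height_b

# Toy example: two wave fields with different frequencies
x = np.linspace(0, 40, 400)
patch_a = np.tile(np.sin(x), (100, 1))               # patch A waves
patch_b = np.tile(0.8 * np.sin(1.3 * x), (100, 1))   # patch B waves
seamless = blend_ocean_patches(patch_a, patch_b, overlap_start=150, overlap_end=250)
```

In production the same cross-fade would also have to run over displacement vectors and foam masks, but the principle of easing one patch into another is identical.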
Mossman observes, "The advantage that MPC Toronto had in not using the MPC Ocean tool much before was we broke it immediately, which was cool! It gets you 80% of the way there. For the shots where it wasn't working, we started figuring out, 'Do we tie all of these patches of the ocean? Do we create a new ocean? Do we need to add more detail or foam to help blend this into our environment?' Having done water for 1,400 shots over the last two seasons, it was great to use a new tool for that and see what worked and didn't work. They had to fix and retool some things for us, if not by request, just by us breaking and using it for what they didn't think it was for."

"There are 10 birds flying around that have a string tied to their legs with a parcel of fire on the end. The story point is Harald Sigurdsson [Leo Suter] gets these birds drunk to sedate them, so we had to figure out what a drunk bird looks like. Our technical animation department simulated the feathers on the birds as well as the string coming off them, and then effects would take over the rest of the string and fire so it's interactively moving around. The fire conveniently does not quite go up the string until they're out of the prison, but they end up lighting the whole prison on fire when going back to their nest."
Ben Mossman, VFX Supervisor, MPC

Rocks being pushed off a cliff to destroy the invading Viking ships below in Episode 304 reads small on the page. "It took us quite a bit to get there in terms of the timing of the animation, look, what elements do we want to see, how much of this cliff do we want to collapse and how big are the pieces?" Mossman explains. "We have water splashing the rigging and sails, and Vikings being ejected. There was a version where there were definitely more guys getting obliterated, and we dialed it back. One shot we did was full CG, but everything else had to tie back in with practical water. We had a real boat floating in this quarry lake that the set was built on, and for the collapse moment we added a second version that was already half in the water. The special effects team put in water cannons to shoot up with debris and water. It was a good blend to start with, but as the edit and look changed, we deviated further from that to get to the story that they wanted to tell."

MPC Toronto broke the MPC Ocean tool when dealing with numerous shots featuring water.

Traveling to England was not possible, so the art department built a floor, some pillars and a casket for the Winchester Cathedral funeral sequence, while the remainder was digitally augmented. "They wanted this dramatic lighting for the death of a character we have seen in the last two seasons and to fill the rest of the church with patrons to pay their respects," Mossman describes. "That was a small set, so most of it was done with bluescreen." Crowds in the cities were given special attention. "We had one of our mocap and animation leads in Toronto, Charlie DiLiberto, create these little vignettes of people talking and waving. That would be passed off to our crowd team working in Houdini in Bangalore, and they would incorporate it into the crowd system. If we saw that something was missing, another round of performance would be added."

"For Season 1, we did almost 1,000 shots, and this season was half that, but with the timelines and the complexity of the work and wanting it to look better with every season, it was just as hard to nail down some of these big environments, and we had a lot more effects-driven story points, like the rocks falling and the fire.
With that comes more attention from everybody wanting it to look good and hit the beats that they're after. It was honing in on that stuff and being able to execute it within the time that we had with our showrunner, producers and editors."
Ben Mossman, VFX Supervisor, MPC

The birds flying around with fire appeared in only a couple of shots in Episode 307, but the sequence was technically difficult to execute, as several departments had to divide and conquer it. Mossman says, "There are 10 birds flying around that have a string tied to their legs with a parcel of fire on the end. The story point is Harald Sigurdsson [Leo Suter] gets these birds drunk to sedate them, so we had to figure out what a drunk bird looks like. Our technical animation department simulated the feathers on the birds as well as the string coming off them, and then effects would take over the rest of the string and fire so it's interactively moving around. The fire conveniently does not quite go up the string until they're out of the prison, but they end up lighting the whole prison on fire when going back to their nest."

A dramatic moment in Episode 304 is when rocks are pushed off a cliff and sink an invading Viking ship.

Buildings were constructed digitally, keeping in mind the tools and techniques used during the period being depicted.

Landscapes were altered to give environments a more cinematic quality.

Skies were replaced to make shots moodier.

Clouds were among the atmospherics added to shots.

Working on Vikings: Valhalla involves constructing familiar boats and creating new environments.

Fewer shots and a shorter post-production period do not entirely reflect the effort compared to previous seasons. "For Season 1, we did almost 1,000 shots, and this season was half that, but with the timelines and the complexity of the work and wanting it to look better with every season, it was just as hard to nail down some of these big environments, and we had a lot more effects-driven story points, like the rocks falling and the fire," Mossman states. "With that comes more attention from everybody wanting it to look good and hit the beats they're after. It was honing in on that stuff and being able to execute it within the time that we had with our showrunner, producers and editors."

Working with the same client and creatives over the three seasons has streamlined the process. Mossman comments, "We have been able to keep together a chunk of the team, so that helped to keep the consistency and shorthand, especially with animation and matte paintings." He says the combination of the new and the old keeps the work interesting. "It's fun because we have these new things every season that we can create, and then there is the stuff we know how to do and didn't have to build new assets for. We can knock those shots out first and feel good about them while we're scratching our heads over these giant environments and armies that are new to the season."
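The crowd approach Mossman outlines above, a handful of captured vignettes fed into a Houdini crowd system, comes down to scattering agent points and giving each one a clip, a randomized playback offset and a facing, so that a few performances read as a varied crowd. A rough, tool-agnostic sketch of that assignment step (the vignette names are hypothetical, and this is not MPC's actual setup):

```python
import random

# Hypothetical vignette library: (clip name, length in frames)
VIGNETTES = [("talking_pair", 240), ("waving", 180), ("idle_shuffle", 300)]

def populate_crowd(num_agents: int, seed: int = 7):
    """Assign each scattered crowd agent a vignette clip, a random
    start offset (so clips don't play in lockstep) and a facing."""
    rng = random.Random(seed)
    agents = []
    for i in range(num_agents):
        clip, length = rng.choice(VIGNETTES)
        agents.append({
            "id": i,
            "clip": clip,
            "frame_offset": rng.randrange(length),  # de-synchronize playback
            "facing_deg": rng.uniform(0.0, 360.0),
            "scale": rng.uniform(0.95, 1.05),       # subtle size variation
        })
    return agents

crowd = populate_crowd(500)
print(crowd[0])
```

If a region of the crowd still reads as repetitive, the fix is exactly what Mossman describes: capture another round of performance and add it to the library.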
-
WWW.VFXVOICE.COM
PIXOMONDO PUTS THE DRAGON INTO HOUSE OF THE DRAGON SEASON 2
By TREVOR HOGG
Images courtesy of Pixomondo and HBO.

Pixomondo has been associated with Game of Thrones since that show's second season, and the relationship carried over to the spin-off series House of the Dragon, culminating in a full-scale dragon battle referred to as the Dance of the Dragons. Overseeing the eight-episode contribution, which consisted of 600 shots with 150 of them featuring the title creatures, was HBO Visual Effects Supervisor Dadi Einarsson, who was determined that the dragons be realistic and that their distinct personalities and size differences be emphasized.

"The challenge wasn't just creating complex, organic creatures with muscles and loose skin reacting naturally; it was also about ensuring the audience could emotionally connect with them," states Sven Martin, VFX Supervisor at Pixomondo. "This way, when the story demands, viewers can truly feel the loss of these big pets. From a visual standpoint, Dadi was committed to grounding the effects in real-world behavior. He avoided impossible camera movements and made sure to silhouette the dragons against bright skies, just as you would if filming them in the real world. This attention to detail helped seamlessly integrate the visual effects shots with the surrounding live-action photography, enhancing the realism of these awe-inspiring creatures."

Actress Eve Best (Rhaenys Targaryen) sat on a motorized buck to simulate riding a dragon.

Final composite of Rhaenys Targaryen riding Meleys to battle, combining footage of the actress with a digital environment and dragon.

"[The Battle of Rook's Rest] scene features intricate aerial combat between three dragons while a massive army lays siege to a castle below. We worked closely with director Alan Taylor and [VFX Supervisor] Dadi Einarsson to visualize the scale of the battle and choreograph the dragon movements, ensuring the sequence aligned with Alan's epic vision for the scene. This sequence required extensive techvis for the dragon-riding scenes and battles, which involved significant visual effects elements. This process helped plan the shoot and ensured smooth integration of CGI with live-action."
Matt Perrin, Senior Visualization Supervisor, Pixomondo

When mapping out the major tasks, the most ambitious one was the Battle of Rook's Rest in Episode 204. "One of our key priorities was ensuring the dragon rigs were efficient enough for the animators, allowing them to work fluidly even when handling scenes with three massive dragons," Martin remarks. "We also knew that during aerial dragon battles, we would need to rely more heavily on CG fire rather than practical elements. Controlling fire simulations, especially at high speeds, over long distances, and with dragons spiraling through the air, while accounting for the effect of their wing flaps on the flames, was a critical focus early on. Additionally, we anticipated that the dynamic, agile camera work would require a fully CG replica of the original shooting location. To achieve this, we began by building the environment based on a LiDAR scan and aerial photography, ensuring a seamless integration of visual effects with the live-action footage."

Plate photography of actor Tom Glynn-Carney in front of a bluescreen and a blue dummy head to frame where the dragon will be when added in post.
Final VFX shot of King Aegon Targaryen preparing his dragon Sunfyre for battle.

Only minor adjustments were made to Vhagar and Syrax, but extensive sculpting and texture work was required on Meleys, which was modeled by MPC in Season 1. "As Meleys was featured more prominently in Season 2, often in bright daylight and undergoing several stages of wounds and damage, we needed to enhance her detail," Martin explains. "Our art department used photo references of red lizards to overpaint images, ensuring large portions of these photos remained intact to avoid an overly painterly or artificial look. In fact, some of the perfect references were found in a local pet shop, where I stumbled upon a dark red lizard that matched our vision. Since Sunfyre was barely seen in Season 1, we created his model from scratch, investing considerable time in sculpting, texturing and shading to ensure he lived up to his title as the most beautiful dragon in Westeros, as described in the books. Throughout production, Ryan Condal [Executive Producer/Showrunner] and Dadi Einarsson worked closely with us to fine-tune Sunfyre's look, balancing his golden beauty without pushing too far into the realm of fantasy so he would still feel grounded in the scene alongside other dragons. Additionally, we introduced several new dragons in Season 2, including the new mysterious wild dragon, Tessarion [also known as the Blue Queen], a new baby dragon named Stormcloud, and a revival of the original baby dragons from Game of Thrones."

Plate photography of soldiers and battlefield.

Final VFX shot of dragon Meleys wreaking fiery havoc on the battlefield, with Rook's Rest Castle in the background.

Several key sequences required previsualization, with the focus primarily on the dragon flight scenes. "Notable examples include Daemon and Caraxes' journey to Harrenhal and Moondancer's chase across the Crown Lands," remarks Matt Perrin, Senior Visualization Supervisor at Pixomondo. "The most complex sequence we previsualized was the Battle of Rook's Rest. This scene features intricate aerial combat between three dragons while a massive army lays siege to a castle below. We worked closely with director Alan Taylor and Dadi Einarsson to visualize the scale of the battle and choreograph the dragon movements, ensuring the sequence aligned with Alan's epic vision for the scene. This sequence required extensive techvis for the dragon-riding scenes and battles, which involved significant visual effects elements. This process helped plan the shoot and ensured smooth integration of CGI with live-action. The time spent designing and technically deconstructing the sequence for shooting proved worthwhile, as the final result closely matched the previsualized version."

Playblast of Meleys, showing some of the controls from the dragon animation rig.

Refined beyond typical levels was the previs animation for Caraxes and Moondancer. "This animation helped define the motion of the cameras and dragon flights, informing the programming of the motion-control camera rig and the multi-axis gimbal buck rig," Perrin states. "Guided by previs, the team was able to execute precise, repeatable movements, especially in action-heavy dragon fight scenes!
This ensured that the final footage of these dragon sequences closely matched the vision established during previs, resulting in highly dynamic shots that seamlessly blend practical elements with CGI."

"[The previs animation for Caraxes and Moondancer] helped define the motion of the cameras and dragon flights, informing the programming of the motion-control camera rig and the multi-axis gimbal buck rig. Guided by previs, the team was able to execute precise, repeatable movements, especially in action-heavy dragon fight scenes! This ensured that the final footage of these dragon sequences closely matched the vision established during previs, resulting in highly dynamic shots that seamlessly blend practical elements with CGI."
Matt Perrin, Senior Visualization Supervisor, Pixomondo

Plate photography of actor Ewan Mitchell looking on at the destruction post-fight.

Final VFX shot of Aemond Targaryen looking at the crash site of Sunfyre, with additional smoke and fire to underline the post-apocalyptic feel of the battle's aftermath.

A huge amount of interaction and sharing took place between previsualization, technical setup, virtual production and the post-production visual effects team. "We built virtual replicas of the stage and rigs that we would use on the shoot," states James Thompson, Virtual Production Supervisor at Pixomondo. "That meant that we could easily build and reconstruct the moves for the dragons and the cameras from the CG into the real world. For the virtual production, this data was fed through our real-time system to trigger the rigs, lighting and any other elements running on the LED wall on set. A lot of the elements we used on the LED panels came from the previs, like the shadows created from the wings of the dragons. The animation data from the previs was solved, interpreted and used to run the buck rig and the robot that the camera was mounted to. For post, if needed, we were able to provide data from the shoot, such as corner pin data to help with any re-positioning and tracking of plates back into the shots."

It is one thing to create a dragon but quite another to have to add a rider. "A critical part of making the dragon riders appear believable was making sure that when we took the animation from previs and solved it to work on the set with the rigs, it was deconstructed as little as possible," Thompson reveals. "We really needed to have the buck with the actor on it move as closely as possible to the character flying in the previs. Sometimes we had to dial back the speeds and range of movement quite a bit to fit within the mechanical limitations of the rigs. Getting this right would mean the actors' bodies would endure, as close as possible, the real physical forces of the movements that would be impossible to act out accurately. Other techniques we employed to enhance believability were things like adding cue bloops on the panels for the actors so they would know when to act out a certain performance. We also used eyeline markers so they knew where to look at a given time based on the previs and the current move they were doing on the buck. This, combined with the various lighting effects that we added onto the LED walls, such as fire, helped sell all of this."

Grayscale model of the bridge to Dragonstone.

Full CG environment of the bridge to Dragonstone.

"A critical part of making the dragon riders appear believable was making sure that when we took the animation from previs and solved it to work on the set with the rigs, it was deconstructed as little as possible.
We really needed to have the buck with the actor on it move as closely as possible to the character flying in the previs. Sometimes we had to dial back the speeds and range of movement quite a bit to fit within the mechanical limitations of the rigs. Getting this right would mean the actors' bodies would endure, as close as possible, the real physical forces of the movements that would be impossible to act out accurately."
James Thompson, Virtual Production Supervisor, Pixomondo

Carried over from Season 1 were the assets for Dragonstone and the Grand Sept. "One significant change was the decision to shoot Dragonstone Island in a new location, a quarry that wasn't even near the sea," Martin notes. "However, it provided a natural base for the castle and had a mountain behind it that we transformed into Dragonmont. Since the original location in Spain was no longer available, and with far more exterior scenes around the castle than in Season 1, production chose a real location where we could digitally add missing elements, like the iconic zig-zag bridge and the crescent-shaped island with its gate. I appreciated this approach, as these new plates gave us the ideal foundation for seamlessly blending our CG extensions. Using a scan of the location, we modified the castle to fit its new setting and incorporated a set-build for the entrance, which hadn't been prominently featured before. The Grand Sept, originally a real-time asset used on the LED stage in Season 1, was also adaptable for the traditional Maya pipeline since it had been meticulously built. Driftmark's drydock was a 3D extension of the backlot set, complete with additional ships under repair. Meanwhile, The Eyrie and its surrounding alpine landscape were created using 3D geometry and concepts provided by the art department."

Final VFX shot of Rhaenys Targaryen flying dragon Meleys across the sea to battle.

Previsualization of Rhaenys Targaryen flying dragon Meleys towards battle.

For shots of the dragon riders, the actors sat on a motorized buck and were filmed with a motion-control camera, both of which were driven by the motion data from the previs. Pixomondo devised a new system that allowed for live comps of those buck plates over the previs on set, so the filmmakers could get a better sense of what the final shot would look like while shooting.

"[T]he greatest challenge is ensuring the emotional depth isn't lost amidst the technical demands. The story of the riders and their dragons, their conflicts and the impact on them was always a central focus. For instance, when Meleys succumbs to her demise at the end of the sequence, we worked closely with [VFX Supervisor] Dadi [Einarsson] to perfect her performance, paying meticulous attention to the timing of her eye blinks and the subtle details as Vhagar's jaws close around her. The intricate skin simulations and the depiction of blood channeling between her scales were the finishing touches, enhancing the emotional weight of the scene."
Sven Martin, VFX Supervisor, Pixomondo

Among the complex shots to execute were the numerous aerial ones featuring two to three dragons fighting over a fully CG battlefield. "One particularly challenging shot involved Vhagar stomping through the battlefield in slow motion after a crash-landing, which caused an explosion of ash, dirt and smoke," Martin states. "The original plate was shot in bright sunlight and needed to be transformed into a drastically different environment.
This required integrating CG soldiers, interactive smoke, ground simulations with exploding dirt and crushed soldiers into the scene. The compositing team faced significant challenges with extensive rotoscoping and dimming the sunlight to achieve the desired effect. In such a technically complex sequence, the greatest challenge is ensuring the emotional depth isn't lost amidst the technical demands. The story of the riders and their dragons, their conflicts and the impact on them was always a central focus. For instance, when Meleys succumbs to her demise at the end of the sequence, we worked closely with Dadi to perfect her performance, paying meticulous attention to the timing of her eye blinks and the subtle details as Vhagar's jaws close around her. The intricate skin simulations and the depiction of blood channeling between her scales were the finishing touches, enhancing the emotional weight of the scene."

Bluescreen resembling a staircase allows for proper beach interaction between rider and dragon.

Digital doubles were necessary to get the proper scope of the crowds.

Devastation was digitally added to the pristine landscape.

Adding further complexity to the aerial battles was having to simulate dragon fire. "The rapid movements and high speeds of the dragons made it difficult to ensure that the fire behaved physically accurately," Martin remarks. "The simulations needed to align perfectly with the real flamethrower elements captured by the special effects team. Additionally, we had to account for sparks when the fire hit the opposing dragon and integrate smoke to enhance the visual complexity. We opted for simulated wounds rather than pre-sculpted ones to allow for last-minute animation adjustments. Using Houdini, we simulated dragon claws ripping through the skin, causing inner fat and flesh to push out while blood gushed from the newly created wounds. For the flying scenes, we used digital wisps and clouds to enhance movement and parallax against the distant ground or ocean. Instead of traditional cloud renderings from Maya or Houdini, we rendered volumetric clouds and skies in Unreal Engine. This approach enabled us to make extremely fast adjustments to speed, position and lighting." Dadi Einarsson, HBO VFX Producer Thomas Horton and the rest of the HBO production team fostered a collaborative environment. "Their deep understanding of our complex workflows and their appreciation for the dedication of everyone at Pixomondo made the process not just enjoyable, but a wonderful ride, or better said, dance."
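The corner pin data Thompson mentions handing off to post is, at bottom, a four-point perspective warp: four tracked screen positions define a homography that maps a plate's corners onto them. A small sketch using OpenCV, offered as an assumption about tooling rather than Pixomondo's actual pipeline (the plate and tracked coordinates are placeholders):

```python
import cv2
import numpy as np

def corner_pin(plate: np.ndarray, dst_corners, out_size):
    """Warp a plate so its four corners land on four tracked screen
    positions (order: top-left, top-right, bottom-right, bottom-left)."""
    h, w = plate.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(dst_corners)
    # Homography from the plate's corners to the tracked corners
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(plate, matrix, out_size)

# Stand-in for a filmed plate and hypothetical per-frame tracked corners
plate = np.zeros((720, 1280, 3), dtype=np.uint8)
tracked = [(412, 180), (1620, 210), (1588, 905), (380, 860)]
pinned = corner_pin(plate, tracked, out_size=(1920, 1080))
```

With corner data recorded per frame on set, a compositor can re-position or replace whatever was playing on the LED wall without having to re-track the plate from scratch.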
-
WWW.VFXVOICE.COM
TAKASHI YAMAZAKI ACHIEVES KAIJU-SIZE SUCCESS
By TREVOR HOGG
Images courtesy of Takashi Yamazaki, except where noted.

Takashi Yamazaki and Stanley Kubrick are the only directors to have ever won an Oscar for Best Visual Effects.

When people think about Japanese cinema, Akira Kurosawa and Hayao Miyazaki often get mentioned, but that is not the entire picture, as renowned talent has emerged from younger generations, such as Hirokazu Kore-eda, Mamoru Hosoda, Makoto Shinkai and Takashi Miike. Another name to add to the list is Takashi Yamazaki, who matched a feat previously achieved only by Stanley Kubrick when he became the second director to win an Academy Award for Best Visual Effects, and in the process reinvigorated a legendary kaiju [giant monster] franchise with Godzilla Minus One. What impressed him most was not being handed the golden statue but getting the opportunity to brush shoulders with his childhood idol. "Receiving the Academy Award for Best Visual Effects was a great honor, but meeting Steven Spielberg at the Nominees Luncheon was perhaps an even more exciting moment," Yamazaki admits. "It was a chance encounter with the God I had longed for since childhood."

Previously, Yamazaki had established himself by adapting mangas, such as Parasyte and Always: Sunset on Third Street, with the sequel of the latter foreshadowing his feature film involvement with the King of the Monsters, as the creature appears in an imaginary scene. "That scene was a short one, but it was just about as much as we could do with the technology and computing power we had. At that time, it was impossible to complete the visual effects for a two-hour Godzilla film with our capabilities. As time went by, we were able to process information that was incomparable to that time in terms of technology and computing power, so I thought I could finally create the Godzilla I envisioned and started this project. It was a good decision to wait until this happened and make the Godzilla I envisioned."

Like the kaiju, manga are a cultural phenomenon. "The best way to succeed as a creator in Japan is to become a manga artist. Therefore, the best talent is concentrated in manga. Furthermore, the ones who survive in the very tough competition are the ones who become known to the most people. There is no reason why the stories told by those at the top of the giant pyramid should not be interesting. Adapting a comic book into a film potentially requires the characters to be the comic book itself, which is difficult," Yamazaki says.

To help define Godzilla's look, Yamazaki and the animator spent time developing Godzilla's walk in Godzilla Minus One. (Image courtesy of Toho Company)

"The science fiction genre is interesting in that it can create things that do not exist in this world. I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child."
Takashi Yamazaki, Director, Godzilla Minus One

Growing up in Matsumoto, Japan, Yamazaki had a childhood fascination with insects and crafts. "I was surrounded by nature, so I collected insects and lizards and observed them. I was also a child who preferred drawing paper to toys and would request 100 sheets of drawing paper as a Christmas present." Neither of his parents had much to do with the arts. "My father was good at drawing, and I remember that when I asked him to do something, he would do his best to draw me Ultraman or some other character."
A cinematic turning point was getting the opportunity to watch Steven Spielberg's sci-fi classic Close Encounters of the Third Kind. "What was shocking was the scene where the giant mothership flips over. With the visual effects before this, it took some effort to believe that it was real, but this was the first moment when I had the illusion that it was real."

Four-year-old Takashi Yamazaki stands in front of Matsumoto Castle with his family.

Takashi Yamazaki started off as a model maker for Shirogumi in 1986.

Godzilla destroys the Tokyo shopping district of Ginza. (Image courtesy of Toho Company)

In 2009, Takashi Yamazaki directed the action romance drama Ballad: Na mo naki koi no uta, where a young boy prays for courage to a great Kawakami oak tree and finds himself transported to feudal Japan.

A major reason that Godzilla Minus One won the Oscar for Best Visual Effects is that there were both practical and digital effects.

Yamazaki became part of the Japanese film industry while studying film at Asagaya College of Art and Design. "When I was at art school, many expos were being held in Japan, and Shirogumi, which was skilled in creating unique visuals, was producing visuals for many of the pavilions," Yamazaki explains. "There was a part-time job offer for this, and I was able to join Shirogumi as a result of participating in it. Visual effects were led by TV commercials, which had a relatively large budget to work with. We were also trying to introduce the techniques we had tried in TV commercials into film. Around the time I made my debut as a director, CG became more readily available. At that time, it was very difficult to scan live-action parts in theatrical quality, so we even built a scanner in-house that was converted from an optical printer." The pathway to becoming a director began when there was a call for pitches within Shirogumi, leading to the production of Juvenile [2000], which revolves around a tween having an extraterrestrial encounter. "The president of the company showed the idea I submitted there to Producer Shuji Abe, who was the president of another company; he liked it and worked hard on it, leading to my debut film."

Science fiction goes beyond spectacle. "The science fiction genre is interesting in that it can create things that do not exist in this world," Yamazaki observes. "I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child." With science fiction comes the need to digitally create what does not exist in reality. "I decided to become a director because I wanted to make films with the type of visual effects I wanted to make in the first place. When I made my debut as a visual effects director, most Japanese films didn't have spaceships or robots in them. I think that having three jobs at the same time is economical because I can judge things quickly and write scripts with the final image in my mind, so there is no loss of time."

Yamazaki has directed 20 feature films. "You never know what will be a hit, so when I have an original story, I only base it on whether it excites me or not. Making a film means you have to live with the original story for a number of years, so if it's not a good match, it becomes hard to get on with it. I simply ask for good actors to join the cast. I am basically a person who wants to do everything myself.
When it comes to the staff, I try to ask for people who are at least more skilled than me, people who have talent that I can respect."

In Japan, Godzilla represents both God and Monster, so Takashi Yamazaki wanted its movement to feel almost divine or God-like in Godzilla Minus One. (Image courtesy of Toho Company)

International markets are rarely taken into consideration when approving film budgets in Japan. "This is because for a long time it was said that Japanese films could not go mainstream even if they were released overseas, and that was probably true," Yamazaki states. "It was a great surprise that Godzilla Minus One was seen by so many people overseas, and to put it bluntly, it was a joyful experience that opened up new possibilities for future film production in Japan. Hopefully, the budget will reflect that element. I guess we'll just have to build up our track record and prove that pouring big budgets into it is not a bad option."

Stories scripted and directed by Yamazaki have ranged from siblings trying to learn about their grandfather who died as a kamikaze pilot in World War II in The Fighter Pilot, to contributing to the Space Battleship Yamato franchise, where an interstellar crew attempts to locate a device to make a devastated Earth habitable again, to a forbidden book that can grant any wish but at the cost of a life-threatening ordeal in Ghost Book. The growing popularity of video games has not altered the essence of storytelling. "Interesting stories are interesting in any media, and the core of stories that can be developed in various media continues to be influenced by stories that have been around for a long time."

Back in the early digital age, when Takashi Yamazaki was learning how to create visual effects.

At the age of 10, Takashi Yamazaki ventures to downtown Matsumoto with his sister Satsuki.

An extremely complex shot to design, create and execute is found in Godzilla Minus One, where a kamikaze pilot has to overcome survivor's guilt in order to protect those he loves and Japan from the rampaging title character. "The sea battle between Shinsei Maru, the mine disposal ship, and Godzilla was difficult because we had to combine a small live-action boat with CG waves and a giant Godzilla," Yamazaki reveals. "The boat in the foreground is live-action, so it was a very time-consuming job to build the waves at a level that would blend in with it. I'm glad it worked out."

When asked about the essential traits of a successful director and what has allowed him to have a long career, he responds, "What it takes to be a successful film director is to keep everything interesting all the time, but I am not sure about the career. It would be bad if a film failed, so I think it's easier to prolong my life if I get the next project off the ground before the next film is released." Yamazaki is appreciative of his good fortune. "Thanks to the many people around the world who liked Godzilla Minus One. Godzilla Minus One has received many wonderful awards. I will continue to make films, treasuring the memories of the days I created with you all. Thank you very much. Arigato."

The sea battle between the mine disposal ship Shinsei Maru and Godzilla was difficult because CG waves and Godzilla had to be integrated with the practical vessel.
-
WWW.VFXVOICE.COM
VFX IN ASIA: BOOM TIME
By CHRIS McGOWAN

Mysterious creatures fall from space, prey on humans and use them as hosts in Parasyte: The Grey. The Netflix live-action sci-fi/horror series is based on the Hitoshi Iwaaki manga Parasyte. Dexter Studios provided the VFX. (Image courtesy of Dexter Studios and Netflix)

"The Asian VFX industry is experiencing a meteoric rise driven by a confluence of powerful forces," says Merzin Tavaria, Co-Founder and President, Global Production and Operations for DNEG. "The region possesses a vast pool of highly skilled and technically adept VFX artists, a critical foundation for producing top-tier visual effects."

Jay Seung Jaegal, VFX Supervisor for Seoul-based Dexter Studios, adds, "I believe that the Asian region will become a new core base for the content and VFX industries in the future. As Asian VFX studios increasingly participate in global projects, their presence is expanding. Although they have already proven significant competitiveness and potential, I think there is still immense room for growth."

Asia is playing an evolving role in shaping the global VFX ecosystem. Key regions and cities driving the growth of the Asian VFX industry include India, South Korea, Japan, China, Taiwan and Singapore, with Bangkok and Vietnam beginning to gain traction. Homegrown VFX studios like Dexter are on the rise, and multinational firms with VFX branches in Asia include DNEG, BOT VFX, Framestore, ILM, Digital Domain, Rotomaker Studios, Mackevision (Accenture Song), The Third Floor, Tau Films, Method Studios, MPC and Outpost VFX.

South Korea has risen to become one of the most important Asian VFX hubs, and the trajectory of Dexter, founded in 2012, is one of the most impressive in South Korea. Jaegal says, "As of the first half of 2024, the company has grown into a group with six connected subsidiaries. Dexter's headquarters alone employs about 330 people, including around 200 VFX artists. Currently, Dexter Studios is active as a comprehensive content studio with an all-in-one system covering content planning, development, production, post-production, DI and sound. We are also expanding our business areas to new technology fields such as immersive content, AR/VR/XR and the metaverse."

South Korea's Gulliver Studios handled the VFX for the Emmy-winning suspense/horror/survivalist series Squid Game. Season 2 is scheduled for December release. (Image courtesy of Gulliver Studios and Netflix)

Along the way, Dexter has provided visual effects for several noteworthy films, including Parasite (2019), which captured four Academy Awards, including Best Picture. Jaegal comments, "Parasite is a significant film in Korean cinema history as it was the first Korean film to win an Academy Award. It marked a pivotal moment that showcased the excellence and prestige of Korean films to the world. Notably, Parasite is famous for its invisible VFX. Many people think that little VFX was used, but in reality, much of it was created after filming, including the two-story house of Mr. Park, the driving scenes and the neighborhood where Ki-Taek's family lives. Our company designed the VFX and DI [Digital Intermediate], and our subsidiary, Livetone, handled the sound, making us an all-rounder in post-production."

Dexter also provided VFX for Space Sweepers (2021), which "holds a special meaning as a Korean-style SF [sci-fi] film," Jaegal explains. "It successfully [put together] a unique story involving space, the spaceship Victory and robots, which had not been commonly attempted in Korea at that time.
We also handled all three post-production parts of this film. I think it redefined the standards for the space/SF genre that can be produced in Korea. Based on this experience, we [went on] to handle KSF-VFX for Netflix's JUNG_E, the Alienoid series, and The Moon." Recently, Dexter has worked on Knights of the Zodiac [with DNEG], YuYu Hakusho with Scanline VFX and Japan's Digital Frontier, Gyeongseong Creature for Netflix, and Parasyte: The Grey.

A volcano erupts on the China/North Korea border and a special team tries to save lives and prevent further eruptions in Ashfall. (Image courtesy of CJ Entertainment and Netflix)

Parasite, directed by Bong Joon-ho, won South Korea's first Academy Award for Best Picture in 2020. Dexter Studios provided the VFX for the class-conscious black comedy, most of it invisible. (Image courtesy of Dexter Studios and CJ Entertainment)

Gulliver Studios handled the VFX for Squid Game, winner of the 2022 Primetime Emmy Award for Outstanding Special Visual Effects in a Single Episode, which was among the six total Emmys garnered by the series. Squid Game VFX Supervisor Cheong Jai-hoon notes, "After Squid Game was released on Netflix, it was gratifying and meaningful to see that viewers worldwide loved it, especially considering that they couldn't tell which parts were created with VFX."

Gulliver Studios is a VFX company (also called C-Jes Gulliver Studios) established in 2019 in Goyang by C-Jes Studio. The latter manages actors, singers and K-pop artists, and is involved in the planning and production of movies, dramas and musicals, notes the firm. At the end of 2022, Gulliver Studios and C-Jes Studio merged to become a comprehensive content group that extends its scope from planning and producing theatrical films and OTT [Over-The-Top] content to post-production VFX.

The first attempt to put Koreans on the moon ends in disaster, and a second try leaves one astronaut alone and stranded in space in The Moon, a survival drama about South Korea's manned spaceflights. (Image courtesy of Dexter Studios and CJ ENM Studios)

Looking at the growth of VFX in South Korea, Jai-hoon explains, "Around 2015, there was a notable increase in the production of large-scale fantasy and action films within China, yet there weren't many proficient VFX companies in China at the time. As a result, the majority of VFX work during that period was handled by Korean companies. As Korean VFX companies gained significant experience through working on various Chinese films, it led to substantial growth in the Korean VFX industry."

"As the volume of work in Korea increased exponentially, Korean VFX companies established satellite companies in countries like Vietnam and China, where labor costs were lower, and they also outsourced a significant portion of their work to India and Ukraine. As a result, the VFX industry in Asia experienced growth during this period," Jai-hoon remarks. "By the late 2010s, the Chinese film industry faced a slowdown, which also halted the growth of the Korean VFX market. However, in the 2020s, the production of Asian content by platforms like Netflix and Disney+ revitalized the industry. Successes such as Squid Game and [prior to that] Bong Joon-ho's Parasite [also] energized the global OTT production scene in Asia."

DNEG's Indian crews contributed VFX to Godzilla x Kong: The New Empire. (Image courtesy of DNEG and Warner Bros. Pictures)
Jai-hoon adds, "Recently, there have been talks about Netflix increasing its investment in Korean content production and, following Disney+, even Amazon Prime is outsourcing a lot of work to Korean VFX companies. This signifies that the level of Korean VFX has already been recognized worldwide. Additionally, some global VFX companies like Scanline, Eyeline Studios and The Mill have recently entered the Korean market, gradually increasing their investment in Korean artists' potential. As a result, existing Korean VFX companies are building pipelines according to Hollywood's VFX pipeline and standard production processes, different from the Korean system. Also, Korean artists with experience from abroad are gradually returning to Korean VFX companies."

Westworld VFX in Goyang, Korea, was established in 2019 and has about 200 employees. Westworld handled the VFX for the Netflix sci-fi series The Silent Sea, the first large-scale project in Korea to use virtual production and LED walls. Asked about Asia's VFX growth, Managing Director Koni Jung responds, "It's difficult to say exactly, but the growth of young artists and the entry of global OTT platforms into Asia seem to be factors driving growth. [And] as Korean films and series achieve global success, an increasing number of overseas projects are being filmed and post-produced in Korea. Honestly, isn't it because it costs less than the North American market?"

Wooyoung Kim, Director of Global Business at Seoul-based VA Corporation, comments, "As the investment of OTT platforms in the Asian market expanded during the pandemic, the budget for content rose significantly, and many content projects [were] planned that [could] expand expression in a technical direction. This led to successful outcomes for VFX companies in each country, allowing them to showcase the technical skills that they may not have had in their home markets." VA successfully launched a Netflix series called Goodbye Earth, participated in Netflix Japan's project YuYu Hakusho in 2023 and is working on the movie New Generation War: Reawakened Man.

In India, DNEG has teams of thousands of talented artists spread across 10 locations (including Chennai, Bangalore, Mumbai and Chandigarh), encompassing both DNEG and ReDefine studios, according to DNEG's Tavaria. "This strategic network allows for seamless collaboration with our Western counterparts on every DNEG and ReDefine film and episodic VFX and animation project. We're incredibly proud of the vital role that India plays in DNEG's global success." Tavaria continues, "Our talented Indian teams play a pivotal role in all our top-tier international projects, from feature films to episodic television series. Just to name a few, our Indian teams have recently brought their magic to Dune: Part Two, Furiosa: A Mad Max Saga and The Garfield Movie, among others, showcasing their versatility across genres. Their expertise has also been instrumental in projects like Oppenheimer, NYAD, Masters of the Air, Ghostbusters: Frozen Empire, Godzilla x Kong: The New Empire, Borderlands and many others."

Tavaria notes, "Many Asian governments are actively nurturing the industry's growth. Take India's AVGC Promotion Task Force, for example. This initiative recognizes the significant contribution VFX makes to the economy and aims to propel India further onto the global stage.
By establishing a national curriculum focused on VFX skills development, the Task Force paves the way for India to produce even more world-class content."

Larger Asian studios are staying ahead of the curve by rapidly embracing cutting-edge technologies. This ensures their VFX offerings and capabilities remain at the forefront of the global landscape. Tavaria says, "This confluence of a skilled workforce and a commitment to technological innovation has solidified Asia's position as a major player in the ever-evolving world of VFX."

About DNEG, Tavaria comments, "We're continually working hard to refine our global pipeline to open the door to a new era of creative collaboration across our locations. This allows our Western studios and Asian teams to work seamlessly together to push the boundaries of what's possible in VFX."

Streaming has been another factor in the Asian VFX rise. Tavaria explains, "The rise of streaming, along with a flourishing domestic film market, has fueled a surge in high-quality content, presenting a thrilling opportunity for the Asian VFX industry. This explosion of content demands ever more exceptional visual effects for Asian audiences that are hungry for stories that reflect their own cultures and aesthetics."

The Masters of the Air wartime miniseries included VFX by DNEG's Indian visual artists. (Image courtesy of DNEG, Amblin Television and Apple Studios)

Extraordinary Attorney Woo centers on an autistic lawyer with a photographic memory and a great love of whales. Westworld VFX contributed effects to the Netflix series. (Image courtesy of Westworld VFX and Netflix)

South Korea's Dexter Studios co-produced and helmed the VFX for Jung_E, a near-future post-apocalyptic story about a warring faction on a desolate Earth that attempts to clone the brain of a legendary warrior to develop an AI mercenary and stop a civil war. (Image courtesy of Dexter Studios and Netflix)

Space Sweepers was South Korea's first big-scale sci-fi film, proving the country could handle the genre. (Image courtesy of Dexter Studios and Netflix)

DNEG's Indian teams worked on VFX for Ghostbusters: Frozen Empire. (Image courtesy of DNEG and Columbia Pictures/Sony)

Westworld VFX contributed effects to the Netflix series Black Knight, a South Korean television series based on a webtoon, a digital comic book read on smartphones in Korea. (Image courtesy of Westworld VFX and Netflix)

BOT VFX has four locations in India (Chennai, Coimbatore, Pune and Hyderabad) and one in Atlanta. "Our total team size is 800," says BOT VFX CEO and Founder Hitesh Shah. The firm has been working on many high-profile projects, including Kingdom of the Planet of the Apes, Fallout, 3 Body Problem, Shogun, Monarch: Legacy of Monsters, Knuckles, Civil War, The Boys 4, Furiosa: A Mad Max Saga and Bad Boys: Ride or Die.

The size of the talent pool has been growing significantly in India thanks to nearly two decades of RPM [roto, paint and matchmove] work that motivated many new entrants to join the industry. According to Shah, "What was a pool of several hundred artists in 2005 is in the tens of thousands today. Also, there is now a large base of training institutions that continually feed new talent into the ecosystem. From the large pool of talent, a portion has had the skills and the ambition to fill highly specialized and technical roles required to build full-service facilities."

About the move to provide full-service VFX for Western clients, Shah comments, "That shift is from three segments.
First, those facilities that have been historically providing full-service VFX to the Indian domestic market are turning part of their attention to Western productions. Second, independent facilities that have historically been point-services providers (for example, RPM work: roto, paint and matchmove) are shifting towards full-service VFX. Finally, even large multinational VFX companies that set up a footprint in India initially for point-services support [are] leveraging more Indian talent towards the full VFX value chain."

Shah states, "For India specifically, the growth of the VFX industry is driven by the strong value proposition it offers to Western productions in the form of three compelling components: a strong cost advantage, a large talent pool and a broadly English-speaking talent pool that has an affinity with Western content."

He adds, "Despite strong tax incentives in other global regions and trends toward rising talent costs in India, the overall cost advantage of India is still compelling. It seems most Western productions implicitly, or sometimes explicitly, expect their VFX vendors to bake the lower costs of getting RPM work done in India into their bids. Finally, the affinity for Western content and English has had a subtle but notable impact on VFX growth in India. Many young artists are bi-cultural and equally motivated to work on Western content as they are on Indian domestic films. There is a swifter cross-pollination and travel between India-based artists and team members in Canada, the U.S., the U.K. and Australia."

In addition, VFX has gained prominence in Indian content, both streaming and theatrical. "The availability and affordability of VFX in the content creator's toolbox have opened up whole new genres and the ability to tell epic Indian tales that were out of budget reach previously," Shah says. "Keep in mind that India is not just one monolithic content market, but multiple mature regional markets with their own vibrant histories of storytelling, all of which have taken a fondness to what VFX can enable. The fact that the Indian VFX market is well poised to serve both Western content and the expanding Indian domestic content gives it a firm footing in the global VFX ecosystem."

Peter Riel is Owner and CEO of Basecamp VFX, a small studio founded in 1996 and based in Kuala Lumpur, Malaysia. He argues that it's important to understand how the SEA (Southeast Asian) market works. Riel says, "Each nation here is quite sovereign, both in terms of language and culture. While it's easy to take a quick glance and see the various countries' similarities, it's a mistake to think they all share the same cultural sentiments the way perhaps the Western world does toward films made in the U.S. As an example, one would expect Malaysian movies to be popular in Indonesia and vice-versa, due to their similar language and culture, but that's far from the case. I do still think there is tremendous value to be found in SEA VFX. The artists here are extremely dedicated and are used to working fast and efficiently."

Kraken Kollective CEO Dayne Cowan has over 30 years of experience in the VFX business, has worked with DNEG and other major VFX firms, managed VHQ Media's feature film divisions in Singapore, Kuala Lumpur and Beijing, and worked for Amazon Studios South East Asia as the Post Production Lead for Local Original Movies. Earlier this year, Cowan founded a new VFX firm in Singapore. "Kraken Kollective is a next-generation post-production and visual effects management company," he says.
"It leverages the cost and skill benefits of Asia for foreign productions that would like to have the work done here but are unfamiliar with the local business environments, cultures and capabilities. Asia is a massive, diverse region with so many countries. Working here may appear straightforward, but there are many unseen challenges that we help to navigate. The skill of the talent [here] continues to grow and develop, almost at an exponential rate. When combined with technology advancements like generative AI and the sheer size of the talent pool here, it represents a serious value add."

Cowan comments, "Parts of Asia have long been known for handling entry-level work like roto, matchmove and paint work. As new technology changes the shape of things, I am seeing smaller companies emerge with stronger, specialized skill sets. I think people forget that nearly 4.5 billion people live in Asia, and with a rapidly developing talent base, it will play a huge role going forward in production and post-production. Broadly speaking, I believe we are looking at a boom time for Asian VFX."

Dexter Studios was in charge of VFX for Ashfall, a 2019 South Korean disaster film. Dexter co-produced and distributed with CJ ENM Studios.

Dexter Studios handled the VFX for Space Sweepers, a 2021 South Korean space western regarded as the first Korean space blockbuster.

Dexter Studios supplied the VFX for The Moon, a 2023 South Korean space survival drama written, co-produced and directed by Kim Yong-hwa. (Image courtesy of Dexter Studios and CJ ENM Studios)
-
WWW.VFXVOICE.COM
ARTISTIC ALCHEMY: THE PERSONAL CREATIONS OF VFX TALENT
By TREVOR HOGG

Banana Slug Vase (Image courtesy of Liz Bernard)

In essence, an alchemist can transform matter into something else, which, oddly enough, describes the creative process whereby pieces of paper, canvas, clay or a blank computer screen are turned into works of art by combining and applying different materials guided by human imagination. In the world of visual effects, digital toolsets reign supreme, but that does not mean that the traditional mediums of oil paint, pottery or watercolor have been tossed to the wayside. Outside of office hours, private collections are being assembled that, in some cases, have entered the public consciousness through art exhibitions and published children's books. To showcase the sheer breadth of artistic ingenuity, seven individuals have been curated, each of whom demonstrates a unique perspective and talent, which we have the privilege to share with you.

Liz Bernard, Senior Animation Supervisor, Digital Domain

Art has been a part of the life of Liz Bernard ever since her graphic designer parents placed an X-Acto knife in her hands as a child. Those creative inclinations have culminated in a career that has seen her animate the Alien Queen in Ender's Game, video game characters in Free Guy and a lawyer with gamma-ray issues for She-Hulk: Attorney at Law. A major source of inspiration is a deep love of nature, which Bernard draws upon when producing each piece of ceramic, whether through the art of wheel throwing or by utilizing flaming trash cans.

Parker Ridge (Image courtesy of Zoe Cranley)

"I took a day off because there is a workshop that only happens a couple of times per year at a local community arts center where you can do an alternative firing called Raku, which originated in Japan," Bernard explains. "The idea is that you fire things in a kiln. While they're still yellow hot, you open the kiln up, reach in with tongs and quickly take your pottery over to a prepared nest of newspaper situated in a sandpit; it instantly catches on fire, and you up-end a miniature metal trash can, which has even more combustibles, over your piece so as to create a reduction atmosphere. You get these crazy metallic reds and coppers, beautiful colors that are hard to achieve with other firing techniques. It's an unpredictable, chaotic, elemental experience."

"I find that my animation background influences me heavily because I'm always wanting to find an interesting pose for something," Bernard notes. "You can do a straight-on profile of an eagle or find something that has more personality to it. I love finding those personalities in animals, and I try to put that into my work. There is a lot of experimentation. One of my favorite things to do right now is called sgraffito, where I form a piece of clay into a bowl, paint the entire interior surface in black and scrape away the lighter parts. What I've been doing with these particular pieces is begin with a footprint of a local animal, like a heron, and then use the negative space to start drawing in random shapes. A different aspect of the brain gets creatively stimulated. The reason I like this so much is because it's so tactile and real. The images we make in the computer, you can't interact with using your hands. This is a nice counterpoint to what I do daily."
Visit: www.lizupclose.com

Venetian Caprice (Image courtesy of Andrew Whitehurst)

Balduin Owl (Image courtesy of Sigri de Vries)

Black Cats (Image courtesy of Sigri de Vries)

Aragon at Christmas (Image courtesy of Andrew Whitehurst)

Zoe Cranley, Head of CG, beloFX

Major franchises such as Jason Bourne, the MonsterVerse, The Hunger Games, Wonder Woman and Mission: Impossible appear on the CV of Zoe Cranley, who has transitioned from digital artist to CG supervisor to a more managerial role. Throughout all of this, a passion for oil painting has remained, leading to an exhibition at the Seymour Art Gallery in Vancouver showcasing landscapes transformed into geometric shapes and blurred lines.

"It's being in them," Cranley observes of the appeal of landscapes. "You can paint or draw anything you want. I used to do a lot of still lifes and flowers, which look pretty, but they don't mean anything to you. Landscapes are so epic, and generally, for most of the paintings I've done, I've been there, so I'm drawn back to them and can remember that exact moment. Being in Vancouver, beautiful landscapes are abundant wherever you go." Unlike visual effects, the goal is not to achieve photorealism. "When you look at a picture that is real, I don't have that desire to keep looking at it because you go, 'Oh, yeah. That's what it looks like.' I love it when people don't recognize instantly what it is, but then have an attachment. I feel like I've done what I had set out to do, which is to capture the essence of that place in an abstract way."

The Faun (Image courtesy of Mariana Gorbea)

"I've been using oils for at least 20 years and won't go back to anything," states Cranley, who is not a fan of digital painting. "There is something so magical about putting a paintbrush to a canvas. I like that it takes so long to dry and is so malleable for so long. You can do so many different things to it based on the stage of drying. Also, I like the science of the various solvents that you can use. So much of it is the fundamentals of design, color, negative space and composition. Generally, the meaning to me is what makes a nice picture. The quality of the work and brushstrokes have improved. I've gotten a lot more critical and precise. The edges are neater, and I have learned to varnish properly. I have refined the process. A lot of people have said that I've gotten more abstract. Last year, I learned how to digitize everything, which was a whole process in itself." Visit: https://www.zoecranley.art

Sigri de Vries, Compositor, Important Looking Pirates

There is no shortage of high-profile projects to work on, whether it be Shōgun, Avatar: The Last Airbender, Ahsoka or Foundation, but this has not stopped Sigri de Vries from embarking on a journey to discover her medium of choice. Along the way, she was hired to create the cover and 12 illustrations for the children's book Balduin, die Entdecker-Fledermaus by Bianca Engel. "I was expecting more kickbacks and having to re-do things and such; instead, I was given a lot of freedom with how I wanted to do the illustrations and what parts of the story I wanted to paint," de Vries states. "I started with a few sketches of the various characters, and once I got the green light on those, the rest of the illustrations were smooth sailing."

Ethereal Cathedral (Image courtesy of Marc Simonetti)

Pink Kits (Image courtesy of Zoe Cranley)

Experimentation is the only constant. "I always start with a sketch," de Vries remarks. "I erase the sketch so you can almost not see the lines and then do the watercolor and a pen on top.
I found that to be what I like aesthetically, but I'm still at the beginning of this journey where I'm experimenting a lot and looking at YouTube videos for inspiration and techniques. I follow a number of artists on the Internet and want to do what they do. I want to try everything. I've done watercolor, clay sculpting, digital art, acrylic and ink. It's my hobby, so I'm just having fun! Initially, the plan was to learn digital painting to do concept art. I did a lot of landscapes and played around with compositions. I also did a lot of matte paintings at work, but matte painting is more photo collaging than painting. As my journey progressed, I got interested in characters and creating them in a cute illustrative style."

Phil Tippett Portrait (Image courtesy of Adam Howard)

Deathly Silence (Image courtesy of Mariana Gorbea)

"When I finally had enough money to buy an iPad, I switched from Photoshop to Procreate," de Vries states. "Since then, I've been painting so much more. Procreate is so easy and intuitive, and I can paint and draw directly on the screen, which I love. What a lot of artists do is paint with an actual brush on paper, scan that and use it as a texture for a brush in Procreate. My next big project is a scanner/printer so I can do that stuff as well, because it sounds fun to make your own brushes."

Visit: https://www.artstation.com/sigri

Mariana Gorbea, Senior Modeller, Scanline VFX

Modeling creatures and characters is something that Mariana Gorbea does on a daily basis for Scanline on projects such as Game of Thrones, X-Men: Dark Phoenix and Terminator: Dark Fate, but that all occurs within the digital realm. This expertise has also been taken into the physical world, where clay is literally shaped and transformed into figments of her imagination. "I started with ZBrush and then moved on to clay," Gorbea states. "The biggest difference is that you have to be mindful of what you're doing with clay because if you mess up, those hours cannot be taken back." Lessons have been learned from working with clay. "It has made me observe more of the whole picture, to be more careful with details, composition and how a sculpture looks from all angles; that has helped me to make better sculptures in ZBrush. The tools I use with clay, I try to replicate in ZBrush and vice versa."

Gorbea is attracted to crafting fantastical creatures. "Mexican culture is fascinated with death, and some artists can turn dark things into something beautiful. I'm drawn to that, and that's why I try to sculpt creatures and characters." Designs are simplified for clay. "Building armatures is the hardest and trickiest part with clay. It has to be able to stand. You have to be familiar with skeletons. For example, if I'm making a horse, I'm looking at horse anatomy, how the bones are built and the proportions. I build the armature first because if that is not done properly, it's not going to work."

Three types of clay are utilized: oil-based, polymer and water-based. "All of them are quite different, so I have to think about how I'm going to make a structure and base for each," Gorbea remarks. "Water-based clay dries quickly, and I use it to make bigger sculptures that have fewer details. With polymer or oil-based clay, you get to spend more time with it and put in more detail; I use them for smaller sculptures. The sculptures are usually made of several pieces, and I create layers of detail. Depending on the size, a sculpture can take five to 10 hours. The hardest part of making a sculpture is to give it personality and convey emotion.
If the face, especially the eyes, doesn't work, then the sculpture is not going to work." Visit: https://www.instagram.com/margo_sculptures/

Adam Howard, Visual Effects Supervisor

Interwoven with the space exploration universe envisioned by Gene Roddenberry, Adam Howard has been lauded with Primetime Emmys for Star Trek: Voyager and Star Trek: The Next Generation, as well as nominations for Star Trek: Deep Space Nine and Star Trek: Enterprise. However, his artistic eye has gone beyond the realm of Federation and Klingon starships, as he paints with light to produce character studies of friends, colleagues and celebrities.

Aftermath (Image courtesy of Marc Simonetti)

"The human face is a never-ending source of wonderful detail and surprise," Howard explains. "Based on a photograph, I start with a detailed pencil outline that determines the overall shape of the face. Within that outline, I also mark out areas for shadow and highlights. I paint masks for each major area: face, eyes, ears, neck, hair, beard and clothing. Once each area has a clean mask, then I start the actual painting. First come base colors and areas of shadow and highlight, followed by middle-ground detail, then eventually on to finer detail. I paint in digital oils because I love being able to blend my paint to help give subtle form to each area. I also love the fact that by painting on my iPad, I can paint anywhere. I am not restricted to a physical studio or materials."

Sleeping Beauty (Image courtesy of Marc Simonetti)

Buying Pane Cunzato in Trapani (Image courtesy of Andrew Whitehurst)

Tidal Raku Vase (Image courtesy of Liz Bernard)

Slip Trailed Box (Image courtesy of Liz Bernard)

Howard begins each portrait by painting the eyes. "Eyes truly are the window to the soul, and I try to capture the real essence of each subject by painting the fine detail and shape of the eyes. Sometimes, it can be a really tiny detail, like a single point of highlight on an eyelid, that makes the person feel real. I love those moments when the face pops off the page at me as the person I am painting. Depending on the portrait, I sometimes work in additional detail over the final painting from the original pencil outline. This can assist in deepening lines around the eyes and accentuating hair detail. I used to do colored pencil and ink portraits on a plain white background. The backgrounds have become more detailed. They play a big part in portraits, like my paintings of Ve Neill and Steven Spielberg, where so many of the films they have created are represented in the background. Sometimes, the backgrounds take longer to paint than the person." Visit: www.adamhoward.art

Marc Simonetti, Art Director, DNEG

Initially trained as an engineer, Marc Simonetti decided to become a concept artist and has made contributions to Valerian and the City of a Thousand Planets, Aladdin and Transformers: Rise of the Beasts. He has also illustrated Discworld by Terry Pratchett, Royal Assassin (The Farseer Trilogy) by Robin Hobb and The Name of the Wind by Patrick Rothfuss. "When I started my career, the only job available was for book covers in sci-fi or fantasy," Simonetti notes. "I grew up with that trend. Maybe I would have had a completely different style if I had tried fine art first. But that's life."

"Sometimes, I start with watercolors or pastels, but that is rare because we have to be fast," Simonetti remarks. "The only thing that I try to do all of the time is to change my process, because I need to have fresh options. If I stick to something, then my pictures will always look the same.
At the same time, it's about trying to be as honest as possible. Most of the time, I start with pencil and paper because it's the easiest. Once the composition is set in my mind, there is an app, Substance 3D Modeler, that allows you to sculpt in VR, which is a lot like ZBrush. I use my Meta Quest headset to scout everything. I can put lighting on the model and find different cameras. I can also create a library by sculpting a tower or window that can be used later on. Once I have the model, I can use Octane, Cinema 4D, Blender or Unreal Engine. Then I render it and paint over it in Procreate or Photoshop."

Sketches are conceptualized without references. "I want to be as free as possible to set up a good composition," Simonetti states. "However, when I need to fill the composition with elements, I try to have lots of references, whether it's architecture or anatomy. Everything has to be grounded. Even when I'm making an alien, it has to be believable. Same thing with architecture. I want people to connect with it. If you don't have any reference for the scale, it takes people out of the painting. Lighting is critical. When I'm using 3D, it's a huge help. I'm trying so many different lighting scenarios to fit the correct mood and to be as impactful as possible." Visit: https://art.marcsimonetti.com/

Oyster Shell Bowl: Eagle Talon (Image courtesy of Liz Bernard)

Andrew Whitehurst, VFX Supervisor, ILM

Given that Andrew Whitehurst studied Fine Arts before becoming an Oscar-winner for Ex Machina, his belief that music, pictures, lunch and ball sports are the greatest achievements of humanity is not entirely surprising. The enjoyment of studying faces and drawing caricatures has come in handy. "If I know that we're doing a digital face for someone, literally the first thing that I will do is type an actor's name plus 'caricatures' and search the Internet," Whitehurst reveals. "If there are loads of good caricatures, then it's going to be an easier job because something about their likeness is capturable. If there aren't that many good caricatures, then it's going to be much harder. There aren't many good caricatures of Harrison Ford, and it was hard."

Al Pacino Portrait (Image courtesy of Adam Howard)

Dragon (Image courtesy of Sigri de Vries)

Ringwraith (Image courtesy of Mariana Gorbea)

"There is an interplay between the way that I paint and what I understand about the world, which I have gleaned from doing visual effects for a long time," Whitehurst notes. "I'm always trying to make something psychologically interesting. I love abstract art, but I'm not good at doing it. I started doing a lot of landscape paintings, and I discovered what painting is to me; it's a way for me to meditatively spend time somewhere I find special or engaging in some way, and to have the opportunity to think about it, enjoy it, and try to capture something of it, but in a reflective way."

"If I'm going on location or holiday, I have a sketchbook with me," Whitehurst remarks. "I will do black-and-white pen-and-ink drawings. Some of them I will scan and add color to in Procreate later if I feel like it. The drawings tend to be a more immediate reaction to a place and have more of a comic book style, because that is generally how I draw. I like to exaggerate and use a little bit of imagination. The paintings consist of casein on wooden panels.
Casein has the advantage over gouache because when it's properly dry, it doesn't reactivate as easily, so you can paint over the top of it, and it's slightly creamier in texture, so it's a little bit like oil paint but is water-soluble and dries quickly. I would paint in oil but for the fact I can't have my house stinking of turpentine!"

Contact: @andrewrjw on Cara and Instagram.
-
WWW.VFXVOICE.COM
CATCHING A RIDE ON THE VFX VENDOR-GO-ROUND
By TREVOR HOGG

The foundation for shows such as Vikings: Valhalla is previous collaborations that enabled visual effects supervisors and producers to deliver shots within budget and on schedule. (Image courtesy of Netflix)

Compared to the 1990s, when only a few visual effects companies were capable of doing film-quality work, the number of options has exploded around the world, fueled by the ability to achieve photorealism and the growing acceptance of CGI as a filmmaking tool. As a consequence, the success or failure of a project often depends upon hiring the correct group of supervisors, producers, digital artists and technicians, whether through a single vendor or several vendors who aim to achieve the same goal. In some ways, the vendor selection process has remained the same, but in other areas it has become more sophisticated, reflecting the maturity of the visual effects industry as it travels further down the pathway once traveled by the crafts of cinematography, editing, production design and costume design to become an entrenched member of the entertainment establishment.

Vendor connections begin with the production companies, studio visual effects executives or visual effects producers and supervisors. "On the studio side, we break down a script; we are typically the first ones, and we tend to do this before a director is hired," states Kathy Chasen-Hay, Senior Vice President of Visual Effects at Skydance Productions. "We work closely with a line producer to discuss shoot methodology, then we'll send the breakdown out to four or five trusted visual effects companies. We pick these vendors based on their specialties, shot count and the size of the budget. Finding and hiring vendors is a group effort. The VFX studio executives work closely with the visual effects teams when picking vendors. Since studio visual effects executives work with tons of vendors, we know and trust certain vendors. Did that vendor deliver on time? Was the work stellar? Did we get change-ordered to death? Relationships are key, and several VFX supervisors have built relationships with vendor supervisors, and it's important to support these relationships; after all, they are the ones in the trenches, day after day. Agents are typically not involved. Relationships are built on past projects. Successful vendors have someone at the top who communicates with studios, production companies and the visual effects producers. We trust these folks as we have worked with them on prior projects. It's all about previous projects."

Established relationships are favored given the difficult nature of delivering finished shots within budget and on time. "Depending on the type of work required in the visual effects breakdown, the visual effects production team would work together with their production company and/or studio to start understanding how many vendors may be needed and which ones have the capacity and capabilities to handle that type of work in the given timeframe," explains Jason Sperling, Creative Director/VFX Supervisor at Digikore Studios. "This can help narrow the list of possible vendor candidates significantly, and at that point, visual effects production teams begin the specific task of reviewing vendor demos and sample bidding numbers and expressing the creative and logistical expectations.
If individual artists are needed for an in-house visual effects production team, they begin to assemble and reach out to their known available freelance crew or other resource lists."

"The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project," states Neishaw Ali, CEO and Executive Producer at Spin VFX. "While sometimes studio preferences might dictate the choice, more commonly the decision-making is led by the VFX supervisor and the VFX producer. These key figures play crucial roles due to their expertise and their understanding of the project's technical and artistic requirements. The visual effects supervisor is primarily responsible for ensuring that all visual effects are seamlessly integrated into the live-action footage and align with the director's vision. Meanwhile, the visual effects producer manages the budget, scheduling and logistics of the visual effects work, coordinating between the studio and the creative teams. Their collaboration is essential in choosing the right visual effects team[s] that can deliver high-quality results within the constraints of the project's timeline and budget."

Scale and budget have an impact on the audition and hiring process. "For independent films, I found there's more flexibility, while the big studio productions may have predetermined criteria or preferences," notes Julian Parry, VFX Supervisor and VFX Producer. "Producers and directors typically seek out talent based on their track record and previous work. Artists or visual effects houses with impressive portfolios are usually approached for potential collaborations. It's not uncommon when vetting a visual effects vendor that the artists are promoted in the pitching. Breakdown reels showcasing previous work and expertise play a significant role in the hiring process. Producers and directors look for visual effects houses or artists whose style and capabilities match the project's needs and offer a detailed insight into their experience in specific disciplines, such as creating monsters, which can be crucial for achieving the desired visual results."

Producers and directors look for vendors and artists whose style and capabilities match the project's needs, like for The Wheel of Time. (Image courtesy of Prime Video)

Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process for shows like Fallout. (Image courtesy of Prime Video)

The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project, such as Asteroid Hunters. (Image courtesy of IMAX and Spin VFX)

Scale and budget have an impact on the audition and hiring process for vendors on projects like The Witcher. (Image courtesy of Netflix)

Another part of the vendor equation for films like Dungeons & Dragons: Honor Among Thieves is in-house visual effects teams, which can consist of a designated vendor or a group of handpicked digital artists. (Image courtesy of Paramount Pictures)

Generally, for VFX Producer Tyler Cordova, one to three major vendors are brought on board during the majority of prep and shoot to develop assets for films like Dungeons & Dragons: Honor Among Thieves. (Image courtesy of Paramount Pictures)

Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process.
"Can the vendor commit to delivering a specific number of shots by a set date, and do they have the necessary resources to handle the project?" notes Pieter Luypaert, VFX Producer at Dream Machine FX. "Competitive pricing is important, as multiple vendors are bidding for the same work. The vendor's location also plays a role, as tax incentives can significantly impact cost. Breakdowns are a big part of the bidding process, as they provide the vendors with all the essential information needed to provide a first bid and propose a methodology. Does the vendor believe they can achieve a certain effect with a 2D solution? The chosen methodology can drive the cost and schedule. Lastly, pre-existing work relationships, mutual connections and shared history are important. Due to the interconnected nature of the visual effects industry, personal connections can ultimately be the deciding factor. Multiple vendors are often used to mitigate risks. The main vendor typically handles the majority of the work, while the studio's visual effects production team oversees the coordination among the different vendors. This is becoming more common as the vendor providing virtual production services needs to collaborate closely with other vendors using their assets."

In many ways, the career trajectories of individuals determine future studio and vendor collaborations. "I was a compositor by trade and knew a lot of the people at Pixomondo who went on to form their own companies, such as Crafty Apes," states Jason Zimmerman, Supervising Producer and VFX Supervisor at CBS. "Bouncing around, working at Zoic Studios and Eden FX, you meet a lot of people along the way and collect those people with whom you resonate. I've been fortunate to meet a lot of awesome people along the way who have either started a company or gone to a company. To me, it's all about the artists, having been one myself. I keep track of all my favorite people, and they have all moved around and done great work at different places. Not everything is about past relationships. If someone has great visuals, then you're going to try them out, regardless. Having a reel with good variety is important because you know that they can do more than one type of shot or effect or discipline. And how does it look to your eye? Do you agree with the integration and lighting? All of those shots were driven by a supervisor, studio, timelines and budget. You take it for what it is, and every decision made was not only one person, because there are a lot of people who go into making a visual effects shot work."

Setting up a visual effects company has become more economical. "The technology is at a point where if you're an independent artist, you can buy the software and render on the cloud," notes Jonathan Bronfman, CEO at MARZ. "You don't need infrastructure. But it has been that way for a while. It's quite homogenous. Everyone is using the same tech stack. We have artists who have worked at Wētā FX and vice versa. What is the differentiator? That is why we ended up developing the AI; that's our differentiation, if you can nail the tech. Outside of the AI that we're developing, we're very much a creature/character shop. We still do environments because creatures and characters need to live in environments. There are other companies, like Zoic Studios, which are television-focused. But if you go to Wētā FX or ILM, they do everything. Everything stems from reliability. Word of mouth is the result of doing a good job and executing at a high level.
You have to produce good quality on time and on budget. If you can do those things, then it spreads. Certain stakeholders have to be impressed. You have the visual effects supervisor, visual effects producer, production company and studio. If you have all three levels of stakeholders, that is ideal. But ultimately, it is the visual effects supervisor who gets the final say."

Conversing with potential vendors actually commences before the studio assembles a visual effects team. "I will get a look at the scripts early, know what type of work it is, and I can reach out to my counterparts at some of those vendors," explains Todd Isroelit, Senior Vice President, Visual Effects Production at 20th Century Studios. "I'd say, 'We have project X, which has a creature/character or water effects simulation. Here is the rough schedule that we're looking at.' It's important to plant a flag with some of these vendors so your project is on their radar as they're talking to all of the other studios and filmmakers about other things that might be happening in a similar timeframe and looking for similar resources. As we start to identify the team or the candidates for the team, we'll look at what projects they've done and what relationships they have. Sometimes, we'll look at actually creating a package scenario where we are talking to a vendor and vendor supervisor." The proliferation of visual effects has led to more agent representation. "In the early days, all of the visual effects supervisors were tied to facilities like ILM, Digital Domain and Sony. There wasn't this big freelance pool. As the industry grew and people started moving around, it became this emerging piece of the business that gave the supervisor a head-of-department status that fits into that below-the-line approach to filmmaking where you are looking at DPs and costume designers. Visual effects supervisors started having a bigger stake and voice in how the projects were coming together. That's when I saw the practice of supervisors getting agents start to evolve, even to the point where big below-the-line talent agencies who represent DPs, editors and costume designers started realizing the same thing." Agent representation is not as significant for the vendors as a point of contact for the studios. "Executive producers or business development executives at the vendors; those are the relationships that we have," Isroelit says.

Rather than hire agents, vendors tend to have a top executive communicating with production companies and studios to work on series such as Foundation. (Image courtesy of Skydance Television and AppleTV+)

Having a reel with good variety is important because it demonstrates the ability to do more than one type of shot, effect or discipline when attempting to work on series such as Star Trek: Discovery. (Image courtesy of Paramount+)

Conversations with potential vendors actually commence before the studio assembles a visual effects team, reveals 20th Century Studios' Todd Isroelit, who worked on Prometheus. (Image courtesy of Twentieth Century Fox)

Another part of the vendor equation is in-house visual effects teams, which can consist of a designated vendor or a group of hand-picked digital artists. "In my experience, an in-house team usually comes in closer to the end of post-production to do easier, mostly non-CG shots," remarks VFX Producer Tyler Cordova. "Typically, opticals, re-times, split screens and simple paint-outs, things of that nature. It's important because it's a cost-effective solution to have a small team do simpler shots after the edit has settled.
I've hired in-house artists on past shows through contacts I've worked with for years and years. In-house artists will suggest other artists they've worked with as well. There are some legendary in-house artists that a lot of visual effects producers know about (Johnny Wilson, John Stewart, looking at you!), though some studios and producers prefer going to a vendor instead of using in-house artists to give some accountability to a company performing efficiently, rather than relying on individual artists to manage themselves; it depends. In-house teams are rarer these days since COVID-19 hit, and a lot of productions seem to be hiring smaller in-house-type vendors rather than individual artists so they can do the work securely and efficiently while working remotely."
-
WWW.VFXVOICE.COM
THE PENGUIN: GUNNING FOR THE TOP OF GOTHAM CITY
By TREVOR HOGG

Images courtesy of HBO.

Colin Farrell felt that the prosthetic makeup designed by Mike Marino allowed him to express a sense of tragic wrath as Oswald "Oz" Cobb.

A secondary villain who was the object of an explosive highway pursuit by the Batmobile gets his own Max limited series to bridge the narrative gap between The Batman and its upcoming sequel directed by Matt Reeves. The Penguin consists of eight episodes executive produced by Reeves, Dylan Clark and Lauren LeFranc, who also serves as the showrunner.

Colin Farrell dons the prosthetic makeup and body suit again as an aspiring crime lord nicknamed after an aquatic flightless bird with a distinct waddle. "It's great to be in the Batman universe and in the version of Gotham City that Matt Reeves created," states Lauren LeFranc, Executive Producer and Showrunner. "It felt real, less comic book and more crime drama. We took that and ran with it. The film always takes place at night because it's the Batman and he's not a guy who comes out in the day. But Oswald 'Oz' Cobb does. We tried to embrace that aesthetic, but we were more handheld in how we shot. I always say that Batman is up high looking down on Gotham City. Oz is the guy who's down in the streets, in the muck of the city, looking up and wanting to rise to power; that allows us to enter seedier worlds in our show. The French Connection and Taxi Driver were parallels that we talked about a lot. Batman moves and thinks methodically, while Oz is unpredictable in his actions, so we tried to have the camerawork and feel of our show mirror more who Oz is rather than Batman."

Complicating the daily prosthetic makeup process for Mike Marino was having to flatten, cover and integrate the real hair of Colin Farrell.

Contributing to the difficulty of the production was having lead actor Colin Farrell go from transforming into his character 30 times for the feature film to 100 times for the limited series. "I've seen every single frame of this show, have been on set and watched this from every angle," LeFranc notes. "Colin Farrell can be an actor, as he normally is, with his own face. Oz is a man to me. I know it's Colin because I talk to him all of the time. But Mike Marino [The Penguin Prosthetic Makeup Designer] and Colin together created a completely new person. Crew members would come on set, especially when we first started shooting, and would stare at him because you kept looking for seams or something to understand how this was a mask, and you couldn't find it. We have shots that are extremely close up, and the detail on Oz's face is incredible. Mike always wanted to challenge himself. I would write something and he'd say, 'Keep writing like that because it's so hard and I want to see what I can do.' Oz is naked in the first episode. That is a full-body prosthetic, which Mike and his team created. There are little specific hairs on his chest put in place with tweezers. The detail is remarkable." The Oscar-nominated actor never felt that his performance was impeded. "That's the alchemy of what Mike Marino does," observes Colin Farrell, who plays Oswald Cobb. "The mix of what I was doing beneath the makeup and the kind of sense of tragic wrath that Oz carries himself through the world with, it just seemed to work. I was somewhere in there, but the face just moved beautifully.
It may be a bit idealistic, but I do think if everyone approaches their work from a place of integrity and purity, and wants to do best by the project and understands you're a significant spoke in the project being born to life, then things do work out."

Rather than peering down upon Gotham City, the perspective of the show is from street level.

Bridges, subways, underpasses and overpasses are a visual motif for The Penguin.

Five body suits were in constant rotation. "They get sweaty and have to be cleaned and dried, then he gets the next one on," remarks Mike Marino. "We tried to plan as much as we could based on experience, but you never know what's going to happen. There is so much movement, and bubbles form, and he's sweating, and then there's the environment. We tried to keep him cool and planned for him to stay in this tent that is hardcore air-conditioned; he's basically wearing a whole snowsuit every day. The nude suit was a one-off. It's blended at the wrists and ankles, with all of the hair that goes on. We didn't get to go through the finalizing process of the second suit because we got it all within one day and it maintained itself." The makeup process for the nude suit was even more laborious. "We had pre-painted and pre-done a lot of hair, but the connection of it all had to be done on the day. Colin was in five or six hours of makeup, and the whole day was dedicated to that scene, which became its own entity. We see why he limps. There is a whole leg involved that is strange-looking. Colin is 95% covered in prosthetics during that nude reveal," Marino adds. Some digital augmentation was required. "I'm working with the visual effects team, personally circling and pointing out unavoidable errors. My eye is going to see things that they won't see because I'm looking at where things begin and end that may seem seamless to someone else," he notes.

The story takes place a couple of weeks after the flood, so the water has receded, leaving behind mud and muck.

Approximately 2,200 visual effects shots were created over a period of months for the eight episodes.

Accenture Song VFX, Pixomondo, ReDefine, Stormborn Studios and Frost FX created approximately 2,200 visual effects shots over a six-month period. "We developed an in-house asset library that was searchable and filterable, and shoehorned it into ShotGrid," reveals Visual Effects Supervisor Johnny Han. "The vendors could peruse our assets, not just elements like sparks, glass, smoke and blood, but also cityscapes, sunrises, sunsets and views of Manhattan in 100 different directions. We thought that was key to give the vendors the freedom to find what they thought was useful that perhaps I didn't even think would be necessary for shots." Various elements had to be recreated from The Batman, such as lens flares. "I got on the phone with Dan Lemmon, the Visual Effects Supervisor on The Batman, and said, 'Tell me how you did this.' He sent me this great document where he fleshed it all out. I went to the hardware store and bought all kinds of different silicone gels, hand sanitizers, soaps and anything we could stick onto the glass of the matte box of the ALEXA LF camera to smear different patterns. I also bought a bunch of different flashlights, LED and incandescent, to get wet caustic lens flares by shining them through the silicone gel smeared onto the glass in front of the camera.
We got some amazing material that we felt even expanded upon what Wētā FX had done in the film because we have a larger variety of scenes," Han says.

The first shot after the prologue in Episode 101 was digitally altered after it was captured to achieve the desired thematic effect. Han describes, "We're beginning way up above the ground looking at a classic Gotham skyline as if it were the same one from the film. But hold on; this is not where we are. Let's come down, and the camera starts descending through the rain, passing train tracks, and now coming into much smaller and rougher-looking buildings that have seen better days, all the way down to almost six inches off the ground. Literally, as low as you can get. Those same skyscrapers that we saw at the start of the shot are almost looking down at us from afar, like the rich looking down from a safe distance. Of course, a moment later, Oz's Maserati pulls right up to the nose of the camera." A different perspective was adopted for a flashback sequence in Episode 103, when the Riddler destroys the seawall, causing a devastating flood. "In The Batman, we were always far away from the water. We wanted to make this flood feel like Victor Aguilar was at ground zero. He saw the seawall break open, the water rush in and consume all of the cars and people on the street. And to make it a horrific and traumatic experience for him, which is a lot of the basis for his backstory of having lost everything. Conveying weight is always hard. We filled the water with chunks of debris from the concrete seawall and played that up, so the water itself ended up having this inky black depth to it," Han adds.

Given the nature of the subject matter, a certain element was unavoidable. "It's a show about gangsters who like to shoot guns, often at night or in shadowy places," Han notes. "The first week of shooting, we had a hero gunshot. I got an old camera flash and a sound trigger that sports photographers use, hooked them together, brought it on set and crossed my fingers. When the blank went off, the bang would sound-trigger the camera flash I had. It was this big 'Eureka!' moment of an interactive light that we needed for the show. Visual effects still added a little muzzle flash cloud. We also obtained some technology to phase-synchronize the timing of the flashes with the rolling shutter of the camera so that it would never flash halfway through a frame. I felt, from a visual effects perspective, it was a nice little visual stamp for the show, and we could take our gunfire to the next level." Atmospherics added a touch of life to shots. "We were inserting birds, airplanes, helicopters, trains, cars and digital people to give the scene the right tone of life. We don't want it to feel too active. It's just that Gotham City itself is a living, breathing city, so there has to be some pulse to it at any given time." The Gotham City envisioned by Matt Reeves had to be honored. "Part of the formula was Gothic architecture and skyscrapers.
You might call it Neo-Gothic, where you have buildings that look like they have cathedrals on top of them."

"[Visual Effects Supervisor] Johnny Han and I are collaborating on creating some of the sets and translating them into VR sets and putting the Oculus goggles on our director Craig Zobel to have him walk through them."
Kalina Ivanov, Production Designer

The goal for the visual effects was to be invisible and provide the environmental scope required by the storytelling.

A significant part of the digital set dressing was creating rubble and debris inspired by the aftermath of Hurricane Katrina.

Blackgate Prison went through many iterations, with inspiration coming from Eastern European correctional facilities.

An asset library that was searchable and filterable was built into ShotGrid to assist visual effects companies in the creation of environments.

Genre does not impact the design language. "I find a character that I like and approach it from a character arc," reveals Production Designer Kalina Ivanov. "I don't connect with Batman, but I get the Penguin. I get him with a chip on his shoulder for being poor and wanting to become rich. For me, everything that informs the design is that character: what he's about, his angst and his emotional journey. I design from an emotional state." The Penguin occurs a couple of weeks after the flood. Ivanov explains, "The story takes place in locations where the water has receded, so what you're left with is the mud and muck. That's where Katrina helped, because the scale of that disaster was so huge, the way it pushed things around and the way cars were shoved into each other by the water, creating these structures of three cars on top of each other. The special effects team created rigged cars to be like that. Because FEMA had started to clean up, we were also showing the dumpsters where people were throwing stuff, but what we needed to create was mud." Blackgate Prison went through many iterations. Ivanov recalls, "I did this black-and-white sketch of a prison embedded into a rock formation, playing with the idea that it's the Palisades across from Manhattan. Matt and Lauren thought the design was terrific but not grounded enough in reality. They encouraged me to go and look at more real prisons. That was when I realized they're not approaching this as a comic book. I looked at some Eastern European prisons and Brutalism. There is an idea that I'm going after, but it's not vertical. It's horizontal. And it's on an island. I looked at prisons on islands. I started creating it from there."

Ivanov loves collaborating with visual effects. "Johnny Han and I had a great relationship from the beginning. We are collaborating on creating some of the sets and translating them into VR sets and putting the Oculus goggles on our director Craig Zobel and having him walk through them," Ivanov states. "I find that to be useful. We all agreed that in a show where you want the characters to feel like real people, you want to create as much around them for real and have fewer visual effects, [digitally] extend from there and [lean on visual effects] for the things you couldn't create, like the prison."

Originally, the project was daunting. "I thought I was supposed to follow the film, which was scary, to be honest. It's an eight-part series, not a two-hour film. It became clear that Matt, Dylan and Lauren were not looking to duplicate the film but capture the spirit of it. There was another reason we couldn't duplicate the film.
The film was shot in England, and we were shooting in New York. The whole motif of The French Connection and shooting under tracks, subways and passages came from that original discussion. It became a theme," Ivanov says. Singling out an environment is not easy. "Truly, there are so many wonderful sets in this show, and they're all my children," she acknowledges. "It's hard for me to play favorites. The Penguin is so rich for a designer. It's a true visual gift."
-
WWW.VFXVOICE.COM
TAKASHI YAMAZAKI ACHIEVES KAIJU-SIZE
By TREVOR HOGG

Images courtesy of Takashi Yamazaki, except where noted.

Takashi Yamazaki and Stanley Kubrick are the only directors to have ever won an Oscar for Best Visual Effects.

When people think about Japanese cinema, Akira Kurosawa and Hayao Miyazaki often get mentioned, but that is not the entire picture, as renowned talent has emerged from younger generations, such as Hirokazu Kore-eda, Mamoru Hosoda, Makoto Shinkai and Takashi Miike. Another name to add to the list is Takashi Yamazaki, who matched a feat previously achieved only by Stanley Kubrick when he became the second director to win an Academy Award for Best Visual Effects, and in the process reinvigorated a legendary kaiju [giant monster] franchise with Godzilla Minus One. What impressed him most was not being handed the golden statue but getting the opportunity to brush shoulders with his childhood idol. "Receiving the Academy Award for Best Visual Effects was a great honor, but meeting Steven Spielberg at the Nominees Luncheon was perhaps an even more exciting moment," Yamazaki admits. "It was a chance encounter with the God I had longed for since childhood."

Previously, Yamazaki had established himself by adapting mangas such as Parasyte and Always: Sunset on Third Street, with the sequel of the latter foreshadowing his feature film involvement with the King of Monsters, as Godzilla appears in an imagined scene. "That scene was a short one, but it was just about as much as we could do with the technology and computing power we had. At that time, it was impossible to complete the visual effects for a two-hour Godzilla film with our capabilities. As time went by, we became able to process information on a scale incomparable to that era in terms of technology and computing power, so I thought I could finally create the Godzilla I envisioned and started this project. It was a good decision to wait until this happened and make the Godzilla I envisioned."

Like the kaiju, manga are a cultural phenomenon. "The best way to succeed as a creator in Japan is to become a manga artist. Therefore, the best talent is concentrated in manga. Furthermore, the ones who survive the very tough competition are the ones who become known to the most people. There is no reason why the stories told by those at the top of the giant pyramid should not be interesting. Adapting a comic book into a film potentially requires the characters to be the comic book itself, which is difficult," Yamazaki says.

To help define Godzilla's look, Yamazaki and the animator spent time developing Godzilla's walk in Godzilla Minus One. (Image courtesy of Toho Company)

"The science fiction genre is interesting in that it can create things that do not exist in this world. I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child."
Takashi Yamazaki, Director, Godzilla Minus One

Growing up in Matsumoto, Japan, Yamazaki had a childhood fascination with insects and crafts. "I was surrounded by nature, so I collected insects and lizards and observed them. I was also a child who preferred drawing paper to toys and would request 100 sheets of drawing paper as a Christmas present." Neither of his parents had much to do with the arts. "My father was good at drawing, and I remember that when I asked him to do something, he would do his best to draw me Ultraman or some other character."
A cinematic turning point was getting the opportunity to watch the sci-fi classic by Steven Spielberg, Close Encounters of the Third Kind. "What was shocking was the scene where the giant mothership flips over. With the visual effects before this, it took some effort to believe that what you were seeing was real, but this was the first moment when I had the illusion that it was real."

Four-year-old Takashi Yamazaki stands in front of Matsumoto Castle with his family.

Takashi Yamazaki started off as a model maker for Shirogumi in 1986.

Godzilla destroyed the Tokyo shopping district of Ginza. (Image courtesy of Toho Company)

In 2009, Takashi Yamazaki directed the action romance drama Ballad: Na mo naki koi no uta, in which a young boy prays for courage to a great Kawakami oak tree and finds himself transported to feudal Japan.

A major reason that Godzilla Minus One won the Oscar for Best Visual Effects is that there were both practical and digital effects.

Yamazaki became part of the Japanese film industry while studying film at Asagaya College of Art and Design. "When I was at art school, many expos were being held in Japan, and Shirogumi, which was skilled in creating unique visuals, was producing visuals for many of the pavilions," Yamazaki explains. "There was a part-time job offer for this, and I was able to join Shirogumi as a result of participating in it. Visual effects were led by TV commercials, which had a relatively large budget to work with. We were also trying to introduce the techniques we had tried in TV commercials into film. Around the time I made my debut as a director, CG became more readily available. At that time, it was very difficult to scan live-action footage at theatrical quality, so we even built a scanner in-house that was converted from an optical printer." The pathway to becoming a director began with a call for pitches within Shirogumi, leading to the production of Juvenile [2000], which revolves around a tween having an extraterrestrial encounter. "The president of the company showed the idea I submitted there to Producer Shuji Abe, who was the president of another company; he liked it and worked hard on it, leading to my debut film."

Science fiction goes beyond spectacle. "The science fiction genre is interesting in that it can create things that do not exist in this world," Yamazaki observes. "I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child." With science fiction comes the need to digitally create what does not exist in reality. "I decided to become a director because I wanted to make films with the type of visual effects I wanted to make in the first place. When I made my debut as a visual effects director, most Japanese films didn't have spaceships or robots in them. I think that having three jobs at the same time is economical because I can judge things quickly and write scripts with the final image in my mind, so there is no loss of time."

Yamazaki has directed 20 feature films. "You never know what will be a hit, so when I have an original story, I only base it on whether it excites me or not. Making a film means you have to live with the original story for a number of years, so if it's not a good match, it becomes hard to get on with it. I simply ask for good actors to join the cast. I am basically a person who wants to do everything myself.
"When it comes to the staff, I try to ask for people who are at least more skilled than me, people who have talent that I can respect."

In Japan, Godzilla represents both God and Monster, so Takashi Yamazaki wanted its movement to feel almost divine or God-like in Godzilla Minus One. (Image courtesy of Toho Company)

International markets are rarely taken into consideration when approving film budgets in Japan. "This is because for a long time it was said that Japanese films could not go mainstream even if they were released overseas, and that was probably true," Yamazaki states. "It was a great surprise that Godzilla Minus One was seen by so many people overseas, and to put it bluntly, it was a joyful experience that opened up new possibilities for future film production in Japan. Hopefully, the budget will reflect that element. I guess we'll just have to build up our track record and prove that pouring big budgets into it is not a bad option."

Stories scripted and directed by Yamazaki have ranged from siblings trying to learn about their grandfather who died as a kamikaze pilot in World War II in The Fighter Pilot, to contributing to the Space Battleship Yamato franchise, where an interstellar crew attempts to locate a device that can make a devastated Earth habitable again, to a forbidden book that can grant any wish but at the cost of a life-threatening ordeal in Ghost Book. The growing popularity of video games has not altered the essence of storytelling. "Interesting stories are interesting in any media, and the core of stories that can be developed in various media continues to be influenced by stories that have been around for a long time."

Back in the early digital age, when Takashi Yamazaki was learning how to create visual effects.

At the age of 10, Takashi Yamazaki ventures to downtown Matsumoto with his sister Satsuki.

An extremely complex shot to design, create and execute is found in Godzilla Minus One, where a kamikaze pilot has to overcome survivor's guilt in order to protect those he loves, and Japan, from the rampaging title character. "The sea battle between the Shinsei Maru, the mine disposal ship, and Godzilla was difficult because we had to combine a live-action small boat with CG waves and a giant Godzilla," Yamazaki reveals. "The boat in the foreground is live-action, so it was a very time-consuming job to build the waves at a level that would blend in with it. I'm glad it worked out."

When asked what the essential traits of a successful director are and what has allowed him to have a long career, he responds, "What it takes to be a successful film director is to keep everything interesting all the time, but I am not sure about the career. It would be bad if a film failed, so I think it's easier to prolong my life if I get the next project off the ground before the next film is released." Yamazaki is appreciative of his good fortune. "Thanks to the many people around the world who liked Godzilla Minus One. Godzilla Minus One has received many wonderful awards. I will continue to make films, treasuring the memories of the days I created with you all. Thank you very much. Arigato."

The sea battle between the mine disposal ship Shinsei Maru and Godzilla was difficult because CG waves and Godzilla had to be integrated with the practical vessel.
-
WWW.VFXVOICE.COM

UNSUNG HEROES: SFX ARTISTS KEEP IT REAL IN A DIGITAL WORLD

By OLIVER WEBB

Dominic Tuohy was nominated for an Academy Award for his visual effects work on The Batman (2022). (Image courtesy of Dominic Tuohy)

The role of the special effects artist is to create an illusion, practically or digitally, to enhance the storytelling. Practical effects include the use of makeup, prosthetics, animatronics, pyrotechnics and more, while digital effects rely on computer-generated imagery. With the two mediums having blended together in recent years, the role of the special effects artist is often overlooked. Following are just a few of the many artists working today who are responsible for providing audiences with outstanding effects and immersing us in the world of film. From concept designers to special effects supervisors, they are working tirelessly behind the scenes to make movie magic happen.

Neil Corbould, VES, Special Effects Supervisor

"My uncle, Colin Chilvers, was the Special Effects Supervisor on Superman back in 1978. Being a big fan of Superman, I bugged Colin into taking me to see some of the sets. The first set he showed me was the Fortress of Solitude, built on the 007 Stage at Pinewood Studios. I arrived just at the right time to see Superman, aka Christopher Reeve, flying down the length of the stage on a wire through the smoke, mist and dry ice. From that moment on, I knew that I wanted a job in special effects. After Superman, I went on to work on Saturn 3, which starred Kirk Douglas and Farrah Fawcett-Majors, then Superman II. After that, I started working with a company called Effects Associates Ltd., run by Martin Gutteridge. This was an amazing place to learn the art of practical effects on both small and large-scale productions."

Neil Corbould supervised this train-over-the-cliff effect for Mission: Impossible - Dead Reckoning Part One (2023). (Image courtesy of Neil Corbould and Paramount Pictures)

"I feel that practical effects are now in a great place. I embrace all the new technologies that come along, and I am always on the lookout for the next generation of machines and materials I can use and integrate into my special effects work. YouTube has been a great source of information for me. Whenever I have any free time, I search through many clips on the platform, and it's amazing what people come up with. Then I try to figure out how I can use it. With the influx of streaming platforms and the need for product, I have seen a surge in the need for more practical effects personnel. This has meant an increase of crew coming through the ranks at a fast pace, which has been a concern of mine here in the U.K. To be graded as a special effects supervisor takes a minimum of 15 years. During these 15 years, you need to have completed a certain number of movies in the various grades: trainee, assistant technician, technician, senior technician and then on to supervisor. The same goes for pyrotechnic effects, which have similar criteria, with added independent explosion and handling courses that need to be completed before you are allowed to handle and use pyrotechnics and explosives. There is a worry that some have been fast-tracked through the system, which is where my concern lies. Because of the nature of the work we do, safety is paramount. You need to have completed the time and have the experience to say, 'This is not right.' This comes over time spent working with seasoned supervisors who have been in the industry a long time."
"We need to create a safe working environment for everyone who is in and around a set."
Dominic Tuohy, Special Effects Artist

"If you look at when I first started, everything we did was in-camera, i.e. filmed for real. It was harder back then, nearly 40 years ago, than it is today because we had almost no CGI and relied on matte paintings and miniature work, etc. That's where the saying 'It's all smoke and mirrors' comes from. Now the goal is to create a seamless collaboration between special effects and visual effects so that you don't question the effects within the film. Visual effects always have this problem: If you say to someone, 'Draw me an explosion,' your image will be different from mine. Whereas, if I create a real explosion, you, the audience, will accept it, and it grounds the film in reality. That's the starting block of the smoke-and-mirrors moment. Now, using the existing SFX explosion, visual effects can augment that image to fit the shot and continue to convince the audience it's real. All of this cannot be achieved without a great team effort, and that's something the British film industry has in abundance." [Note: Tuohy won the Academy Award in 2020 for Best Achievement in Visual Effects for his work on 1917.]

Neil Corbould feels that practical effects are in a great place, and he embraces all the new technologies that come along that add to his toolkit of options. (Image courtesy of Neil Corbould)

Nick Rideout, Co-Founder and Managing Director, Elements Special Effects

"I am one of the few of my generation who wanted to be in SFX from an extremely early age. I was completely taken by Star Wars and Hammer [Film Productions] movies and wanted to be involved in making monsters and models for the film industry. I was lucky enough to go to art school and had the good fortune of a very honest tutor who pointed out that at best I was an under-average sculptor, but that I shouldn't let that stop me from pursuing a career in effects work, as it is such a varied department. With that, it was a matter of writing a lot of letters and hanging in there once given a chance within a physical effects department. Weather effects, mechanical rigs, fire and pyrotechnics: it felt like the greatest job ever, and still does most days. I worked hard in the teams that I was part of and was fortunate enough to be alongside some of the best technicians of the time, who took the time to explain the how and why along with giving me the chance to express my ideas. It's a long road, as no two days are ever the same, and even now I'm not sure when you become an expert in the field, as it is a constant learning curve.

"With all HETV [high-end television], the challenges start with script expectation and the director's vision versus the schedule, budget and location. The challenges can be so varied: the physicality of getting equipment onto a location, Grade I-listed buildings, not having the means to test prior to the shooting day, all this before cast and cameras are present."

John Richardson is proud of his work on all the Harry Potter films (2001-2011). (Image courtesy of Warner Bros. Pictures)

Dominic Tuohy won the Academy Award in 2020 for Best Achievement in Visual Effects for his work on 1917. (Image courtesy of Universal Pictures)

John Richardson worked on the Bond film Licence To Kill (1989). (Image courtesy of John Richardson, Danjaq LLC and MGM/UA)

"There are so many times that I am proud of my crew and their accomplishments. Without romanticizing our department, the pressure to deliver on the day is huge, with nowhere to hide when the cameras are turning."
"Concentration, expertise and, at times, sheer grit get these effects over the line. Film and television production is always evolving and reinventing. SFX is at times an old technology but remains able to integrate itself with the most modern of techniques. That's not to say we have not developed alongside the rest of the industry, but we will always be visual and physical."

Max Chow, Concept Designer, Wētā Workshop

"I didn't know much about SFX. I wanted to try different stuff, and it just so happened that Wētā Workshop had opportunities. Because I'm quite new to this, learning about the harmony between physical manufacturing and the digital pipeline is very important. It is exciting to play a role or part in any of these projects at Wētā Workshop. I think these limits make it relatable to us and give a realistic feel, which I think we all appreciate. Being fairly new to special effects, one of the most challenging things was storyboarding for Kingdom of the Planet of the Apes. Seeing how great those storyboards were, I had to try my best to match that standard while learning how important pre-visualization and boarding are for VFX.

"It's easy to get lost and want to reinvent things and put in alternate creative designs, putting a lot of yourself in it, but we have to pull back. We have to trust in our textile and leather workers, because they know every stitch, every seam, better than us. We have to design with the realities of SFX in mind. Later, when something is made wet, when there is cloth physics applied to a design, or when rigging is applied in VFX, there's no doubt that it works, because it worked in real life. As VFX progresses, SFX is used in tandem to progress at the same rate; we are using new materials, workflows, tech and pipelines for physical manufacture. The world has started to demand more of the unseen in theaters, like Avatar, Kingdom of the Planet of the Apes and Godzilla vs. Kong, where we're getting fully made-up worlds and less human ones. We pay to see something that we can't experience in real life. As that evolves, as we demand greater entertainment, there's more work needed to ground these films and make everything a spectacle but also believable."

Iona Maria Brinch, Senior Concept Artist, Wētā Workshop

"Being a fan of The Lord of the Rings books and films, and fascinated by Wētā Workshop's work on them, it was a thrill when I was introduced to Wētā Workshop Co-founder Richard Taylor, VES, through a friend while I was backpacking through New Zealand. He saw the paintings, pencil sketches and wood carvings that I had been creating on the road, and invited me in. I began as an intern, working through different departments before eventually landing a role in the 3D department. From there, I worked my way into our design studio, initially doing costume designs for Avatar: The Way of Water, working closely with Costume Designer Deborah Scott.

"I tend to lean towards the more organic and elegant side of things when designing, such as the Na'vi costumes, which have a natural feel to them. So, working on the designs with Deborah Scott for the RDA [Resources Development Administration], who are a military unit in Avatar, was a fun challenge, from having to consider hard surfaces to thinking about futuristic yet realistic-looking gear. I was part of the team that designed a lot of the female characters' clothing in Avatar, like Tsireya, Neytiri and Kiri, with Deborah Scott.
"Getting to properly dive into the world of Pandora, then eventually seeing my designs go through our costume department, where they'd build them physically, was incredible. They brought it to a completely new level. You can't take ownership of a design; it's a collaboration that goes through so many hands, which is an absolute joy to see and be a part of.

"There's a lot to consider when you're designing something that is going to be made digitally and has to look good on a screen. There need to be volume and 3D textures, interesting shapes and silhouettes that also read well at a distance, such as adding strands that can sway in the wind, shells that can give a slight dangling sound, or certain objects that can catch the light in an interesting way. All these things make it feel immersive and realistic. It's amazing to see how SFX is blending with VFX, but how there's still a desire to make as much as possible physically. SFX adds a certain groundedness to the films, perhaps because there are still limits to how much you can do in SFX."

On the set of Napoleon with Special Effects Supervisor Neil Corbould. Corbould believes that creating a safe working environment for everyone on and around a set is paramount. (Image courtesy of Neil Corbould and Columbia Pictures/Sony)

The digital 3D modeling work for Kingdom of the Planet of the Apes was completed by the Aaron Sims Company with Wētā Workshop designer Max Chow, who was also a concept and storyboard artist on the film. (Image courtesy of Walt Disney Studios Motion Pictures)

Hayley Williams served as Special Effects Supervisor on Wonka. (Image courtesy of Warner Bros. Pictures)

Iona Brinch was part of the team that designed female characters' clothing for Avatar, such as for Tsireya, Neytiri and Kiri, with Deborah Scott. (Image courtesy of Iona Brinch and Wētā Workshop)

John Richardson was the Special Effects Supervisor on A Bridge Too Far (1977). (Image courtesy of John Richardson)

Abbie Kennedy, Layout and Matchmove Supervisor, ILM

"I had mainly focused on modeling and texturing during my Computer Animation degree, so I had no experience with matchmove or layout when I joined the industry in 2010. I distributed my showreel to the London VFX studios with the dream of becoming a texture artist and soon realized that a graduate position in that field was hard to come by. At that time, the entry-level route across the 3D departments was via matchmove. I landed a junior matchmove artist role at DNEG working on Walt Disney Studios' John Carter and soon recognized the importance of this skill. You could have the most amazing textured models, but if the matchmove wasn't correct, the shots wouldn't look convincing. I discovered that I enjoyed the problem-solving aspect of replicating exactly what took place on set. I decided to commit to a path in the matchmove department there, eventually progressing to Matchmove Supervisor. When ILM opened its London office, the layout department appealed to me as it encompasses matchmove and layout, satisfying both the technical and creative sides of my brain. I came in to work on Star Wars: The Last Jedi and quickly adapted to the ILM pipeline and proprietary software. The scope of work in the department is varied, and each project has different challenges. One month you could be body-tracking an actor to transform them into someone else, and the next you're flying a camera through space. I now manage the department as a whole, growing and nurturing a team of highly skilled career layout artists while working on really exciting projects.

"Layout is right at the beginning of the shot pipeline. If our work isn't complete or correct, it holds up all the other departments, so there can be a lot of pressure at the beginning of a show to output a lot of work. For layout supervisors and leads, the job is to make sure artists have all the ingredients to complete their shot work. We're processing LiDAR scans so that they are user-friendly, separating out movable set pieces so we can reposition them; solving the lens grids so we have a library of lens distortion we can add to our cameras; and liaising with production to schedule the shots in an order that allows for the most efficient workflows both in layout and downstream, to name but a few! Organization is key, and spreadsheets are my friend!"
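Lens grids of the kind Kennedy mentions are typically boiled down to a handful of distortion coefficients that can then be reapplied to any CG camera on the show. As a rough illustration of the underlying math, not ILM's proprietary tooling, here is a minimal Python sketch of the widely used Brown-Conrady radial model; the k1 and k2 values are hypothetical stand-ins for coefficients solved from a grid shoot:

```python
import numpy as np

def distort(points, k1, k2):
    """Apply Brown-Conrady radial distortion to points in normalized
    image coordinates (origin at the optical center)."""
    pts = np.asarray(points, dtype=float)
    r2 = np.sum(pts**2, axis=-1, keepdims=True)      # squared radius per point
    scale = 1.0 + k1 * r2 + k2 * r2**2               # radial scaling factor
    return pts * scale

def undistort(points, k1, k2, iterations=20):
    """Invert the distortion by fixed-point iteration, which converges
    quickly for the small distortions typical of cine lenses."""
    pts = np.asarray(points, dtype=float)
    guess = pts.copy()
    for _ in range(iterations):
        r2 = np.sum(guess**2, axis=-1, keepdims=True)
        guess = pts / (1.0 + k1 * r2 + k2 * r2**2)   # refine the estimate
    return guess

# Example: coefficients like these would come from solving a shot lens grid.
k1, k2 = -0.12, 0.015
grid = np.array([[0.5, 0.5], [-0.8, 0.3], [0.0, -0.9]])
assert np.allclose(undistort(distort(grid, k1, k2), k1, k2), grid, atol=1e-6)
```

A production solver would also handle tangential terms and anamorphic squeeze, but the round trip above is the essence of a "library of lens distortion": solve once per lens, then apply or remove it per shot.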
"The work that comes our way in the layout department is constantly evolving, and we're always developing new technology to push the boundaries of what we can achieve. In the last few years, we have taken on some innovative projects, such as the ABBA Voyage immersive experience. This presented many technical challenges and involved putting a team of artists together to capture thousands of seconds of facial performance. I'm proud of being able to give a wave of junior artists a foot in the door at ILM. I have a passion for growing emerging talent, and I'm excited to see what projects come our way next."

Gem Ronn Cadiz, Senior Creature Technical Director, ILM

"I got my role as a Creature TD in visual effects when I applied for the Jedi Masters Apprenticeship program at Lucasfilm Singapore back in 2009. It was an eight-month training course that taught me all the basics for cloth, flesh, hair and rigid body simulations. I was also fortunate to have amazing mentors from ILM San Francisco who guided me and provided invaluable insights throughout the program. Their expertise and support helped me build a solid foundation, and that experience not only kickstarted my passion for VFX but also set me on the path to where I am today. It was an incredible opportunity that shaped my career in ways I couldn't have imagined.

"I think the most challenging aspect is making sure our simulations look as natural as possible while juggling the technical side of asset development. It's a balancing act to maintain and update those assets throughout the entire timeline of the show and still deliver quality shots within the timeframe. It can be challenging, but it's also a lot of fun and very rewarding when everything comes together. As a hobbyist garment maker in my spare time and a CLO 3D user, it's exciting to see how our discipline is slowly evolving. The setup for simulated clothes is getting closer to proper sewing patterns, and the level of detail we can achieve now is amazing. The introduction of Houdini and its simulation solver is gradually being accepted industry-wide, which is another exciting development. There's a strong motivation to push for physically accurate simulations and to adopt realistic techniques, which makes this an exciting time to be in this field."

Hayley Williams on the set of the Alex Garland film Annihilation. (Image courtesy of Hayley Williams)

Gem Ronn Cadiz served as ILM Creature Technical Director on the animated feature Rango (2011). (Image courtesy of Paramount Pictures)
Hayley Williams, Special Effects Supervisor

"I have been around the film industry and SFX since I was a child, as my father and uncles were/are in the business. I developed a love of SFX as I got older and was always very interested in the mechanical side of things, so I went to college and qualified as a mechanical engineer, then went on to become a project engineer at a company in the Midlands. After gaining an extensive skill base outside of the film industry, I felt it was a good time to use those skills in SFX. I joined my father's team on Charlie and the Chocolate Factory and built my career in SFX from there.

"The most challenging and exhilarating sequence I have supervised is a recent film with Steve McQueen directing, involving a huge water setup and lots of stunt choreography linked to big-scale effects. Another nail-biting effect I have been involved in was launching a military tank, built by SFX, out of a huge carrier onto a motorway in Tenerife on Fast & Furious 6. There was only one go at this, as the entire front of the carrier was breakaway and the tank would likely be damaged beyond repair once it hit the road, so we tested and prepped a huge amount to give us the best outcome possible.

"The world of physical in-camera effects took a hit around 10 years ago, but I believe that the desire for on-set effects is strong again, and directors, along with VFX supervisors, are keen to have as much physically in-camera as is practical within time and financial constraints. Advances in things like 3D printing are changing how we work all the time, allowing us faster turnarounds and the ability to test more ideas within better timescales."

John Richardson, VES, Special Effects Designer

"My father was an FX man who started in the industry in 1921, so I got into SFX through nepotism. My first film as a supervisor was Duffy in 1968. I had already worked with Bob Parrish, the director, on Casino Royale (1967) the year before. There are too many challenging special effects sequences that I've worked on to name them all, but A Bridge Too Far was challenging, as were Lucky Lady and Aliens. Sequence-wise, the Nijmegen River Crossing on A Bridge Too Far and the Bond films I did were very challenging. I'm proud of all 60 years of my career, but I'm most proud of A Bridge Too Far, the Bond movies, Cliffhanger, Aliens and the Harry Potter films.

"As special effects shift more into CGI, the work is sadly losing reality, which is something I have always strived to put on screen." [Note: Richardson authored the book Making Movie Magic.]

John Richardson, in the red hat, with a safety climber cliffside on Cliffhanger (1993). The majority of the movie was shot in Cortina d'Ampezzo, Italy, in the Dolomite Mountains, which doubled for the Colorado Rockies. (Image courtesy of Carolco Pictures and TriStar/Columbia/Sony)
-
WWW.VFXVOICE.COM

PROGRAMMING THE ILLUSION OF LIFE INTO THE WILD ROBOT

By TREVOR HOGG

Images courtesy of DreamWorks Animation and Universal Pictures.

Central to the narrative is the parental relationship between Roz and Brightbill.

A shift to the island perspective occurs when the otters discover Roz.

What happens when a precisely programmed robot has to survive the randomness of nature? That is the premise that allowed author Peter Brown to bring a fresh perspective to the fish-out-of-water scenario, and it captured the attention of filmmaker Chris Sanders and DreamWorks Animation. Given the technological innovations that were achieved with The Bad Guys and Puss in Boots: The Last Wish, the timing proved to be right to adapt The Wild Robot in a manner that made distinguishing the concept art from the final frame remarkably difficult.

"When I read the book for the first time, I was struck by how deep the emotional wavelengths were, and I became concerned that if the look of the film wasn't sophisticated enough, people would see it as too young because of the predominance of animals and the forest setting," explains director/writer Chris Sanders. "We've always talked about Hayao Miyazaki's forests, which are beautiful, sophisticated, immersive and have depth. We wanted that same feeling for our film."

Achieving that desired visual sophistication meant avoiding the coldness associated with CG animation. "I'm absolutely thrilled about the analog feel that we were able to revive," Sanders notes. "It's one of the things that I've missed the most about traditional animation. The proximity of the humans who created it makes the look resonate. When you have hand-painted backgrounds, the artist's hand is evident; there's a warmth and presence that you get. When we moved into full CG films, we immediately got these wonderful gifts, like being able to move the camera in space and change lenses. However, we also lost that analog warmth, and things got cold for a while." The painterly approach made for an interesting discovery. "A traditionally done CG tree is a structure that has millions of leaves stuck to it, and we're fighting to make those leaves not look repetitive. Now we're able to have someone paint a tree digitally. They can relax the look and make it more impressionistic. The weird and interesting thing is, it looks more realistic to my eye," Sanders remarks.

The migration scene pushed the crowd simulations to the limit.

CG animation was not entirely avoided as a visual aesthetic, in particular when illustrating the character arc of the island-stranded ROZZUM Unit 7134 robot, which goes by the nickname Roz. "From the first frame of the movie to the last frame, Roz is dirtier and growing things on her," notes Visual Effects Supervisor Jeff Budsberg. "But if you look at the aesthetic of how we render and composite her, there is a drastic difference. Roz is much looser. You'll see brushstroke highlights and shadow detail removed, just like a painter would. We slowly introduce those things over the course of the movie. When the robots are trying to get her back, it becomes a jarring juxtaposition; she now fits with the world around her while the robots have that more CG look to them." Something new to DreamWorks is authoring color in the DCI-P3 color space rather than sRGB. Budsberg explains, "It allows us a wider gamut of available hues because we wanted to feel more like the pigments that you have available as a painter. It allows us to hit way more saturated things than we've been able to do at DreamWorks before. The same with the greens. They're so much richer in hue. Maybe to the audience, it's imperceptible, but the visceral experience is so much more impactful."
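The gamut difference Budsberg describes is easy to verify: the P3 green primary lies outside sRGB, so a fully saturated P3 green has no legal sRGB encoding. A quick sketch using the open-source colour-science Python package (an illustration of the color math, not a DreamWorks pipeline tool):

```python
import numpy as np
import colour  # pip install colour-science

p3 = colour.RGB_COLOURSPACES["Display P3"]
srgb = colour.RGB_COLOURSPACES["sRGB"]

# A fully saturated green authored in Display P3 (linear values).
green_p3 = np.array([0.0, 1.0, 0.0])

# Convert between the two RGB encodings (linear light on both sides).
green_srgb = colour.RGB_to_RGB(green_p3, p3, srgb)
print(green_srgb)  # the red component goes negative: outside sRGB

out_of_gamut = np.any((green_srgb < 0) | (green_srgb > 1))
print("Outside sRGB gamut:", out_of_gamut)  # True
```

The negative component is the out-of-gamut signal: an sRGB display simply cannot reproduce that hue at that saturation, which is exactly the headroom a P3-authored palette gains.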
Storyboards depicting the interaction of Roz with Brightbill.

Eliminating facial articulation was an important part of making Roz a believable and endearing robot.

Elevating the themes of The Wild Robot is the shape language. "We've always loved the potential that Roz was going to be a certain fish out of water, and that influenced the design," notes Production Designer Raymond Zibach. "It's the simple, circular, clean way that we know a lot of our technology, from the iPhone to the Roomba; everything is simple shapes, whereas nature is jagged, or pretty with flowers, but everything is asymmetrical." The trio of Nico Marlet, Borja Montoro and Genevieve Tsai were responsible for the character designs. "All three of them were studying animals and doing the almost classic Disney approach, like when they studied deer for Bambi. Our style landed somewhere in-between a realistic drawing and a slightly pushed style. You can see that in the character Fink, in how big his tail is and how pointy his face is. Those things are born out of an observation that Nico made on all of his fox drawings. We have quite a few species. I heard somebody say 60, but it's because we have quite a few birds, so maybe that's why it ended up being that many," Zibach says.

Photorealism was not the aim of the visuals, but rather to create the warmth associated with handcrafted artwork.

"It's limiting not to have a mouth, but that's red meat for us as animators because that's when we start to imagine and emphasize pantomime. We looked at Buster Keaton, who has very little facial expression, but his body language conveys all of the emotions, as well as other masters of pantomime and comedy, Charlie Chaplin and Jacques Tati. There is definitely comedy in this film, but all of it is grounded in some form of reality."
Jakob Jensen, Head of Character Animation

As the story progresses, the visual aesthetic of Roz becomes less slick and more like the rugged surrounding environment.

While the watercolor paintings Tyrus Wong did for Bambi influenced the depiction of the island, a famous industrial designer was the inspiration for the technologically advanced world of the humans and robots. "Syd Mead is the father of future design, from the 1960s through the 1980s," Zibach observes. "The stuff before Blade Runner was optimistic. We wanted to bring that sense to our human version of the future, which to the humans is optimistic. However, for the animals, it's not a great place. That ended up being such a great fit. As for the robots, Ritchie Sacilioc [Art Director] did the final designs for Roz, the RICOs and Vontra. Ritchie designed most of the future world except for a neighborhood that I did. Ritchie loves futurism, and you can see it reflected in all of those designs because he also helped matte painting do all of the cityscapes. I couldn't be happier because I always wanted to work on a sci-fi movie in animation, and this is my first one. We got to blow the doors off to do cool stuff."

Tools were made to accommodate the need for wind to blow through the vegetation. "In Doodle, where you can draw whatever foliage assets you want, we have Grasshopper, which can build rigs to deform these plants on demand, and you can build more physical rigs of geometry that don't connect, or flowers that are floating," Budsberg explains. "We can build different rigs at various wind speeds, or you can do hero-generated geometry."
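Procedural wind rigs of this sort usually boil down to layered deformers on the painted geometry. The sketch below is a generic illustration in Python/NumPy, not the actual Doodle or Grasshopper toolset: every vertex sways as a sum of sine waves whose amplitude grows with height above the roots, so a single wind_speed parameter can retune the same plant from calm to gusty:

```python
import numpy as np

def wind_sway(points, time, wind_speed, wind_dir=(1.0, 0.0, 0.0)):
    """Displace plant vertices with a simple layered-sine wind deformer.

    points: (N, 3) array of rest positions; y is up.
    Vertices higher on the plant sway more, mimicking a rooted stem.
    """
    pts = np.asarray(points, dtype=float)
    d = np.asarray(wind_dir, dtype=float)
    d /= np.linalg.norm(d)

    height = pts[:, 1] - pts[:, 1].min()              # 0 at the roots
    falloff = (height / max(height.max(), 1e-6))**2   # stiffer near the base

    # Two sine octaves: a slow primary sway plus a faster flutter.
    phase = pts @ d                                   # varies across the plant
    sway = (np.sin(0.9 * wind_speed * time + 0.5 * phase)
            + 0.3 * np.sin(3.7 * wind_speed * time + 1.3 * phase))

    amplitude = 0.05 * wind_speed                     # calm vs. gusty rigs
    offset = np.outer(amplitude * falloff * sway, d)
    return pts + offset

# Example: the same plant evaluated with two different wind-speed rigs.
plant = np.random.rand(200, 3)
calm  = wind_sway(plant, time=1.0, wind_speed=0.5)
gusty = wind_sway(plant, time=1.0, wind_speed=4.0)
```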
Water was tricky because it could not look like a fluid simulation. "It's one thing to make the water, but how do you make the water look like a painter painted it? It's not good enough to make physically accurate water. You have to take a step back to be able to dissect it: How would Hayao Miyazaki paint this or draw that splash? How would a Hudson River School painter detail out this river? You have to forget what you know about computer-generated water and rethink how you would approach some of those problems. You want to make sure that you feel the brushstrokes in that river. Look at the waterfall shot where the water starts to hit the sunlight; you feel that the brushstrokes of those ripples are whipping through the river. There's a little Miyazaki-style churn of the water that is drawn. Then the splashes are almost like splatter paint. It's an impressionistic version of water that allows the audience to make it their own. You don't want to see every single micro ripple or detail," Budsberg remarks.

A conscious decision was made not to give Roz any facial articulation, to avoid her appearing cartoony. "It's limiting not to have a mouth, but that's red meat for us as animators because that's when we start to imagine and emphasize pantomime," states Jakob Jensen, Head of Character Animation. "We looked at Buster Keaton, who has very little facial expression, but his body language conveys all of the emotions, as well as other masters of pantomime and comedy, Charlie Chaplin and Jacques Tati. There is comedy in this film, but all of it is grounded in some form of reality."

Roz teaches Brightbill how to fly.

Skies play a significant role in establishing the desired tone for scenes.

The forest environment was inspired by animation legend Hayao Miyazaki.

The lodge Roz helped to construct serves as a refuge for wildlife during a nasty winter.

Effects are added to depict Roz fritzing out in the forest.

It was also important to incorporate nuances that revealed a lot about the characters and added to their believability. "One of our Chief Supervising Animators, Fabio Lignini, was chiefly the supervisor of Roz, but he was also with me from the beginning in developing the animal stuff. He did so much wonderful work that was inventive and grounded in observations of how otters behave. We would show the story department, 'This is what we're thinking.' That would then make them pivot from doing a lot of anthropomorphic hand-acting of certain creatures, which was not the direction Chris Sanders or our animation department felt it should go in. By seeing what we were doing, they adapted their storyboards, because you have to stage your shot. How do otters swim? There is so much fun stuff to draw from nature, so why not use it?"

An animation storyboard that explores the motion of Roz.

Locomotion had to be understood to the degree that it became second nature to the animator. "The animal starts walking and trots over here, or maybe gallops and then stops," Jensen explains. "For that to become second nature for an animator, you have to study hard, and you'll see shots where it's amazing how the team did it, because you never pay attention to it. You just believe it. That was my first pitch to Chris Sanders as to how far I saw the animation approach and style going. I wanted the animation to disappear into the story and for no one to concentrate on the fact that we are watching animation. We're just watching the story and characters."

Crowds were problematic. Jensen comments, "Sometimes, we have an immense number of characters who couldn't be handled by the crowds department alone. We threw stuff to each other all of the time. But in order to even have a session in our Premo software that would allow for more than five to 10 characters in the shot, they had to come up with all kinds of solutions, such as a low-resolution version of the character that could be viewed while animating all of the other characters, because there are a ton of moments when they all interact with each other."
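The low-resolution stand-ins Jensen mentions are a classic level-of-detail pattern in animation software: only the characters currently being posed are evaluated at full rig resolution, and everyone else in the session is swapped to a cheap proxy. A hypothetical sketch of that gating logic (not Premo's actual internals), with made-up per-character costs:

```python
from dataclasses import dataclass

@dataclass
class Character:
    name: str
    full_rig_cost_ms: float   # per-frame evaluation cost at full resolution
    proxy_cost_ms: float      # cost of the low-resolution stand-in

def choose_rig_resolutions(characters, selected, frame_budget_ms=33.0):
    """Visit characters in session order; promote the selected ones to
    full resolution while the per-frame budget allows, proxy the rest."""
    plan, spent = {}, 0.0
    for c in characters:
        if c.name in selected and spent + c.full_rig_cost_ms <= frame_budget_ms:
            plan[c.name], spent = "full", spent + c.full_rig_cost_ms
        else:
            plan[c.name], spent = "proxy", spent + c.proxy_cost_ms
    return plan

# Example session: two hero animals animated against a crowded lodge.
cast = [Character(f"goose_{i}", 8.0, 0.4) for i in range(20)]
cast += [Character("roz", 12.0, 0.6), Character("fink", 10.0, 0.5)]
print(choose_rig_resolutions(cast, selected={"roz", "fink"}))
```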
Animation tends to have quick cuts, which was not appropriate for The Wild Robot. "From the start, it became evident that we needed time [to determine] how to express that in a way when you don't have a final image to say, 'See, this is gorgeous and you're going to want to be here,'" notes Editor Mary Blee. "We were using tricks and tools like taking development art and using After Effects to put moving characters in it, or getting things on layers to say, 'She is walking through the forest. You guys are going to be interested one day, but it's hard to see right now.' There was a lot of invention to get across the flavor, tone and pace of the movie. We had a couple of previs shots, one in particular where Roz stands on top of the mountain and sees the entirety of the island. Chris Stover [Head of Cinematography Previz/Layout], with help from art, made that up out of nothing before we had a sequence." Boris FX was utilized to create temporary effects. "The first sequence I cut was Roz being chased by a bear, falling down a mountain and discovering that she has destroyed a goose nest and there is only an egg left. We needed to show that her computer systems were failing, so we made a fake HUD and started adding effects to depict it fritzing out and an alarm going off. That sequence was amazing because it encapsulated what the movie was going to be, which was a combination of action, stress, excitement, devastation, sadness and silence. All that happens within two and a half minutes," Blee says.

Illustrating scenes involving crowds was a major task for the previs team. "We have a lot of naturalistic crowds, whether it was all of the animals in the lodge, the migration scenes, flying flocks of birds or the forest fire with all of the animals running," Stover notes. "For the geese, it was fairly simple. It was like a sea of birds. When you are looking at moments like the lodge, it was impactful for the audience to understand that there were a lot of animals that were going to be affected if Roz didn't step in and help the island through this harsh winter." Each of the three oners was complex to execute. Stover explains, "Those types of shots were often tricky because I don't want anybody to ever look at it and go, 'That's a single shot. It's really cool.' What we want to do is allow you to be with that character for an extended period of time in a way that grounds you in that character's challenges, as well as the cinematic moment that we're trying to achieve in the storytelling." Throughout the story, various camera styles were adopted. "At the beginning, we're on a jib arm and it feels controlled. It wasn't until we created the shot where the otters jump into the water and pop up that we realized we were now in the island's point of view. The wildness of the island became a loose camera style. The acting was going to drive the camerawork. When Vontra tries to lure Roz onto the ship, we use this flowy camera style. We let the action move in a much more dynamic way."
"The camera feels deliberate in its choices. That sensibility of being deliberate versus the sensibility of reactionary camerawork was the contrast that we had to play with throughout the film."

To create the illusion of spontaneity, close attention was paid to the background characters. "At one point, a pair of dragonflies are behind Roz," Sanders states. "I never sat down and said, 'I must have dragonflies.' This was something that was built in as people were working on it. It was perfectly placed and so well thought through because it looked believable. Things fly in; they're asymmetrical, off to the side, and dart out. It feels like the dragonflies flew through a shot that we were shooting with a camera that day."

Jensen is partial to the character of Fink. "That was one of the first characters that we developed, and he was supervised by Dan Wagner [Animation Supervisor], who is a legend. I liked animating him because it's difficult to do a fox. They don't quite move like dogs or cats. It's not even an in-between. It's something interesting." Even moments of silence were carefully considered, such as during the migration scene where Brightbill is at odds with his surrogate mother Roz before he flies away with a flock of geese; the moment was aided by a shifted line of dialogue that reinforces the idea that things are not good between them.

"If we have done our job, there is storytelling in the silence," Blee observes. "It's not just a pause or beat. It's the weight of what we've all had in our lives when we needed to say something to somebody, but we can't. It's awkward and difficult, and there are too many emotions. Hundreds of storyboards were drawn to try to get that moment right over time. Because we don't want it to sit there and use exposition, to have people just talk to explain, 'You're supposed to be feeling this right now.' No. It's a lot of work to make it so you don't have to say anything, but the audience understands what's happening."

Experimenting with different shape languages for the character design of Roz.

Concept art for the pivotal sequence where Roz aids Brightbill in joining a flock of migrating geese.

Concept art of Roz and Fink attempting to survive a brutal snowstorm, which served as a visual template for the painterly animation style.

Roz appears to be oblivious to a looming wave as her attention is focused on the coastline activities of crabs.

Depicting the size and scale of Roz in comparison to the beach formations.
-
WWW.VFXVOICE.COM

FORMING A STRONG BOND WITH RACHAEL PENFOLD

By OLIVER WEBB

Images courtesy of Rachael Penfold, except where noted.

Rachael Penfold, Co-Founder and Company Director, One of Us

Rachael Penfold grew up in Ladbroke Grove in West London, where she attended the local comprehensive school. "At the time, it was less of an institute for education and more like a social experiment," Penfold reveals. "A symptom of the same problems we see today because of the lack of funding for state education. A small example, but it's easy to see why diversity is a problem in the wider film industry."

Penfold didn't initially study to become a visual effects artist; instead, she learned on the job. "I had the best apprenticeship really," Penfold says. "It was at the Computer Film Company (CFC), a pioneering digital film VFX company. I think they may have been the first in the U.K. It was a sort of melting pot of scientists, engineers, filmmakers and artists. It was weird and fun and pretty unruly, to be honest, but always, and without compromise, it was about image quality. I obviously made a half-decent impression as a runner at CFC and was moved to production assistant. Either that or they simply needed bodies in production. So, in that sense, I caught a lucky break with my entry into the industry."

Penfold's lucky break came in 1997 with the British fantasy film Photographing Fairies, on which she served as Visual Effects Producer. This was followed by other CFC projects, which saw Penfold work as a Visual Effects Producer on acclaimed films such as Tomorrow Never Dies, Spice World, The Bone Collector, Mission: Impossible II, Chicken Run and Sexy Beast. Penfold's last CFC project was the 2002 film Resident Evil, for which she was Head of Production.

In 2004, Penfold co-founded the London visual effects studio One of Us alongside Dominic Parker and Tom Debenham, where she currently serves as Company Director. Working across TV and film, One of Us currently has a capacity for over 300 artists but is looking to expand across more exciting projects. One of Us also launched its Paris studio in 2021, which houses 70 artists. The company won Special, Visual & Graphics Effects at the 2022 BAFTA TV Awards for its work on Season 2, Episode 1 of The Witcher, and in 2022 it was also nominated for Outstanding Visual Effects in a Photoreal Feature at the VES Awards for its work on The Matrix Resurrections. Some of the company's recent work includes Damsel, The Sandman, Fantastic Beasts: The Secrets of Dumbledore and Bridgerton, Season 2.

Setting up the company was a very organic process for Penfold. "We set up a small team to deal with the visual development of a particular project," Penfold says. "We kept ourselves small for quite a few years, always wanting to engage with more of the outsider work. It was great fun, lots of risk, and my two partners, Dominic Parker and Tom Debenham, are also two wonderful friends. We still work so closely together."

Some of the work that One of Us completed for Damsel included digi-doubles, burnt swallows, armor, swords, dragon fire, melting and cracking ice, and huge-scale cave environments and DMP set extensions. (Image courtesy of Netflix)

The One of Us leadership team.
From left: Tom Debenham, Rachael Penfold and Dominic Parker.

One of Us won the BAFTA Craft Award in 2018 for its work on The Crown.

Penfold at the 2022 Emmy Awards, where One of Us was nominated for Special Visual Effects in a Single Episode for Episode 1 of The Man Who Fell to Earth, as well as for Special Visual Effects in a Season or a Movie for The Witcher, Season 2.

Penfold's first film as One of Us Visual Effects Producer was the 2007 film When Did You Last See Your Father?, starring Jim Broadbent and Colin Firth. Since launching One of Us, Penfold has worked on an array of projects including The Tree of Life, Cloud Atlas, Under the Skin, Paddington, The Revenant and The Alienist. One of Us also served as the leading vendor on the Netflix original series The Crown. The studio's work for the show included digital set extensions, environments, crowd replication and recreating Buckingham Palace. It also contributed to key scenes throughout the series, including Queen Elizabeth and Prince Philip's royal wedding of 1947, the funeral of King George VI and the subsequent coronation of Queen Elizabeth II. Penfold served as Visual Effects Executive Producer across 20 episodes. The company's work on the show helped it gain wider recognition, winning the BAFTA Craft Award in 2018, as well as receiving an Emmy nomination in 2017 for Outstanding Special Visual Effects in a Supporting Role for Season 1 of the show and again in 2018 for Season 2. The company ethos of One of Us is one of creative intelligence and the ability to select and adapt ways of approaching the practical problems involved in bringing ideas to life. Since its launch in 2004, the studio's work has truly reflected these values.

The Zone of Interest won Best International Feature Film at the 96th Academy Awards and was also nominated for Best Picture. (Image courtesy of A24)

Choosing a favorite visual effects shot from her oeuvre, however, is an almost impossible task for Penfold. Boasting an impressive catalog of award-winning films and series, it's easy to understand why. "I definitely have favorite work, but not always because it's the biggest or most ambitious. There's so much that makes an experience great, not just the outcome, but the journey and who you go on that journey with," Penfold explains. "But, from a professional pride perspective, Damsel is a huge achievement."

Penfold's first film as the Visual Effects Producer for One of Us was the 2007 film When Did You Last See Your Father?, starring Jim Broadbent and Colin Firth. (Image courtesy of Sony Pictures)

One of Us enjoyed its creative involvement with Mirror Mirror (2012). (Image courtesy of Relativity Media)

Some of the impressive work that One of Us completed for the film included digi-doubles, burnt swallows, armor, swords, dragon fire, melting and cracking ice, huge-scale cave environments and DMP set extensions. "Again, I can't choose a favorite shot from the film," Penfold explains. "There are so many massive shots in that film, but the shot where the dragon flattens the knight underfoot feels like the culmination of many years of hard work, growing One of Us to the point where it can take on the toughest challenges. From a simple aesthetic perspective, I love the work we did on Mirror Mirror: a perfectly told story in a beautifully designed world."

Another recent project that Penfold is particularly proud of is The Zone of Interest, directed by her partner Jonathan Glazer.
The film marks their third feature collaboration, after Sexy Beast (2000) and Under the Skin (2013). The Zone of Interest follows Rudolf Höss, the commandant of Auschwitz, and his wife, Hedwig, as they strive to build a dream life for their family beside the extermination camp that Höss helped to create. "Sometimes you are proud to be associated with a project because it's a great piece of filmmaking, even if our work is a relatively minor contribution," Penfold remarks. "I'm very proud to have been a part of The Zone of Interest. I believe it is an important film. There are also 660 visual effects shots in it, and I'm delighted that no one knows that!"

On the experience of setting up a company in a male-dominated industry, Penfold explains that the important thing is to look after the work and look after the people, and the rest should follow. "Parity/equality is best served by looking after your people, and by looking after all people, so you create an environment where everyone can thrive. Key roles for women, and a diverse team, will come through a genuine commitment to value everyone, which, in turn, will enrich everything that we do," Penfold states.

Penfold and Jonathan Glazer at the premiere of Under the Skin at the 70th Venice International Film Festival in Venice in 2013. (Photo: Aurore Marechal)

Penfold taught a masterclass in VFX for Becoming Maestre, a springboard for a new generation of professionals in cinema and seriality, in Rome in 2023. The mentoring program, aimed at Italian female audiovisual talent, was conceived and developed by Accademia del Cinema Italiano, the David di Donatello Awards and Netflix as part of the Netflix Fund for inclusive creativity. (Image courtesy of Accademia del Cinema Italiano)

VFX tools and technology never stop developing, and there have been many advancements since the beginning of Penfold's career. "Thinking back to the rudimentary tools we used to have, massive clunking hardware that seemed to constantly fall over or fail, and the most basic software, we have come a really long way," she notes. "Many of today's tools are designed to improve long-established techniques. The craft is being reinvigorated by new technologies in better and more exciting ways. Some technologies are completely new, and we are learning how to use them. But, as consumers, we have an irrepressible desire to examine our own humanity, one way or another. I can't see a world where human storytellers are not at the heart of creating and bringing to life our own stories."

For Penfold, that humanity is key to her definition of success. "Without question, the thing that I enjoy most about my role is the absolutely wonderful people I work with. Whether that's internally with our teams, or whether that's as part of the film family, these endeavors are often hard and long and unknown. So, you really do form extremely strong bonds. Producing exciting imagery is thrilling; it's a real buzz. So, to do that as part of a tight-knit team is rewarding in so many ways."
-
WWW.VFXVOICE.COM

REALISTIC FACIAL ANIMATION: THE LATEST TOOLS, TECHNIQUES AND CHALLENGES

By OLIVER WEBB

Timothée Chalamet and a mini Hugh Grant in Wonka. When it came to generating a CG version of an actor as recognizable as Hugh Grant for Wonka, Framestore turned to sculptor and facial modeler Gabor Foner to better understand the quirks of muscular activation in the actor's face, eventually developing a formula for re-creating facial performances. (Image courtesy of Warner Bros. Pictures)

Vicon's CaraPost single-camera tracking system tracks points using only a single camera. Single-camera tracking works automatically when only one camera can see a point. If a point becomes visible again in two cameras, the tracking reverts to multi-camera tracking. (Image courtesy of Vicon Motion Systems Ltd. UK)

Realistic facial animation remains a cornerstone of visual effects, enabling filmmakers to create compelling characters and immersive storytelling experiences. Facial animation has come a long way since the days of animatronics, as filmmakers now have access to a choice of advanced facial motion capture systems. Technologies such as performance capture and motion tracking, generative AI and new facial animation software have played a central role in the advancement of realistic facial animation. World-leading animation studios are utilizing these tools and technologies to create more realistic content, breaking new ground in the way characters are depicted.

A wide range of facial capture technology is in use today. One of the pioneers of facial animation technology is Vicon's motion capture system, which was vital to films such as Avatar and The Lord of the Rings trilogy. Faceware is another leading system that has been used in various films, including Dungeons & Dragons: Honor Among Thieves, Doctor Strange in the Multiverse of Madness and Godzilla vs. Kong, as well as games such as Hogwarts Legacy and EA Sports FC 24. ILM relies on several systems, including the Academy Award-winning Medusa, which has been a cornerstone of ILM's digital character realization. ILM also pioneered the Flux system, which was created for Martin Scorsese's The Irishman, as well as the Anyma Performance Capture system, which was developed with Disney Research Studios. DNEG, on the other hand, uses a variety of motion capture options depending on the need. "Primarily, DNEG uses a FACS-based system to plan, record and control the data on the animation rigs. However, we try to be as flexible as possible in what method we use to capture the data from the actors, as sometimes we could be using client vendors or client methods," says Robyn Luckham, DNEG Animation Director and Global Head of Animation.

ILM's Flux system, which was created for Martin Scorsese's The Irishman, allows filmmakers to capture facial data on set without the need for traditional head-mounted cameras on the actor. (Image courtesy of ILM)

Developing a collaborative working understanding of the human aspects, nuances and complexities, and of the technology, was particularly challenging for Framestore on 2023's Wonka. "Part of that was building an animation team that had a good knowledge of how FACS [Facial Action Coding System] works," remarks Dale Newton, Animation Supervisor at Framestore. "While as humans we all have the same facial anatomy, everyone's face moves in different ways, and we all have unique mannerisms. When it came to generating a CG version of an actor as recognizable as Hugh Grant, that raised the bar very high for us. At the core of the team was sculptor and facial modeler Gabor Foner, who helped us to really understand the quirks of muscular activation in Hugh's face, [such as] what different muscle combinations worked together and what intensities to use for any particular expression. We ended up with a set of ingredients, recipes if you like, to re-create any particular facial performance."
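In rig terms, "recipes" like the ones Newton describes often end up as weighted combinations of FACS-style blendshapes: the final face is the neutral sculpt plus a weighted sum of per-action-unit vertex deltas. A toy Python/NumPy sketch, with a two-shape rig standing in for a production library of hundreds:

```python
import numpy as np

def evaluate_face(neutral, shapes, weights):
    """Forward blendshape evaluation: neutral mesh plus weighted deltas.

    neutral: (V, 3) vertex positions of the relaxed face.
    shapes:  dict mapping action-unit name -> (V, 3) delta from neutral.
    weights: dict mapping action-unit name -> activation in [0, 1].
    """
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * shapes[name]   # each activated unit pushes its vertices
    return mesh

# Toy rig: 4 vertices, two hypothetical FACS-style shapes.
V = 4
neutral = np.zeros((V, 3))
shapes = {
    "AU12_lip_corner_puller": np.array([[0, 0, 0], [.2, .1, 0], [-.2, .1, 0], [0, 0, 0]]),
    "AU06_cheek_raiser":      np.array([[0, .05, 0], [0, .1, 0], [0, .1, 0], [0, .05, 0]]),
}

# A "recipe" for a smile: both units fire together at chosen intensities
# (AU12 plus AU6 is the classic combination for a genuine smile).
smile = evaluate_face(neutral, shapes, {"AU12_lip_corner_puller": 0.8,
                                        "AU06_cheek_raiser": 0.5})
print(smile)
```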
Masquerade3 represents the next level of facial capture technology at Digital Domain. "This latest version brings a revolution to facial capture by allowing markerless facial capture without compromising on quality," Digital Domain VFX Supervisor Jan Philip Cramer explains. "In fact, it often exceeds previous standards. Originally, Masquerade3 was developed to capture every detail of an actor's face without the constraints of facial markers. Utilizing state-of-the-art machine learning, it captures intricate details like skin texture and wrinkle dynamics. We showcased its outstanding quality through the creation of iconic characters such as Thanos and She-Hulk for Marvel. Eliminating the need for markers is a natural and transformative progression, further enhancing our ability to deliver unmatched realism. To give an example of the impact of this update: normally, the CG actor has to arrive two hours early on set to get the markers applied. After each meal, they have to be reapplied or fixed. The use of COVID masks made this issue infinitely worse. On She-Hulk, each day seemed to have a new marker set due to pandemic restrictions, and that caused more hold-ups on our end. So, we knew removing the markers would make a sizeable impact on production."

Vicon's Cara facial motion capture system. (Image courtesy of Vicon Motion Systems Ltd. UK)

Faceware's Mark IV Wireless Headcam System for facial capture. (Image courtesy of Faceware Technologies Inc.)

ILM relies on the Academy Award-winning Medusa system, a cornerstone of ILM's digital character realization, the Anyma Performance Capture system, which was also developed with Disney Research Studios, and the Flux on-set system. (Image courtesy of ILM)

Wētā FX's FACET system was developed primarily for Avatar, where it provided input to the virtual production technology. Other major facial capture projects include The Hobbit trilogy and the Planet of the Apes trilogy. (Image courtesy of Wētā FX)

Ensuring that the emotions and personalities of each character are accurately conveyed is critical when it comes to mastering realistic facial animation. "The process would loosely consist of capture, compare, review and adjust," Luckham explains. "It would be a combination of the accuracy of the data capture, the amount of adjustments we would need to make in reviewing the motion capture data against the actor's performance, and then the animation of the character face rig against the actor's performance, once that data is put onto the necessary creature or character for which it is intended. Once we have done as much as we can in motion capture, motion editing and animation, we would then go into Creature CFX for the flesh of the face: skin folds, how the wrinkles would express emotion, how the blood flow in a face would express color in certain emotions, to again push it as close as we can to the performance that the actor gave. After that would be lighting, which is a huge part of getting the result of facial animation and should never be overlooked."
"If any one of these stages is not respected, it is very easy to miss the acting notes and realism that are needed for a character."

For Digital Domain, the most important aspect of the process is combining the actor with the CG asset. "You want to ensure all signature wrinkles match between them. Any oddity or unique feature of the actor should be translated into the CG version," Cramer notes. "All of Thanos' wrinkles are grounded in the actor Josh Brolin. These come to life especially in his expressions, as the wrinkle lines created during a smile or frown exactly match the actor's. In addition, you don't want the actor to feel too restricted during a capture session. You want them to come across as natural as possible. Once we showed Josh that every nuance of his performance comes through, he completely changed his approach to the character. Rather than over-enunciating and overacting, he underplayed Thanos and created this fantastic, stoic character. This is only possible if the actor understands and trusts that we are capturing the essence of his performance to the pixel."

On Disney's Peter Pan & Wendy, Framestore's facial animation for Tinker Bell was generated through a mixture of facial capture elements using a head-mounted camera worn by the actress Yara Shahidi, which then underwent a facial solve involving deep learning algorithms trained to translate her facial motion onto Framestore's CG facial rig. (Image courtesy of Disney+)

Performance capture is a critical aspect of realistic facial animation based on human performance. "Since Avatar, it has been firmly established as the go-to setup for realistic characters," Cramer adds. "However, there are unique cases where one would go a different route. On Morbius, for instance, most faces were fully keyframed with the help of HMCs [head-mounted cameras], as they had to perfectly match the actors' faces to allow for CG transitions. In addition, some characters might need a more animated approach to achieve a stylistic look. But all that said, animation is still needed. We get much closer to the final result with Masquerade3, but it's important to add artistic input to the process. The animators make sure the performance reads best to a given camera and can alter the performance to avoid costly reshoots."

Masquerade3 was developed by Digital Domain to capture every detail of an actor's face without the constraints of facial markers, as showcased through the creation of iconic characters such as Thanos from the Avengers films. All of Thanos' wrinkles are grounded in the actor Josh Brolin. (Image courtesy of Marvel)

For Framestore's work on Disney's Peter Pan & Wendy, the facial animation for Tinker Bell was generated through a mixture of facial capture elements using a head-mounted camera worn by the actress Yara Shahidi. "This, then, underwent a facial solve, which involves training deep learning algorithms to translate her facial motion onto our facial rigs," Newton says. "The level of motion achieved by these solves required experienced animators to tighten up and finesse the animation in order to achieve the quality level for VFX film production. In contrast, performance capture on Wonka meant that we had good visual reference for the animators working on the Oompa Loompa as voiced by Hugh Grant. Working with Hugh and [co-writer/director] Paul King, we captured not only Hugh's performance in the ADR sessions but also preparatory captures, which allowed us to isolate face shapes and begin the asset build. We had a main ARRI Alexa camera set up that he performed towards. Additionally, we had a head-mounted 1K infrared camera that captured his face and a couple of Canon 4K cameras on either side of him that captured his body movements."
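Whatever learning machinery sits on top, a facial solve of the kind described here ultimately has to answer one question: which rig weights best reproduce the tracked facial motion? Stripped of the deep learning, the core can be posed as a non-negative least-squares fit of blendshape weights to landmark deltas. A toy sketch (illustrative only, not Framestore's or Digital Domain's solver), reusing the two hypothetical shapes from the earlier snippet:

```python
import numpy as np
from scipy.optimize import nnls

def solve_weights(neutral, shapes, tracked):
    """Fit blendshape weights so the rig best matches tracked landmarks.

    Solves min_w || B w - d ||^2 with w >= 0, where each column of B is
    one shape's flattened vertex delta and d is the tracked offset.
    """
    names = sorted(shapes)
    B = np.stack([shapes[n].ravel() for n in names], axis=1)  # (3V, S)
    d = (tracked - neutral).ravel()                           # (3V,)
    w, _residual = nnls(B, d)
    return dict(zip(names, w))

# Toy rig: fake a frame of tracked landmarks from known weights,
# then recover those weights with the solver.
V = 4
neutral = np.zeros((V, 3))
shapes = {
    "AU12_lip_corner_puller": np.array([[0, 0, 0], [.2, .1, 0], [-.2, .1, 0], [0, 0, 0]]),
    "AU06_cheek_raiser":      np.array([[0, .05, 0], [0, .1, 0], [0, .1, 0], [0, .05, 0]]),
}
true_w = {"AU12_lip_corner_puller": 0.8, "AU06_cheek_raiser": 0.5}
tracked = neutral + sum(w * shapes[n] for n, w in true_w.items())

print(solve_weights(neutral, shapes, tracked))  # recovers ~0.8 and ~0.5
```

In production the solve runs per frame, and, as Newton notes, the result is then tightened up by animators, layered non-destructively on top of the solved weights.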
We had a main ARRI Alexa camera set up that he performed towards. Additionally, we had a head-mounted 1K infrared camera that captured his face, and a couple of Canon 4K cameras on either side of him that captured his body movements."

Masquerade3 represents the next level of facial capture technology at Digital Domain by allowing markerless facial capture while maintaining high quality. (Image courtesy of Digital Domain)

YouTube creators of The Good Times are Killing Me experiment with face capture for their custom characters. Rokoko Face Capture for iOS captures quality facial expressions on the fly. It can be used on its own or alongside Smartsuit Pro and Smartgloves for full-body motion capture. (Image courtesy of Rokoko)

Rokoko Creative Director Sam Lazarus playing with Unreal's MetaHuman Animator while using the Headrig for iPhone face capture. (Image courtesy of Rokoko)

According to Oliver James, Chief Scientist at DNEG, generative AI systems, which can generate data resembling that which they were trained on, have huge potential applications in animation. "They also have the potential to create huge legal and ethical problems, so they need to be used responsibly," James argues. "Applied to traditional animation methods, it's possible to generate animation curves which replicate the style of an individual, but can be directed at a high level. So instead of having to animate the motion of every joint in a character over time, we could just specify an overall motion path and allow an AI system, trained on real data, to fill in the details and generate a realistic full-body animation. These same ideas can be applied to facial motion too, and a system could synthesize animation that replicated the mannerisms of an individual from just a high-level guide. Face-swapping technology allows us to bypass several steps in traditional content creation, and we can produce photorealistic renderings of new characters driven directly from video. These techniques are typically limited by the availability of good example data to train the networks on, but this is being actively tackled by current research, and we're already starting to see convincing renders based on just a single reference image."

Newton suggests that, given how finely tuned performances in film VFX are today, it will take some time for AI systems to become useful in dealing with more than the simplest animation blocking. "A personal view on how generative AI is developing these days: some companies create software that seems to want to replace the artist. A healthier attitude, one that protects the artists and, thereby, also the business we work in at large, is to focus AI development on the boring and repetitive tasks, leaving the artist time to concentrate on facets of the work that require aesthetic and creative input. It seems to me a safer gamble that future creative industries should have artists and writers at their core, rather than machines," Newton says.

For Digital Domain, the focus has always been to marry artistry with technology and make the impossible possible. "There is no doubt that generative AI will be here to stay and utilized everywhere; we just need to make sure to keep a balance," Cramer adds. "I hope we keep giving artists the best possible tools to make amazing content. Gen AI should be part of those tools. However, I sure hope gen AI will not be utilized to replace creative steps but rather to improve them.
If someone with an artistic background can make fancy pictures, imagine how much better an amazing artist can utilize those."

There has been a surge in new technologies over the past few years that have drastically helped to improve realistic facial animation. "The reduced cost and complexity of capturing and processing high-resolution, high-frame-rate and multi-view video of a performance have made it easier to capture a facial performance with incredible detail and fidelity," James says. "Advances in machine learning have made it possible to use this type of capture to build facial rigs that are more expressive and lifelike than previous methods. Similar technology allows these rigs to perform in real-time, which improves the experience for animators; they can work more interactively with the rig and iterate more quickly over ideas. Real-time rendering from game engines allows animators to see their work in context: they can see how a shadow might affect the perception of an expression and factor that into their work more effectively. The overall trend is away from hand-tuned, hand-sculpted rigs, and towards real-time, data-driven approaches."

Cramer supports the view that both machine learning and AI have had a serious impact on facial animation. "We use AI for high-end 1:1 facial animation, let's say, for stunts. This allows for face swapping at the highest level and improves our lookdev for our 3D renders. In addition, we can control and animate the performance. On the 3D side with Masquerade3, we use machine learning to generate a 4D-like face mask per shot. Many aspects of our pipeline now utilize little training models to help streamline our workflow and make better creative decisions."

On Peter Pan & Wendy, Framestore relied on advanced facial technology to capture Tinker Bell. "We worked with Tinker Bell actress Yara Shahidi, who performed the full range of FACS units using OTOY's ICT scanning booth. The captured data was solved onto our CG facial rig via a workflow developed using a computer vision and machine learning-based tracking and performance retargeting tool," Newton details. "This created a version of the facial animation for Tinker Bell the animators could build into their scenes. This animation derived from the solve required tightening up and refinement from the animators, which was layered on top in a non-destructive way. This workflow suited this show, as the director wanted to keep Yara's facial performance as it was recorded on the CG character in the film. Here, the technology was very useful in reducing the amount of time it might have taken to animate the facials for particular shots otherwise."

Luckham concludes, "For facial capture specifically, I would say new technologies have generally improved realism, but mostly indirectly. It's not the capturing of the data literally, but more how easy we can make it for the actor to give a better performance. I think markerless and camera-less data capturing is the biggest improvement for actors and performances over technology improvements. Being able to capture them live on set, rather than on a separate stage, adds to the filmmaking process and the involvement of the production. Still, at the moment, I think the more intrusive facial cameras and stage-based capturing do get the better results. Personally, I would like to see facial capture become a part of the on-set production, as the acting you would get from it would be better.
Better for the actor, better for the director and better for the film as a whole."

DNEG primarily uses a FACS (Facial Action Coding System) based system to plan, record and control the data on the animation rigs, but tries to be as flexible as possible in what method they use to capture the data from the actors, as sometimes they could be using client vendors or client methods. (Images courtesy of DNEG)

Creating Ariel's digital double for The Little Mermaid was one of the most complicated assets Framestore ever built. It involved replacing all of Halle Bailey's body, and at times also her face, with her mermaid form. (Image courtesy of Walt Disney Studios)

On Morbius, where Masquerade3 was used for facial capture, most faces were fully keyframed with the help of HMCs, as they had to perfectly match the actor's face to allow for CG transitions. (Image courtesy of Digital Domain and Columbia Pictures/Sony)
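For readers curious how the FACS-based rigs described above actually consume solved capture data, the common mechanism is a delta blendshape rig: the solver outputs a weight per action unit for every frame, and the rig adds each unit's sculpted offset onto the neutral mesh. Below is a minimal sketch of that idea in Python/NumPy; the shape names, vertex count and weights are illustrative placeholders, not any studio's actual rig.

```python
import numpy as np

# Neutral face mesh: V vertices, xyz positions (stand-in for a scanned neutral pose).
V = 5000
neutral = np.zeros((V, 3))

# Delta blendshapes: one sculpted offset per FACS-style action unit,
# e.g. brow raiser, lip corner puller. Names here are hypothetical.
shape_names = ["AU01_inner_brow_raise", "AU12_lip_corner_pull", "AU45_blink"]
deltas = np.random.default_rng(0).normal(0.0, 0.001, (len(shape_names), V, 3))

def pose_face(neutral, deltas, weights):
    """Evaluate the rig: neutral + sum_i w_i * delta_i, with weights clamped to [0, 1]."""
    w = np.clip(np.asarray(weights, dtype=float), 0.0, 1.0)
    return neutral + np.tensordot(w, deltas, axes=1)

# One frame of solved capture data: a weight per action unit.
frame_weights = [0.3, 0.8, 0.0]  # mild brow raise, strong smile, eyes open
posed = pose_face(neutral, deltas, frame_weights)
print(posed.shape)  # (5000, 3): a deformed copy of the neutral mesh
```

Because the solve lands as a weight curve per shape, animator polish can be layered on top of those curves without destroying the captured performance, which is what makes the non-destructive refinement Newton describes possible.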
-
WWW.VFXVOICE.COM
BLENDING CG CREATURES & WILDLIFE PHOTOGRAPHY TO BRING LOST GIANTS TO LIFE
By TREVOR HOGG

A herd of Mammoths walk across the wintry landscape. (Image courtesy of Netflix)

A quarter century ago, the landmark production Walking with Dinosaurs was released by BBC Studios Science Unit, Discovery Channel and Framestore. The six-part nature docuseries pushed the boundaries of computer animation to envision the iconic prehistoric beasts in a more realistic fashion than the expropriated Hollywood DNA of Jurassic Park and the sentimental cartoon cuteness of The Land Before Time. Building upon the nature documentary television miniseries are Netflix and Apple, which partnered with BBC Studios Natural History Unit, Silverback Films, paleontologists Dr. Tom Fletcher and Dr. Darren Naish, narrators Sir David Attenborough and Morgan Freeman, cinematographers Jamie McPherson, Paul Stewart and David Baillie, and digital creature experts ILM and MPC to create a natural fusion of photorealistic CGI and wildlife photography for Life on Our Planet and Prehistoric Planet.

The giraffe-sized flying predator Hatzegopteryx courting on its own love island in the Islands episode of Prehistoric Planet 2. (Image courtesy of Apple Inc.)

Shooting CG rather than live-action creatures was educational for Jamie McPherson, Visual Effects Director of Photography for Life on Our Planet, who previously took a GSS (Gyro-Stabilizer System) normally associated with helicopters, attached it to a truck and captured wild dogs running 40 miles per hour in Zambia for the BBC One documentary series The Hunt. "I never did visual effects before this, so it was a big learning curve for me, and ILM had never tried to do visual effects in the style that we came up with, which was to make it feel like a high-end, blue-chip documentary series. It was also sitting alongside Natural History. If you're doing pure visual effects, you've got more leeway of people not seeing how those two worlds mix. In terms of the creatures, the process was incredibly long. We worked out what the creature was going to be between the producer, director, myself, ILM and Amblin. Then you have to work out how it interacted with another creature of the same or different species. You have all of these parameters that you're trying to blend together and make them feel believable. When we were out filming the backplates for this, we made sure that they felt reactive, so we were careful to work out where the creature was going. As for behavior, we had to dial it back from what I filmed in the real world because we didn't want to break that believability and take people out of the moment by them noticing a crazy camera move, like a T-Rex walking over you."

The T-Rex was clever, as revealed by a large brain, so its young were doubtless curious and even playful; Cinematographer Paul Stewart imagined this scene accordingly in the Coasts episode of Prehistoric Planet 1. (Image courtesy of Apple Inc.)

GSS and RED Monstro camera on location in Morocco to film Terrorbirds hunting for Life on Our Planet. (Image courtesy of Jamie McPherson)

Being able to rely upon predetermined CG creatures provided the opportunity to utilize the best of narrative and Natural History cinematography. "To me, it's a drama, and you don't give the audience anything more by pretending that you're in a hide 160 million years ago with a long lens," observes David Baillie, Director of Photography for Prehistoric Planet and Life on Our Planet. "It's more important to tell the story."
Coloring his perspective is the fact that Baillie continually shifts between narrative projects like Munich: The Edge of War and Natural History productions such as Frozen Planet. "My job as a cinematographer is to tell the story with all of the emotion, and I do that by using focal length, camera movement and framing. Some limitations need to be respected. We're in a location that we maybe haven't recced before. There is a bit of rock and river and you say, 'Let's do this.' Everybody says, 'That looks nice.' Then, the Visual Effects Supervisor will say, 'Using that shot will cost another 100,000 because we've planned it the other way.' That can be quite frustrating." A slightly different mental attitude had to be adopted. "I have to be more disciplined with things like changing focal length and stop. If I'm doing a documentary or even a drama, I might think, 'I'll tighten it up a bit here.' But they've already done one pass on the motion control," Baillie notes.

According to Cinematographer Paul Stewart, "Prehistoric animals are a mix of the familiar and strange." This large predatory Pterosaur evolved into a lifestyle similar to some modern storks. (Image courtesy of Apple Inc.)

An example of a full CG shot from the Coasts episode of Prehistoric Planet 1. (Images courtesy of Apple Inc.)

Aerial photography for the Oceans episode of Prehistoric Planet 2, which was captured in Northern Sweden. (Image courtesy of Apple Inc.)

Everything had to be choreographed narratively and visually before principal photography commenced. "To ensure our plates matched the action we wanted, we had reference previsualization video prepared by the animators for every backplate we planned to shoot," remarks Paul Stewart, writer, producer and Director of Photography for Prehistoric Planet, who won Primetime Emmy Awards for The Blue Planet, Planet Earth and Planet Earth II. "Using accurate scale cut-out models, or [for the big ones] poles on sticks, we could see how big the dinosaur was in the actual scene. Sometimes shockingly huge! Knowing size helped us figure out the speed and scope of any camera moves; we would capture a good plate, then plates where we mocked up environmental interactions like kicking dirt when running, brushing bushes and picking up twigs. We also tried where possible to isolate foreground elements using bluescreen. In some cases, we even used beautifully made blue puppets and skilled puppeteers to create complex interactions; for example, a baby Pterosaur emerging from a seaweed nest. This all went back to the MPC wizards, together with LiDAR scans and photogrammetry, to make the magic happen."

There was a lot of unforeseen rethinking later about how things worked out. "For instance, when the Natural History Unit is on location, it's one guy with the camera shooting for maybe months," explains Kirstin Hall, Visual Effects Supervisor at MPC. "There's no DIT [Digital Imaging Technician] or a script, and he's not even taking notes on his cards. We didn't think about that as a reality. When we showed up with our huge crew, and all of these shots that we had to get and things we had to do, it was a huge shock for everybody involved. It caused us to think, 'We need to plan this a different way. How are we going to get the data?' Also, they needed to shoot a script, whereas most of the narratives in Natural History come in the edit. Because we wanted to keep it as holistic as possible, the biggest thing for us was working together and becoming one healthy team. The NHU started doing their own charts, and we did HDRIs on the side.
The shoots became like clockwork for Prehistoric Planet 2."

ILM put a lot of detail into the CG dinosaurs, as showcased by this image of a T-Rex. (Image courtesy of Netflix)

Even with the experience of working on The Lion King and The Jungle Book, MPC still had room for improvement when it came to depicting CG creatures in a naturalistic and believable manner. "We had to get up to speed on animal behaviors and instinct and tailor our whole kit to that, like the lenses and cameras we used and filming off-speed; everything is slightly slow-motion, about 30 fps," Hall states. "In Prehistoric Planet 2, we went 100 fps or more, which is hard to do with feathers and fur, but it helped us to get the full experience of these blue-chip Natural History Unit productions." It was not until Jurassic World Dominion that feathered dinosaurs appeared in a Hollywood franchise, but this was not the case for Prehistoric Planet. "We knew from the beginning we would have to [do feathers] for it to be scientifically accurate," Hall explains. "We were lucky enough to work with Darren Naish and didn't realize how integrated he would be in our team. It felt like Darren was a member of MPC because he was in every asset and animation review. If something was not authentic in how something moves or blinks, we would catch it early on and rectify going forward. We learned a lot, even with the plants. When shooting on location, we made sure to rip out holly, and we couldn't film on grass because it did not exist [during the Late Cretaceous period]."

Cinematographer David Baillie used a helicopter to capture aerial photography for Prehistoric Planet 2. (Image courtesy of David Baillie)

Some unexpected artistic license was taken given the nature of the subject matter of Life on Our Planet. "We tried to be as authentic as possible, certainly in the cinematography," remarks Jonathan Privett, Visual Effects Supervisor at ILM. "Where it varied, because we could control what the creatures did, there are quite a few places where we used continuity cuts that you wouldn't be able to do if you were shooting Natural History, because it would have required multiple cameras, which they rarely use. We didn't start off like that. It was a bit of a journey. From the outset, we said we wouldn't do that. However, there's something about the fact that the creatures are not real even though they look real, which made it feel slightly odd that you wouldn't do those continuity cuts when you watched the edits back; so we ended up putting them in."

A Morturneria breaks the water surface courtesy of MPC for Prehistoric Planet 2. (Image courtesy of Apple Inc.)

Cinematographer Paul Stewart filming on a beach in Wales for a Prehistoric Planet sequence involving a Pterosaur beach. (Image courtesy of Paul Stewart)

A hovercraft brings in a proxy 3D-printed dinosaur head to allow Cinematographer David Baillie to properly frame a shot for Prehistoric Planet 2. (Image courtesy of David Baillie)

Essentially, the virtual camera kit emulated the physical one. "We shot a lot of it on the Canon CN20, which is a 50mm to 1000mm lens," Privett states. "Jamie has a doubler [lens extender] that can make it 1500mm. An incredible bit of kit. We also used a Gyro-Stabilizer System. The process is the same as if we were making a feature. We took Jamie's GSS and shot lens grids for it. It's optically quite good because it's quite long, so everything is quite flat, but we mirrored the distortion, and inside the GSS is a RED camera, so that is a relatively standard thing.
The other $300,000 worth of equipment is the stabilization bit, and our traditional methods of camera tracking work fine for that. The hard bit is we never use lenses that long. In a drama, nobody is breaking out the 600mm lens. That's quite interesting to have to deal with because you probably don't have much information. It could be a blurry mass back there, so our brilliant layout department managed to deal with those well."

"What made my hair go gray is the language of wildlife photography and doing long, lingering close-ups of creatures," Privett laughs. "You're right in there, so there's nowhere to hide in terms of your modeling and texturing work. We had to spend a lot of time on the shapes. For instance, a lizard has a nictitating membrane, so when closing its eye all the muscles around it move, and actually the whole shape of the face almost changes. We had to build all of those into the models, probably in more detail than we would normally expect. The image is more compressed as well. Generally, the crane is panning with the GSS, and Jamie will counter-track around the creature so you get this great sense of depth. You can also see the air between you and the subject because you're so far away from it. Any kind of temperature gradient shows up as heat haze in the image. In some of the shots, we're warping the final rendered image to match it with what's happening in the background because you can get some crazy artifacts," Privett remarks.

There was lots of creativity but less freedom. "The fusion of science with the creatives at MPC paid off in a spectacular way," Stewart notes. "Creativity could never come at the expense of accuracy, and surprises and beauty had to be hard-baked into the sequences rather than serendipitously revealed in the course of filming. Giving ourselves hard rules about what could and could not happen in the animal world helped set limits and improve the believability of the films. We might have wanted the animal to jump or run, but the bones tell us it could not, so it didn't. I even found myself checking the craters on the moon for any we should erase because they happened in the last 65 million years! There was also the matter of cost. We could never afford to make all the creatures we wanted or to get the creatures to do everything we would have liked. Interaction with water, vegetation, even shadows and the ground, required huge amounts of art and render time to get right but would be hardly noticed by the audience. We got savvy quickly at how to get impact without costing the sequence out of existence. But the thrill of recreating a world that disappeared so long ago never dulled. Even the scientists and reviewers said they soon forgot they were watching animation," Stewart says.

Standard visual techniques had to be rethought. "The easiest way to explain it is, if I'm filming a tiger in the jungle, I would want to be looking at it so you get a glimpse into its world," McPherson explains. "I tend to shoot quite a long lens and make all of the foliage in the foreground melt so you're looking through this kaleidoscopic world of this tiger walking through a forest. But you can't do that with a visual effects creature because they can't put the creature behind melty, out-of-focus foliage. The best example is the opening shot of Episode 101 of Life on Our Planet, of a Smilodon walking through what looks to be grass. There is a lot of grass in front of and behind it. The only way to achieve that was to shoot where the creature was going to be on this plate. You shoot it once clean.
Then we add in and shoot multiple layers of out-of-focus grass, and those shots are all composited together so it looks like the creature is walking behind the grass, and we match the frame speed, which then makes it feel like you're looking into that world."

Having limitations is not a bad thing. "There are restrictions, but they also make you more creative," McPherson observes. "You have to overcome the limitations of a limited number of shots by making sure that every shot works together and tells the story in the best possible way." The usual friend or foe had to be dealt with throughout the production. "Because of weather, some shoots were hard, which had nothing to do with visual effects," Baillie states. "We had to do some stormy cliff shots in Yesnaby, Scotland, and had winds of nearly 100 miles per hour, which was great for crashing waves. In Sweden, when we were doing the ice shots for the Oceans episode of Prehistoric Planet, it was great to begin with because it was -28°C, but there weren't any holes in the ice. We nearly flew to Finland to try to find one. Then overnight the temperature went up to 3°C, the wind picked up and all of the ice broke up and melted, and we couldn't find any ice without a hole!" Dealing with the requirements for visual effects led to some surreal situations. "The Pterosaur cliffs sequence was one of the most complex sequences because it involved shoots on land, sea, aerial and practical effects," Stewart recalls. "Animation Supervisor Seng Lau worked with me in the field to help direct the plate work, and it was a fun collaboration. Watching our Smurf-blue baby puppet Pterosaurs emerge from their seaweed nests was a bonding moment!"

Proxies ranging from 3D-printed heads to cut-outs allowed Paul Stewart to frame shots properly for Prehistoric Planet. (Image courtesy of Paul Stewart)

Cinematographer Paul Stewart describes mimicking other cameras, like thermal cameras, to point out that many dinosaurs were warm-blooded and insulated by feathers. (Image courtesy of Apple)

Cinematographer Jamie McPherson on location to film Komodo dragons with the cine-buggy. (Image courtesy of Jamie McPherson)
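McPherson's grass setup is classic back-to-front layered compositing: the clean plate, the CG creature render and the separately shot out-of-focus grass passes are stacked with the Porter-Duff "over" operator on premultiplied alpha. A toy NumPy sketch of that operator is below; the three-layer stack and image sizes are illustrative, not the show's actual comp script.

```python
import numpy as np

def over(fg, bg):
    """Porter-Duff 'over': composite premultiplied RGBA fg onto bg."""
    a = fg[..., 3:4]
    return fg + bg * (1.0 - a)

h, w = 270, 480
rng = np.random.default_rng(1)

# Back-to-front stack: clean background plate, CG creature render,
# then the separately shot out-of-focus foreground grass pass.
background = np.concatenate([rng.random((h, w, 3)), np.ones((h, w, 1))], axis=-1)
creature = np.zeros((h, w, 4))
creature[100:200, 180:300] = [0.4, 0.3, 0.2, 1.0]   # opaque creature region
fg_grass = np.zeros((h, w, 4))
fg_grass[:, ::7] = [0.1, 0.5, 0.1, 0.6]              # soft, semi-transparent strands

comp = over(fg_grass, over(creature, background))
print(comp.shape)  # (270, 480, 4): creature now reads as sitting behind the grass
```

Matching the plate's frame rate and defocus on each layer, as McPherson describes, is what sells the creature as living inside the grass rather than being pasted over it.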
-
WWW.VFXVOICE.COM
OPENING DOORS TO ALTERNATE CHICAGOS AND VIVID NEW WORLDS IN DARK MATTER
By OLIVER WEBB

Images courtesy of Apple TV+.

Every time Jason Dessen (Joel Edgerton) opens a door in the corridor, a different variation of Chicago is revealed. Dark Matter is based on Blake Crouch's bestselling novel that follows family man Jason as he is abducted into an alternate version of his life.

Based on Blake Crouch's 2016 bestselling novel, Dark Matter follows family man Jason Dessen (Joel Edgerton) as he is abducted into an alternate version of his life. To return to his family, he must embark on a journey to save them from his own worst enemy: himself. Philippe Massonnat, who served as Folks VFX's Visual Effects Producer on the Apple TV+ series, had first heard of Dark Matter through a sister company in the Pitch Black film family. "We loved Blake Crouch's book and had previous history with the production VFX team on the show, so it was a no-brainer for us to try to be involved," Massonnat states.

Initial conversations between the VFX team and the producers focused on laying the groundwork to create the various worlds of Dark Matter, producing high-quality realizations of Blake Crouch's worlds.

VFX Supervisor Lionel Lim first heard about Dark Matter as a potential project through his producers at Folks VFX. Lim explains, "Blake Crouch's novel is truly captivating, so we were thrilled to work on the series adaptation. The chance to bring this story to life on screen was exciting and inspiring for me and the entire team. Our initial conversations focused on laying the groundwork to create the various worlds of Dark Matter. The expectation was to produce high-quality realizations of Blake Crouch's worlds. From there, we quickly moved on to concept development."

Folks VFX Visual Effects Supervisor Lionel Lim and his team gathered reference material on Chicago from photos, footage and maps, providing the filmmakers with a solid foundation.

Massonnat understood early on that it was imperative for the work to be grounded in reality. "Chicago being the only constant in the many worlds the characters go through, we knew we would have to work on many variations of the city and its skyline. All of our variations needed to be a realistic version of a 'what if' scenario that could have happened if things had gone differently."

"Having a real city to reference was incredibly helpful, providing us with a solid foundation to stick to and refer back to whenever we had doubts. This approach ensured that the essence of Chicago was consistently captured in each alternate world."
Lionel Lim, VFX Supervisor, Folks VFX

For the bidding process, Massonnat and his team got access to early concepts from Blake's creative team. "They had worked on variations of the potential worlds our heroes would go through," he explains. "It helped us evaluate the workloads for some of the worlds, but also gave us great pointers for the mood and level of creativity Blake and his team were looking for. With the production VFX team on set in Chicago, we also received a lot of references of the city and its surroundings that were very helpful."

Chicago was the only constant in the alternate worlds the characters go through, requiring many variations of the city and its skyline.
Dynamic new iterations of Chicago were built each time the door was opened.

Every alternate world is a variation of Chicago, and all the variations of the city needed to be realistic versions of what could believably happen in an alternate Chicago reality.

"Every time our heroes open a door in the corridor, we had to reveal a different variation that needed to be as stunning, detailed and realistic as the previous one. Whether it would start a whole sequence, like the snow world or futuristic Chicago, or only be for a couple of shots, like the scorched Chicago or the tsunami wave, the quality needed to stay at the highest level."
Philippe Massonnat, Visual Effects Producer, Folks VFX

As every alternate world is a variation of Chicago, Lim and his team gathered as much reference material as possible about the city from various sources. "This included pictures, footage, maps and more," Lim details. "Having a real city to reference was incredibly helpful, providing us with a solid foundation to stick to and refer back to whenever we had doubts. This approach ensured that the essence of Chicago was consistently captured in each alternate world."

From a production standpoint, the main technical challenge of the show was the sheer number of worlds. There were so many worlds to create that different teams (concept, environment, FX and compositing) were spread over many different variations at the same time, working on future episodes and upcoming worlds, and building new iterations of Chicago.

From a production standpoint, the main challenge of the show was the sheer number of worlds. "Every time our heroes open a door in the corridor, we had to reveal a different variation that needed to be as stunning, detailed and realistic as the previous one," Massonnat says. "Whether it would start a whole sequence, like the snow world or futuristic Chicago, or only be for a couple of shots, like the scorched Chicago or the tsunami wave, the quality needed to stay at the highest level. Planning was a very important part of making all this possible; we had to be very mindful of our resources and our scheduling, making sure we had the best possible crew focusing on the right environment or FX at the right time, ready to jump on the next one as soon as they were done, to keep the wheels turning and steadily feed our compositing team."

When it came to streamlining the workload for Lim and his team, the art department created concepts for each Chicago variation and sought early approval with the initial footage. "Simultaneously, our CG team built a rough model of Chicago using satellite data and maps," Lim says. "This Chicago environment was then used, alongside our concept art, as a base to feed each world's variations. We then built specific props and assets to help set [artists] dress the city and adapt the look for each world, whether it was snow banks, snowed-in trees, houses, boats, sand dunes, sci-fi buildings, etc."

Whether it was the snow world, futuristic Chicago, scorched Chicago or the tsunami wave, a consistently high level of quality was required throughout.

Massonnat agrees that the snow world and elevator sequence were technically challenging, though from a producer's perspective he points to the restaurant sequence in Episode 7 as being the trickiest. "We had to build a 180-degree vista from the restaurant at the top of the highest building in the city, above a cloudscape with some of the tallest city buildings peeking through," he describes.
"It's a very emotional sequence for our characters, so the background needed to help convey that emotion. It was a very iterative process. We had to try a lot of options until we nailed the timing [of] the sun position, the color palette and the mood of each section of the sequence, which is spread over about 90 shots from the end of the afternoon until night. We had to be very precise with continuity. Each revision implied that we had to correct a massive amount of shots in a short amount of time to be able to see the sequence as a whole and figure out if it was true to Blake's vision or not. To streamline the process and allow for faster turnaround and reviews, we decided to have a compositing supervisor specifically assigned to that sequence. It helped a lot in quickly proposing revisions and spreading them efficiently to the team. In the end, the sequence is gorgeous and beautifully carries the weight of the decisions being made by our heroes."

"Simultaneously, our CG team built a rough model of Chicago using satellite data and maps. This Chicago environment was then used, alongside our concept art, as a base to feed each world's variations. We then built specific props and assets to help set [artists] dress the city and adapt the look for each world, whether it was snow banks, snowed-in trees, houses, boats, sand dunes, sci-fi buildings, etc."
Lionel Lim, VFX Supervisor, Folks VFX

The mysterious box, which functions as an interdimensional transport device, always appears at the same place in every world, at the same coordinates regardless of the reality, even in the middle of a busy freeway.

All the worlds created for the show initially began with a discussion with Production VFX Supervisor John Heller, and looking at various references to convey the essence of what Blake was looking for. "Then we would move on to concepts to refine the mood and the details and get an early approval, before going on to building the worlds themselves," Massonnat says. "For all the city worlds we used a rough CG Chicago we built as a layout base, as the box always appears at the same place in every world. Then we altered the city by adding props and CG assets, FX simulations, layers of matte paintings projected onto our 3D base, and anything needed to make it the best possible. Even if only seen for a few frames before closing the door, never to be seen again, our worlds needed to be instantly impactful and memorable."

The elevator scene required a high level of detail to make it believable, grounded in reality and compelling at the same time. Simultaneous to the creation of concept art, the CG team built a rough model of Chicago using satellite data and maps. This Chicago environment was then used to differentiate, design and detail the look of each world.

Lim notes that due to the big variety of biomes throughout the alternate Chicago worlds, mixed with the shots themselves often having camera movements, the team decided to use a 2.5D approach for much of the shot work. "We were very rigorous with having proper layout steps so that we would never change camera orientation once we started painting environments. For most of our worlds, we often had heavy matte paintings projected onto our 3D models, using a render as a base. This allowed for quick iterations and a lot of flexibility."

"For all the city worlds we used a rough CG Chicago we built as a layout base, as the box is always appearing at the same place in every world.
Then we altered the city by adding props and CG assets, FX simulations, layers of matte paintings projected onto our 3D base, and anything needed to make it the best possible."
Philippe Massonnat, Visual Effects Producer, Folks VFX

Production on the show lasted around eight months, and the complexity of having such a vast number of worlds to create ultimately meant that the crew needed to be spread over many different variations at the same time. "We had our concept team looking into future episodes, while our environment team was split working on upcoming worlds, building new iterations of Chicago each time the door was opened. Simultaneously, our FX team was busy simulating snowstorms, deadly bee swarms or a gigantic tsunami in finished environments, while our compositing team was delivering the latest renders in beautiful shots," Massonnat says. Folks VFX completed approximately 700 visual effects shots for the show. FuseFX, MPC, Digital Domain, Ghost VFX and Papaya VFX were also vendors on the show.

The snow scenes also demanded a high degree of detail. Even if only seen for a few frames before closing the door, never to be seen again, the different worlds needed to make an instant impact.

Lim explains that the snow world and elevator scenes were the most challenging to capture, as they required a high level of detail. Making each world believable while keeping it grounded in reality was a tricky balance to achieve. Multiple variations of cityscapes depicting different times of day were also required. "Drawing from earlier concepts established by our art department, we had a solid foundation to build upon. The complexity arose from ensuring each cityscape maintained a high level of detail while subtly bringing them to life," Lim adds. "The joy of working on the show definitely came from the opportunity to create vivid and wonderful worlds with our VFX team at Folks. We were fortunate to have an amazing team of artists and supervisors who worked tirelessly to achieve results we can all be proud of."

Massonnat reflects, "Bringing so many worlds and environments to life was incredible. It was a creative challenge, but bouncing ideas and trying out new things to make a world interesting and captivating with our team at Folks was also deeply satisfying, and we were lucky to have a truly fantastic crew ready to push the envelope to make all those amazing worlds unforgettable."
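The 2.5D workflow Lim describes, a matte painting projected through a locked camera onto rough geometry, boils down to a single projection step: each 3D point is pushed through the paint camera into the painting's image plane to fetch its color, which is why the camera orientation must never change once painting starts. Here is a hedged NumPy sketch of that step under a simple pinhole-camera assumption; the focal length and painting resolution are illustrative values, not the show's.

```python
import numpy as np

# Pinhole projection used for camera-mapping a matte painting onto rough
# 3D geometry. Focal length and painting size are illustrative values.
focal_px = 2000.0            # focal length expressed in pixels
width, height = 4096, 2048   # matte painting resolution

def project_to_painting(points_cam):
    """Project camera-space 3D points to pixel coordinates in the painting."""
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    u = focal_px * x / z + width / 2.0    # perspective divide by depth
    v = focal_px * y / z + height / 2.0
    return np.stack([u, v], axis=1)

# Three vertices of the rough city geometry, in the paint camera's space.
verts = np.array([[-5.0, 1.0, 40.0],
                  [ 0.0, 2.0, 60.0],
                  [12.0, 0.5, 90.0]])
print(project_to_painting(verts))
# Each vertex now has a UV into the painting; a shot camera can then move
# a little and re-render the mapped geometry with correct parallax.
```

Because every vertex's lookup is baked against that one projection, changing the paint camera's orientation afterwards would smear the painting across the geometry, which is the reason for the rigorous layout lock Lim mentions.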
-
WWW.VFXVOICE.COM
RECAPTURING THE APOLLO 11 MISSION TO TELL A LOVE STORY ON EARTH IN FLY ME TO THE MOON
By OLIVER WEBB

Images courtesy of Sony Pictures.

Fly Me to the Moon explores the relationship between marketing executive Kelly Jones (Scarlett Johansson) and NASA official Cole Davis (Channing Tatum) as he makes preparations for NASA's historic Apollo 11 moon landing during the tumultuous 1960s Space Race between the United States and the Soviet Union. As the public loses interest in the race, Jones must reignite public excitement around NASA and the space program.

Cole Davis (Channing Tatum) and Kelly Jones (Scarlett Johansson) in Fly Me to the Moon. The film is not a space movie, but a love story focusing on the people on the ground. (Photo: Dan McFadden)

"Framestore led the asset team on [the Saturn V rocket], and the detail that they put into that model is something I'll geek out on forever. I also just want to give credit to our in-house team because they were paramount in determining the creative look of a lot of stuff and finding the right research and stock footage to make sure it worked. Then cutting in the stock footage and using that as previs to then basically do a CG version of it later."
Sean Devereaux, Visual Effects Supervisor

Sean Devereaux served as Visual Effects Supervisor on the Apple Studios/Sony film, joining the project as post-production commenced in March 2023, with production wrapping a year later. "Harry Jierjian, the Editor, was editing throughout production and had a really good assembly, so we were ready to work immediately," Devereaux states. "Director Greg Berlanti likes to see stuff pretty far along before some of the cuts, so we brought on three in-house compositors who absolutely saved our butts and were a huge creative part of the show, not just to get temps out, but to really help us develop the look of things."

Cole Davis (Channing Tatum), Moe Berkus (Woody Harrelson) and Kelly Jones (Scarlett Johansson). Creative references included 1960s Space Race documentaries and recent films such as First Man. (Photo: Dan McFadden)

When it came to creative references, the team looked at the available Space Race documentaries as well as recent films such as First Man. "We watched all the films," Devereaux notes. "We didn't avoid watching them, but we certainly didn't want to mimic them. It would be impossible for us to not look at films that were so well researched. We looked at the Apollo 11 documentary and licensed some of the footage that came from that documentary. We had a full-time archivist, and we looked through every frame of film that was available to us, which was a lot." A self-proclaimed Space Race geek himself, Devereaux enjoyed watching everything he could get his hands on, which included hundreds of hours of material. "Some things I had never seen before," he admits. "We did change some shot designs if they were too similar to, say, First Man, to make it our own, even if it wasn't intentionally duplicated."

The 1960s Space Race between the U.S. and the Soviet Union serves as the backdrop of the film. While the Space Race isn't the central premise, effects were essential to project realism and heighten the drama. Invisible VFX were required for period corrections.
(Photo: Dan McFadden)

"Director Greg Berlanti likes to see stuff pretty far along before some of the cuts, so we brought on three in-house compositors who absolutely saved our butts and were a huge creative part of the show, not just to get temps out, but to really help us develop the look of things."
Sean Devereaux, Visual Effects Supervisor

Devereaux and his team worked closely with NASA, who were constantly involved with the project. "We used a lot of stock footage from NASA, but we also had CG shots that had to feel like they were stock footage. At that point in time, it was the most photographed event in history, so we have dozens of hours of footage. Greg really wanted it to feel authentic. It's not really about space, it's about the love story. So, he wanted the story to be about the people on the ground, not about what's happening in space. In order to do that we really had to nail the realism, so at no point are you taken away from the story. Even if you have a really stunning visual effects shot, it was visual effects no matter how real it looks. So, we really tried hard to place our cameras where they could only be placed. For example, the Apollo 11 moon landing only had a single camera position, but with Apollo missions 12, 14, 15, 16 and 17, they had a lot more camera placements, so we did use stock footage and match-mimicked stock footage with CG just to give the audience more of a sense of the story. Technically, a lot of the cameras weren't there until the later missions, but we took some creative liberty there."

Apollo 11 astronauts rehearse for a moonwalk on a moon set, which served as a backup just in case the mission failed, because failure was not an option for NASA in Fly Me to the Moon. There were about 1,000 visual effects shots required for the film, with work split up between Framestore, RISE, Zero VFX, Nexodus and Ingenuity Studios. (Photo: Dan McFadden)

Berlanti was adamant that he didn't want the effects to be showy in any shape or form, as he felt this would take away from the central premise of the film. "I storyboarded a launch sequence that followed the rules and only put cameras where NASA put cameras, but once Greg saw it, he said it looked too prepared for what we were doing," Devereaux explains. "Greg felt that was too much and was taking away from the people on the ground by showing how majestic this launch could be. We still want to obviously show the scale of what happened, which is massive. The Saturn V rocket is still the most powerful machine ever built by man, so we wanted to show that power. We got to do it in some longer shots that we let sit, rather than doing cutaways to rockets going off. That was the biggest visual language, and Greg needed to make sure that our stuff was not at all discernible from the stock footage. This includes crowd shots, rocket launches, moon landings and all the period corrections."

"[T]he Apollo 11 moon landing only had a single camera position, but with Apollo missions 12, 14, 15, 16 and 17, they had a lot more camera placements, so we did use stock footage and match-mimicked stock footage with CG just to give the audience more of a sense of the story. Technically, a lot of the cameras weren't there until the later missions, but we took some creative liberty there."
Sean Devereaux, Visual Effects Supervisor

The moon set. The VFX team studied the Apollo 11 documentary and licensed some of the footage that came from that documentary.
They also had a full-time archivist and combed through every frame of available footage.

Although Devereaux knew he'd be facing a daunting challenge, he was excited to get started on the project and already had a few ideas in mind in terms of what he wanted to achieve. "Greg knew the movie, and he won that battle quickly, but initially, my scope was to go bigger with this. Very few people on this planet will be able to tell what is stock footage and what are our CG shots. The approach to this project is different because it's so real, so I'm not going to change what is there, but that's my job: to give the director options and understand their vision as quickly as I can. The creative in this wasn't like other movies where you have a completely blank slate to work with; we had very clear elements and pieces we had to work with. It was a more technical than creative approach because we really did go down and figure out and mimic all the things that NASA had previously done, so when you are watching the film you can't tell that it was visual effects."

Kelly Jones (Scarlett Johansson) and Ruby Martin (Anna Garcia).

Visual Effects Supervisor Sean Devereaux and his team worked closely with NASA, which was constantly involved with the project to ensure authenticity.

There were just over 1,000 visual effects shots required for the film. The work was split up between five vendors (Framestore, RISE, Zero VFX, Nexodus and Ingenuity Studios) who were all brought on to provide their unique skillsets required for the film. "There was a variety of work that included the moon, rocket launches, stars in the sky and a lot of period fixes and crowd work," Devereaux says. "There's also a lot of CG cars in this because there were only so many cars we could get from 1969 that looked brand new. There are thousands of CG cars in this movie; that was a really fun challenge. Most of my work is invisible, and as I said before, it's hard to see what we did. This movie fits squarely into that category, which I honestly love. I want the story to be highlighted and not show off effects."

The astronauts in Fly Me to the Moon. Director Greg Berlanti needed all the imagery shot for the film seamlessly blended with the stock footage, including crowd shots, rocket launches, moon landings and all period corrections.

Kelly Jones (Scarlett Johansson) and Ruby Martin (Anna Garcia). The actual Apollo 11 moon landing only had a single camera position, while other Apollo missions had more camera placements. Taking some creative liberty, stock footage and newly-minted stock footage with CG were used to give the audience more of a sense of the story.

The entire control room set was built on a gimbal, allowing production to shake the set on lift-off of the giant rocket. VFX was tied into the shaking to make the moment feel as dramatic as possible.

NASA headquarters in Fly Me to the Moon. There are many CG cars in the film due to the limited availability of brand-new-looking cars from 1969.

Director Greg Berlanti wanted to show the massive scale and power of the Saturn V rocket, but he felt that showing how majestic the moon launch could be would take away from the story of the people on the ground. Longer shots, rather than cutaways of the rocket going off, were used to show the power of the launch.

A lot of stock footage from NASA was used, so the CG shots had to feel like they were stock footage to ensure that few people would be able to tell the difference.
Director Greg Berlanti didn't want the effects to be so attention-grabbing that they distracted from the focus of the film.

"There was a variety of work that included the moon, rocket launches, stars in the sky and a lot of period fixes and crowd work. There's also a lot of CG cars in this because there were only so many cars we could get from 1969 that looked brand new. There are thousands of CG cars in this movie; that was a really fun challenge. Most of my work is invisible, and as I said before, it's hard to see what we did. This movie fits squarely into that category, which I honestly love. I want the story to be highlighted and not show off effects."
Sean Devereaux, Visual Effects Supervisor

One of the most challenging aspects of the project for Devereaux and his team involved the Apollo 10 launch. "It's actually a side scene and not really a focus of the story of the film," he explains. "It's not a huge story point and it's barely mentioned in the movie, but it's the first time Scarlett Johansson's character really understands the weight and power of what she is involved with. She's never seen a launch before, and it's a nighttime launch. We had no stock footage of it, so we did do a lot of CG there. We had to balance Dariusz Wolski's cinematography, which was lit dark, with being able to backlight Scarlett from the rockets that she's looking at. The entire control room set was built on a gimbal, so they shook the set, and we had to just tie our visual effects into that and make it feel as real as that."

VFX was deployed to put wings under the romance between Kelly Jones (Scarlett Johansson) and Cole Davis (Channing Tatum) in Fly Me to the Moon.

Concludes Devereaux: "Getting to create a Saturn V rocket was probably the most enjoyable part of the project. It still blows my mind that we put people on a 33-story building and sent it into space. It does not cease to amaze me the more I learn about it. Framestore led the asset team on it, and the detail that they put into that model is something I'll geek out on forever. I also just want to give credit to our in-house team because they were paramount in determining the creative look of a lot of stuff and finding the right research and stock footage to make sure it worked. Then cutting in the stock footage and using that as previs to then basically do a CG version of it later. They were a huge asset, and they were there from before I was until the very end. They went from temp work through to final shots."
-
WWW.VFXVOICE.COM
ILM RETURNS TO FEATURE ANIMATION FOR ULTRAMAN: RISING
By TREVOR HOGG

Established in 1966, the Ultraman franchise remains alive and well in its ongoing battle between extraterrestrial supernatural beings and kaiju, with ILM entering into the cosmic fray to make a career ambition for American animator/director Shannon Tindle a reality for him and Netflix. The concept gets turned on its head for Ultraman: Rising as a baseball phenomenon, struggling to accept and execute the responsibilities of Ultraman, becomes the guardian of a baby kaiju hunted by an unscrupulous government agency. Having previously worked on Lost Ollie for Tindle, where a digitally-animated toy rabbit had to be integrated into live-action plates, Visual Effects Supervisor Hayden Jones found himself in the middle of a CG animated feature narratively driven by some serious parental angst.

The way that Art Director Sunmin Inn depicted the moon and stars in the concept art resembled a crib mobile, which was retained for the final image.

"The toughest challenge on Ultraman: Rising was being true to the artwork created by Production Designer Marcos Mateu Mestre and Art Director Sunmin Inn, which had a beautiful style that merged manga and anime techniques. We have to be able to light our characters in any way possible but still keep the look. What we have to do is play off of shape, form, shader design and baked-in textures. We have a lot of filtration on the image that creates this slightly marker-pen watercolor feel. It's about balancing all of these things together to find something that feels like the original artwork but gives you the flexibility to introduce elements of cinematography and animation."
Hayden Jones, Visual Effects Supervisor, ILM

After a 13-year-long lull, ILM has gotten back into feature animation with Transformers One and Ultraman: Rising. "Because so many live-action films now are becoming so large, and having such a large amount of environment and character work being brought in, they're almost getting to the stage where the pipeline has to deal with them like animated features," notes Jones, who partnered with ILM facilities in London, Vancouver and Singapore. "On an animated feature, you're obviously building the whole world, so there's the sheer volume of objects that you have to build, especially when you're stylizing the world as well, because you can't take objects off the shelf. Then there is the complexity of having the world built, bringing the camera in and layering in animation. There are so many stages, and you're in control of everything. It's a huge logistical challenge to make a feature animation."

The scene in which Ken Sato starts to connect with Emi through baseball was one of the first to demonstrate to ILM that the animation style was working.

Growing up in the U.K., Jones was not familiar with Ultraman, so Tindle provided DVDs of the original animated series to familiarize him with the franchise. "I remember watching them and realizing the cultural importance of Ultraman. We had a responsibility to make the Japanese culture feel as authentic as possible. We had a cultural committee who was onboard from the beginning to help us get the details right. There were some amazing things like the way Ken and Ami break apart their chopsticks and pick up the food; every single detail has gone through multiple layers of people to make sure that's exactly the way it would happen in Japan. Tokyo is a character in its own right.
We're creating a stylized version of Tokyo that is not photoreal but has Tokyo Tower; Tonkatsu Tonki is a real restaurant where you can eat the best tonkatsu in Tokyo, and the bookstore that Emi walks up to and shatters all of the windows of really exists in Yokohama."

Tiny details were worked into the glowing eyes of Ultraman to indicate his emotional state and the direction of his gaze.

"The toughest challenge on Ultraman: Rising was being true to the artwork created by Production Designer Marcos Mateu Mestre and Art Director Sunmin Inn, which had a beautiful style that merged manga and anime techniques," Jones states. "If you look at the illustration, the lighting is baked in to perfection for that one frame. We don't have the luxury of that when creating a movie. We have to be able to light our characters in any way possible but still keep the look. What we have to do is play off of shape, form, shader design and baked-in textures. We have a lot of filtration on the image that creates this slightly marker-pen watercolor feel. It's about balancing all of these things together to find something that feels like the original artwork but gives you the flexibility to introduce elements of cinematography and animation."

Even though Tokyo is stylized, if one looks carefully, actual landmarks were incorporated into the urban layout.

"There were some amazing [authentic] things like the way Ken and Ami break apart their chopsticks and pick up the food; every single detail has gone through multiple layers of people to make sure that's exactly the way it would happen in Japan. We're creating a stylized version of Tokyo that is not photoreal but has Tokyo Tower; Tonkatsu Tonki is a real restaurant where you can eat the best tonkatsu in Tokyo, and the bookstore that Emi walks up to and shatters all of the windows of really exists in Yokohama."
Hayden Jones, Visual Effects Supervisor, ILM

A Gigantron hatchling known as Emi gives competition to Grogu when it comes to infantile cuteness. "There is a great scene when Ultraman keeps changing between Ultraman and Ken to make Emi realize he's not a threat," Jones describes. "Emi goes through a change of personality almost every second, and that actually came from Shannon showing me photos of his daughter literally taken over the space of a minute, and you can see that she's curious, happy and in a massive tantrum, all in quick succession." An unexpected difficulty was the armor suit worn by Ultraman. "We couldn't shade Ultraman's suit with a metal shader and expect it to feel like an illustration. We came up with layers of texture work so we could smear the edges so they feel more like a marker-pen reflection."

Hair was another important element to get right. "The hair is such an intrinsic part of the character, especially for Ken and Ami," Jones remarks. "We didn't want it to feel like we were simulating every strand. The hair pieces needed to feel blocky, as sculpted forms that were part of the character design, but then we took that and made it feel like an illustrative version of hair. It's lots of unseen, helping texture passes where we take highlights and smear them down the forms of the hair. It makes you feel there are hair strands, but it's actually smearing highlights that creates this beautiful graphic sense rather than a sense of realism."

A lot of design work went into the various Spacium Beams to reflect the characters of Ultraman, Emi and Gigantron.

Sunglasses are a motif for the protagonist Ken Sato and his antagonist Dr. Onda. "Ken Sato's sunglasses were a dream!" Jones laughs.
"We placed stylized reflections in the sunglasses in exactly the same way between Ken Sato and Dr. Onda, but the character design for Dr. Onda makes the sunglasses feel quite threatening whereas Ken's look cool. The scene where Ken is having his interview with Ami Wakita and he is leaning back and then forward into the light; we were developing that as one of our look-of-picture sequences. One of our artists came up with the idea of doing a graphic wipe across the lenses. Version one is what you see because it looked so good."

Getting the groom right for Ken Sato required modular pieces and smeared highlights to create the impression of strands of hair.

"There is a great scene when Ultraman keeps changing between Ultraman and Ken to make Emi realize he's not a threat. Emi goes through a change of personality almost every second, and that actually came from [director] Shannon [Tindle] showing me photos of his daughter literally taken over the space of a minute, and you can see that she's curious, happy and in a massive tantrum, all in quick succession."
Hayden Jones, Visual Effects Supervisor, ILM

Dramatic moments include the parental killing by the Kaiju Defense Force leading to the hatching of the egg containing Emi. "Shannon had had this idea for an Ultraman movie 24 years ago, and that death scene was one of the few that has stuck through all of the different versions of Ultraman: Rising," Jones reveals. "We saw the storyboards that John Aoshima [co-director] made and were excited because of the fight outside of the baseball stadium and the sky chase, not only for the performance but also visually. You have to calm everything down. It's the calmness in the scene that allows subtle animation to play out. We created this bioluminescent look for Gigantron so that as she gets angry all of these bioluminescent veins appear across her wings and down her back. It was designed to be aggressive in bioluminescence, but we actually realized what we wanted was to convey this heartbeat that is gradually slowing to the moment where she dies."

When Ultraman witnesses the hatching of Emi, it was important to evoke the sense of wonder that a father has when looking at his daughter for the first time. "When we saw the concept art, Sunmin Inn, the Art Director, had created this beautiful piece of Ultraman looking at Emi in the ocean, and the moon and stars were almost hanging on wires above," Jones recalls. "I remember saying, 'This is so beautiful. Why don't we use that as the background? Let's separate the elements out, hang them dimensionally and recreate that painting.' Over 50% of that shot is actually Sunmin's painting."

Footage of actual toddlers of the production team was referenced when devising how Emi would walk and move.

Missiles destroy the home of Ken Sato in a scene that resembles the Malibu mansion attack in Iron Man 3. "One of the joys of Ultraman: Rising has been rethinking how we do effects work to make the effects feel stylized in this stylized world," Jones explains. "For that, I was collecting anime and manga references for months to see how stylized we could push explosions, rocket trails, water and smoke. Underneath all of our effects there is an animation element so we can art direct. When the house explodes, we created modular pieces of explosion that we could almost art direct and create shapes with to give a stylized form to everything.
"There is some great work on the rocket trails, which have little smears, streaks and ink lines that give you the sense of motion."

To create an illustrative quality, a Copic marker aesthetic was incorporated into the animation style.

Fatally damaged during the home attack is Mina, the spherical personal supercomputer assistant to Ken Sato. "We gave Mina color coding so when she gets angry her eye strip goes red and looks threatening. Mina tilts her head down and tells off Ken at one stage," Jones states. "It's amazing how much character you can get into something that is essentially a floating sphere. Her death scene is traumatizing. We built this variant where she has been impacted by the explosion and is slowly losing consciousness. I had the effects team simulate little sparks. As the scene progresses and she slows down, the sparks get fewer, so it's like a heartbeat that is slowing down."

Laser beams are part of the persona of the kaiju and Ultraman. "We knew that at the end of the movie we were going to have not only Ultraman using the Spacium Beam but also Gigantron and Emi blasting at the same time," Jones notes. "We needed to differentiate all of these different styles of beams against each other. Obviously, we had to pay homage to traditional Ultraman Spacium Beams. It was looking at what made the Spacium Beam so iconic in the first place and bringing that into our style. But with Gigantron and Emi we had to make sure that they felt related."

Baked-in textures and filtration were key components in emulating what an illustrator would do.

"Whether we could animate Ultraman's eyes was a big question [early on] because in most of the Ultraman series his eyes are completely static. Because Ken is so often in his Ultraman form and has to carry quite a lot of comedic and moving moments, we started testing how to give him eye movement. Even though it looks traditionally like Ultraman, actually lots of small details went into those eyes to connect with the character."
Hayden Jones, Visual Effects Supervisor, ILM

Ultraman: Rising was one of the first films at Netflix to be delivered in HDR. "All the way through the process, whenever we were doing any lighting or compositing, we were always looking at it in HDR," Jones states. "We had so much more latitude in exposure and color because you can not only get brighter but way more saturated in HDR imagery. The color palette is huge in this film. There are great moments in the final battle where the sky almost goes completely purple, but with a bolt of lightning it's bright. Getting that real dynamism in the lighting and grade was helped by using the HDR pipeline." (A short worked example of HDR headroom also appears at the end of this article.)

Facial movement is not something that can be relied upon for Ultraman. "We don't have eyebrows and his mouth is completely static, so it's a huge challenge for the animators to bring a character to life when there's so little to animate," Jones remarks. "Whether we could animate Ultraman's eyes was a big question [early on] because in most of the Ultraman series his eyes are completely static. Because Ken is so often in his Ultraman form and has to carry quite a lot of comedic and moving moments, we started testing how to give him eye movement."
"Even though it looks traditionally like Ultraman, actually lots of small details went into those eyes to connect with the character."

Ultraman: Rising marks the return of ILM to feature animation after Rango, which was released in 2011.

Ultraman battles a mechanical Gigantron sent to trick Emi into revealing the whereabouts of Kaiju Island.

A cool visual element was the sunglasses worn by Ken Sato, which provided the opportunity to have graphic wipes across the lenses.

The art department at Netflix designed the gunships. "I hope that we get toys of them because I want to own one!" Jones chuckles. "With the gunships, we were playing into Osprey planes and helicopters in the way that they move. One of the big challenges was when you had vehicles that could never exist in the real world. The Destroyer at the end is probably the most iconic of all of the vehicles. Luckily, our Layout Supervisor, Kyle Winkelman, had done many Transformers movies, and he had some great ideas about how to break the ship apart, pull it around and create a sense that it transforms in midair, has huge jets that are slowing its descent and switch off at the last minute, and lands with a huge eruption of water. Kyle spent a day or two blocking out these ideas, and I remember seeing them with Shannon for the first time; everyone's faces lit up and they said, 'Oh, yeah. That's our bad guy.'"
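The "marker-pen watercolor" filtration Jones describes above is proprietary to ILM's pipeline, but the general idea of flattening rendered color into bands and letting edges bleed can be sketched in a few lines. The following is a minimal, hypothetical illustration in numpy; the function name and parameters are invented for the example and are not ILM's tools.

```python
import numpy as np

def marker_pen_filter(frame, levels=6, bleed=2):
    """Rough nod to a marker-and-watercolor post filter: posterize color
    into flat bands, then let neighboring pixels bleed together so edges
    feel hand-drawn. `frame` is an HxWx3 float array in [0, 1]."""
    # 1. Posterize: snap colors to a small set of flat, marker-like bands.
    banded = np.round(frame * (levels - 1)) / (levels - 1)
    # 2. Bleed: a crude box blur so the bands soak into each other slightly.
    #    (np.roll wraps at the borders, which is fine for a sketch.)
    out = banded.copy()
    for axis in (0, 1):
        acc = np.zeros_like(out)
        for shift in range(-bleed, bleed + 1):
            acc += np.roll(out, shift, axis=axis)
        out = acc / (2 * bleed + 1)
    return np.clip(out, 0.0, 1.0)

# Example: run the filter over a synthetic gradient standing in for a frame.
frame = np.dstack([np.linspace(0, 1, 64)[None, :].repeat(64, 0)] * 3)
stylized = marker_pen_filter(frame)
```

A real pipeline would of course work per shot, in scene-linear color and with art-directed masks; the point is only that the look comes from post-processing the render, not from the shading alone.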
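On the HDR latitude Jones mentions: the extra headroom comes from delivering in a high-dynamic-range encoding. The article does not say which transfer function the show was mastered in, but the standard SMPTE ST 2084 (PQ) curve illustrates the point: a 100-nit "SDR white" lands at only about half the signal range, leaving the rest for brighter, more saturated highlights. The formula below is the published reference curve, not anything specific to this production.

```python
import numpy as np

# SMPTE ST 2084 (PQ) inverse EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance in cd/m^2 to a 0-1 PQ signal."""
    y = np.asarray(nits, dtype=np.float64) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1 + C3 * ym)) ** M2

print(pq_encode([100, 1000, 10000]))  # ~[0.51, 0.75, 1.0]
```

In other words, everything above code value ~0.51 is headroom that simply does not exist in an SDR grade, which is where the purple skies and lightning flashes Jones describes get their punch.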
-
WWW.VFXVOICE.COM
REBUILDING ANCIENT ROME FOR THOSE ABOUT TO DIE
By TREVOR HOGG
Images courtesy of Peacock TV.

Productions of an epic scale are a signature of Roland Emmerich, who shifts his focus from lunar destruction to a time when the Roman Empire appeased the disenchanted masses with blood sports that made gladiators famous and their patrons richer. Hitching a ride on the chariot through the 10 episodes of Peacock's Those About to Die is Peter Travers, Emmerich's go-to visual effects expert since Midway.

Skies proved to be difficult because the clouds had to have the ability to move during takes and not become static.

"There is a point in the story where they have to collect some sulfur from Mount Vesuvius inside of the original caldera. We scouted a place at Vesuvius with a drone camera and flew the [Sony] Venice 2 into a lava flow that was there from the original eruption in 79 AD. We did 360s. That footage we brought back and processed because the wall at Cinecittà Studios is eight meters tall by approximately 51 meters in circumference. We had pumice rocks onstage and had to build a diorama of the environment of Vesuvius, which was based on the tiles that we happened to get under perfect weather conditions. Then we did some augmentation to the plates."
Peter Travers, Visual Effects Supervisor

"This was an incredibly complicated show, mainly because of virtual production," Travers notes. "One of the lucky things was that we had an idea of what Rome looked like, but even then the scenes were complex. We had 900 visual effects shots at the end, but over 1,800 shots that were accomplished in virtual production without touching them. It was a 2:1 ratio of shots that were fully realized on the virtual production stage." The work was divided between DNEG and Dream Machine FX. Dream Machine FX handled the wide aerials of Rome and compositing shots. DNEG did all of the heavy lifting for Circus Maximus and the Colosseum. "You can have vendor bloat on a show like this because the data set is so massive. If we had too many vendors it would have been a mess, because how do you share Rome across everybody? The most we could have was two," Travers says.

Practical and digital horses were integrated with each other.

Thirty-five unique environments were divided between what could be built on set and what had to be extended digitally. "There is the footage that we shot that was primarily put on the wall," Travers explains. "We had the weird subset of rendering out CG that was put onto the wall pre-rendered. But the vast majority of it was Unreal Engine, so it had to be built and lightweight enough that we could get 24 fps playback, because we did do camera tracking on the show. It's getting the breadth of environments ready to go and all that it entails. There are a number of environments that are almost purely 2D even though we actually displayed them in Unreal Engine. There is a point in the story where they have to collect some sulfur from Mount Vesuvius inside of the original caldera. We scouted a place at Vesuvius with a drone camera and flew the [Sony] Venice 2 into a lava flow that was there from the original eruption in 79 AD. We did 360s. That footage we brought back and processed because the wall at Cinecittà Studios is eight meters tall by approximately 51 meters in circumference. We had pumice rocks onstage and had to build a diorama of the environment of Vesuvius, which was based on the tiles that we happened to get under perfect weather conditions. Then we did some augmentation to the plates."
"The pixel resolution of the wall is 16K, and it is approximately a semicircle. If we had to build 360 dioramas, those were 32K, and then they have to play back at 24 fps. We gen-locked it to the camera so you're right facing. That was an enormous undertaking. We had just enough resolution to maximize the wall but not too much that we couldn't achieve playback." (The arithmetic behind those numbers is sketched at the end of this article.)

The chariot racing with the actors was done as a poor man's process.

The different stages of construction of the Colosseum happen across four episodes.

One of the things Visual Effects Supervisor Peter Travers and his team discovered was that four horses charging at full gallop in one chariot results in a mess lacking all precision, with the horses bumping into each other and fighting.

"A lot of times, what we were able to do in order to creep into that daylight effect is that the wall was as bright as it could be. We put practical lights to simulate the sun and used the low-light capability of the Venice 2 to boost up the contrast and light intensity from our fake sun to get much closer to sunlight. Can you do noon sunlight on a virtual production stage? Not yet. But we crept in there. [Director] Roland Emmerich wanted to live in magic hour, so the vast majority of our scenes were in the desired lighting conditions."
Peter Travers, Visual Effects Supervisor

A major asset was the Venice 2 camera with its low-light sensitivity. "Whenever you walk onstage, the set is always brighter than what's on camera," Travers observes. "This is the opposite. We would go on set and it's dark. You can't see anything. Then you look through the monitor and go, 'It's there.' A lot of times, what we were able to do in order to creep into that daylight effect is that the wall was as bright as it could be. We put practical lights to simulate the sun and used the low-light capability of the Venice 2 to boost up the contrast and light intensity from our fake sun to get much closer to sunlight. Can you do noon sunlight on a virtual production stage? Not yet. But we crept in there. Roland Emmerich wanted to live in magic hour, so the vast majority of our scenes were in the desired lighting conditions. It was rare that we would be shooting in noon sunlight anyway."

Over 1,800 shots were accomplished in virtual production without having to be augmented later in post-production.

All of the departments were involved with the wall content, including costumes. "The crowd shots that [went] on the wall were cards, so [they became] pre-rendered people," Travers states. "But even for those pre-rendered people, like the plebeians in crowds for Circus Maximus, we had to get those from the costume department way early on in pre-production and render them out and put them in. Circus Maximus and all of these other environments like the Colosseum were done in Unreal Engine with a 3D environment because we were using camera tracking." Lessons were learned along the way. "The ongoing joke on set was, 'We're going to get really good at this on the last day of shooting!' The first day of shooting was tough because we were trying to figure out where does the practical set stop and where does the Unreal Engine set begin. What we discovered [is that it could be achieved by] layering the practical set on the rotating stage and also having layering even within the Unreal Engine environment. There are a lot of scenes, in particular Leptis Magna, where the slaves in North Africa are picked up and shipped off to Rome. The original background plates were shot in Morocco."
"We brought those plates in and started to build a 3D environment. We had to figure out with our Production Designer Johannes Muecke what kinds of cages [to use], because we're going to have to build those virtually. We had a real giraffe on the virtual production stage, and behind it we needed to have a CG giraffe. We had to get started building the CG giraffe and put that in."

"Circus Maximus and all of these other environments like the Colosseum were done in Unreal Engine with a 3D environment because we were using camera tracking. The ongoing joke on set was, 'We're going to get really good at this on the last day of shooting!' The first day of shooting was tough because we were trying to figure out where does the practical set stop and where does the Unreal Engine set begin. What we discovered [is that it could be achieved by] layering the practical set on the rotating stage and also having layering even within the Unreal Engine environment."
Peter Travers, Visual Effects Supervisor

A number of environments are almost purely 2D even though they are actually displayed in Unreal Engine.

Quick turnaround is the biggest advantage of a rotating stage. "On every show that you do, you have two people talking, and now you have to shoot it from the other way," Travers remarks. "It took a minute or two to completely rotate the stage 180 degrees. All of our sets worked in a 360 fashion. Then, in Unreal Engine we would look [at it] the other way and lock it in. By the end, we were turning around just as fast as if we were on a practical set." Kit-bashing was not enough for the major environment build of Rome, so a massive Roman model was leased. "People forget that Rome had transformed over the course of 500 years. We picked approximately 100 AD because it was closest to when the Colosseum had been built. The most important thing was the layout, so where is Circus Maximus and the Colosseum, where is the Forum relative to that, how big is Palatine Hill, and how does the Tiber River go through? Had we not acquired this model, we wouldn't have made it. But I don't want to undersell the amount of effort that DNEG put in, because the set decoration is at least double the data set that you have to consider. It got even more complicated because we needed to have digital crowds in the background. The thing with any kind of animation on a virtual production stage is, all of the animation has to loop, so we had to figure out intelligent ways to have digital characters walking in the background, which happens in almost all of our scenes. We had to hide them and then have them come around again. The takes were so long that we weren't cueing the action. (A toy example of loop-safe animation appears at the end of this article.) The different stages of construction of the Colosseum happen across four episodes. We had to go from its earliest construction to all of it being completed. We were constantly [dressing the set] with cranes and activity. All of that stuff enriched the environment."

"The most important thing was the layout, so where is Circus Maximus and the Colosseum, where is the Forum relative to that, how big is Palatine Hill, and how does the Tiber River go through? Had we not acquired this model, we wouldn't have made it. But I don't want to undersell the amount of effort that DNEG put in, because the set decoration is at least double the data set that you have to consider."
Peter Travers, Visual Effects Supervisor

The chariot races were shot on a track situated at Cinecittà World in Rome.

Skies proved to be difficult because the clouds had to have the ability to move during takes and not become static.
"Rendering volumetric clouds at that resolution would have been foolish," Travers acknowledges. "DNEG had a proprietary tool that would take an HDRI and slowly loop it over two minutes. You didn't notice it, but it had this beautiful movement in the sky. There are shots where we are looking straight up at the sky. During the road to Rome, you're looking up at this line of crucifixes, and the cloud movement that you're seeing is a procedurally animated still. It didn't pop when it looped. It smoothly went back to frame one so we could go on forever. There were a few times where we had to cue things, and it did loop, in particular when we shot footage of Morocco and Sicily. Any of those things, like blowing reeds and grass, we had to figure out a way to intelligently get them to loop back to frame one. With those things, we did hit play when Roland yelled, 'Action.' But we were also covered, in a pinch, by the fact that it would loop if that happened." Unreal Engine was not suited for everything. "The chariot racing with the actors was done as a poor man's process. We had them standing on a chariot, and then behind them you're seeing the other oncoming chariots; that was all pre-rendered at 16K at a certain length. We would get the actors ready and shoot all of those shots by rolling back to frame one, because those couldn't loop, but it was the only way to get horses on the wall that looked good. I was also the director of the action unit, so I shot all of the live-action chariot stuff [on a track at Cinecittà World]. We used that for rotomation, rendering and look development because we already had answers as to what photoreal horse chariots looked like. One of the things that we discovered is that four horses charging at full gallop in one chariot is a mess. It is not precise at all. They are constantly bumping into each other and fighting. Because I had shot a lot at a high frame rate, we could even see in slow motion at 96 fps the details of how the muscles worked."

"There are a lot of scenes, in particular Leptis Magna, where the slaves in North Africa are picked up and shipped off to Rome. The original background plates were shot in Morocco. We brought those plates in and started to build a 3D environment. We had to figure out with our Production Designer Johannes Muecke what kinds of cages [to use], because we're going to have to build those virtually. We had a real giraffe on the virtual production stage, and behind it we needed to have a CG giraffe. We had to get started building the CG giraffe and put that in."
Peter Travers, Visual Effects Supervisor

Thirty-five unique environments were divided between what could be built on set and what had to be extended digitally.

A major asset was the Sony Venice 2 camera with its low-light sensitivity.

A drone flew a Sony Venice 2 camera to capture a lava flow left from the original eruption of Mount Vesuvius in 79 AD.

The size of sets was determined by knowing what was going to be captured in-camera.

All of the departments were involved with the wall content, including costumes.

Those About to Die was shot on the LED volume at Cinecittà Studios in Rome.

"When you're building a set, you only build where you know you're going to go with the camera," Travers notes. "In Unreal Engine we were constantly doing that, where we would start with the Rome model, know the envelope of where we were going to shoot, and start deleting all of the distant things that we weren't going to need. Sometimes it was tough."
"There was a rooftop where we needed to look out at a lot of Rome, so we had to keep most of Rome in the distance. The challenge with that is we wouldn't achieve 24 fps playback. There is a lot of artistry in making things efficient enough to load and play back, but you have to keep in mind it is set decoration and that can change. Roland would come to us and say, 'I want to move that temple over there. Can I do it?' Sometimes we couldn't do it, but for the most part we could. Roland was good about not being a kid in a candy store and saying, 'I want to move everything around.' Roland has always been good at working within the box of what he has."
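Travers' wall numbers are worth unpacking. Taking only the figures quoted in this article (a wall roughly 8 meters tall and 51 meters in circumference, driven at "16K" horizontal resolution, playing at 24 fps), a back-of-the-envelope calculation shows both the physical pixel pitch and why playback bandwidth becomes the limiting factor. The assumption that 16K means 16,384 pixels, and the 8-bit RGB figure, are illustrative choices, not production facts.

```python
# Back-of-the-envelope numbers for the LED wall described above.
width_m, height_m = 51.0, 8.0        # per the article: ~semicircular wall
h_pixels = 16_384                    # assume "16K" = 16,384 px across
pitch_mm = width_m * 1000 / h_pixels # physical size of one pixel
v_pixels = int(height_m * 1000 / pitch_mm)

fps = 24
bytes_per_px = 3                     # assume 8-bit RGB; real feeds may use more
frame_bytes = h_pixels * v_pixels * bytes_per_px
print(f"pixel pitch      ~{pitch_mm:.1f} mm")                      # ~3.1 mm
print(f"vertical pixels  ~{v_pixels}")                             # ~2570 px
print(f"uncompressed feed ~{frame_bytes * fps / 1e9:.1f} GB/s")    # ~3 GB/s
```

Roughly 3 GB/s of uncompressed pixels for the semicircle, and double that for the 32K full-360 dioramas, which is why Travers talks about having "just enough resolution to maximize the wall but not too much that we couldn't achieve playback."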
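The looping constraint Travers describes, where every background animation and sky must return seamlessly to frame one, is commonly handled by crossfading the tail of a clip into its head. Here is a minimal sketch of that general idea; the function name is invented, and this is not DNEG's proprietary HDRI tool.

```python
import numpy as np

def make_loopable(frames, blend=24):
    """Crossfade the last `blend` frames into the first `blend` frames so
    the clip returns smoothly to frame one. `frames` is an (N, H, W, C)
    float array; the result is N - blend frames long and tiles seamlessly."""
    n = len(frames)
    assert blend < n // 2, "blend window must be shorter than half the clip"
    out = frames[:n - blend].copy()
    # Weights ramp 0 -> 1 across the blend window.
    w = np.linspace(0.0, 1.0, blend).reshape(blend, 1, 1, 1)
    # The head becomes a mix of itself and the clip's tail, so the last
    # output frame flows continuously into the first when the clip repeats.
    out[:blend] = (1.0 - w) * frames[n - blend:] + w * frames[:blend]
    return out
```

The wrap point is continuous because output frame zero is exactly the frame that originally followed the final output frame; the cost is that the first second or so of the clip is a blend of two moments in time, which reads fine for clouds, reeds and background walkers.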
-
WWW.VFXVOICE.COM
CG CAR CRASH MAKES A REAL IMPACT ON FARGO, SEASON 5
By OLIVER WEBB
Images courtesy of FX Network.

Nominated for 15 Primetime Emmy Awards, including Outstanding Limited or Anthology Series, the latest installment of Fargo takes place in Minnesota and North Dakota in 2019. Season 5 follows Dot (Juno Temple), a housewife whose mysterious past begins to haunt her after she gets in trouble with local authorities. Soon, Dot must find a way to protect herself and her family as the law begins to close in on her. Created by Noah Hawley, all five installments of Fargo stream on Hulu.

Sienna King as Scotty Lyon, left, and Juno Temple as Dorothy "Dot" Lyon in "The Tiger," Episode 5, Season 5 of Fargo. Mavericks VFX was the main vendor based on the fire work they completed for The Handmaid's Tale. (Photo: Michelle Faye)

"[O]ne of the characters is in a room and has mud all over his hand, and he puts his hand on the wall; it's blood/mud and he wipes it. The [print] they had on set felt like just a handprint. We couldn't get it right, so instead I put my hand in some of the fake blood. Then I did it a bunch of times and sent a screengrab to [Post Producer/Editor] Regis [Kimble]. He picked one, and that's the one that's in the show. Shooting the elements ourselves, just small little pieces, is a fun part of the job, and there is a ton of it in the show."
Brendan Taylor, President and VFX Supervisor, Mavericks VFX

Brendan Taylor, President and VFX Supervisor at Mavericks VFX, served as VFX Supervisor on Season 5. Taylor had worked closely with Executive Producers Kim Todd and Warren Littlefield on The Handmaid's Tale. "They were looking for someone to help them with Fargo, Season 5," Taylor explains. "I picked up Unreal Engine over the pandemic and had been using it on Handmaid's to help them plan things out. When Kim called and asked about Fargo, I was really excited, as I loved the Coen brothers and the movie Fargo as well as the series. It was honestly one of the most rigorous interview processes I've ever been through. It became clear early on that I wasn't going to be able to be out in Calgary the whole time, so I had to find a visual effects supervisor that I could partner with, who would be there for the on-set portion, and that was Jesse Kawzenuk. The only way I could do it was to get someone to go there, and then I'd be there for all the big shots. I would be there on the calls, reading the scripts through and talking to Jesse about approaches. He did a great job on the ground in Calgary."

Juno Temple as Dorothy "Dot" Lyon, left, and Sienna King as Scotty Lyon in "The Tragedy of the Commons," Episode 1, Season 5. The VFX in Fargo is used to complement rather than dominate. (Photo: Michelle Faye)

"One thing that was quite different on this one, and that became apparent as we started going into post, was the involvement of Regis Kimble, who was the Post Producer and Editor for Noah," Taylor adds. "I love his process. He will just do a screenshare in the edit as he reviews the shots, as opposed to sending notes out to us. There have been times when it's like two and a half hours of just looking at visual effects shots. He is so kind, his temps are amazing, and he is so understanding of the visual effects process that it makes for a great partnership. You look forward to those client calls every single day. The entire post team for the production were such friends of visual effects and so understanding. They knew if something came in late that there was a reason for it."
"The whole thing was really great."

Jesse Kawzenuk was first approached by Dana Gonzales (Producer, DOP and director). "He broke down the large VFX sequences and explained that they were looking for someone who could join the team as a creative and be able to offer collaborative solutions to problems that would arise throughout the shoot," Kawzenuk says. "Dana was very specific about bringing in someone who would inherit this passion for Fargo and be transparent about the creative choices throughout. Having been a big fan of the show and knowing that the bar is set quite high, I thought that this would be a very interesting yet challenging project to be a part of. We knew that there was going to be some relatively large-scale VFX fire work in the show." Mavericks VFX was to be the main vendor based on the fire work they completed for The Handmaid's Tale. "We had a base amount of artists on it. Mark Ferguson was our producer. I think we had 10 or 11 compositors on it and a couple of CG artists. Usually, we can get by with just that, but when a show comes in and we start bidding it, we usually need one or two more people, so we either pull from other shows within the company or bring on more people. The good thing was that, because of the communication, Regis or Christie would let us know that, say, Episode 7 is going to be quite heavy but Episode 8 will be a bit light. So, we would work that way. It's making sure that you are thinking far enough ahead into the future."

"It is all about the storytelling. You need to create a barrier so the bad guys can't come into the room, but you also need to create this ticking time bomb where the family needs to go out of the window, otherwise they'll catch on fire. You always try and do it practically on the day as much as possible, but at the end of the day you don't know how it's going to be cut together; you could have too much fire in one spot to tell the story. My approach with fire is always 'do it real.' It's really intensive and difficult to get right."
Brendan Taylor, President and VFX Supervisor, Mavericks VFX

Juno Temple as Dorothy "Dot" Lyon and Jon Hamm as Roy Tillman in "Blanket," Episode 8, Season 5 of Fargo. In many situations, the planned VFX shots weren't executed because the practical SFX led by Special Effects Supervisor David Benediktson was so strong. (Photo: Michelle Faye)

There were several challenging visual effects shots throughout the show. First, Taylor points to Episode 1. "There is a scene where they throw the ice on the ground. What they use in movies instead of ice is rubber, as ice is such a pain to clean up. Sometimes plastic is used instead, but in this particular case they used rubber as ice. The point is that it doesn't feel like ice, so when it slides across the floor it should slide, but it actually crumbled; it just felt like plastic cubes. One thing I personally love to do is shoot things, so we went and got a bag of ice. We have a greenscreen stage here, and we just tossed ice on the ground and replaced it. That was pretty fun."

Joe Keery as Gator Tillman and Jon Hamm as Roy Tillman in "The Paradox of Intermediate Transactions," Episode 3, Season 5. VFX played an important role in supporting the narrative by adding mystery and suspense to scenes with atmospherics such as fog. (Photo: Michelle Faye)

Lamorne Morris as Witt Farr in "Blanket," Episode 8, Season 5. For Visual Effects Supervisor Jesse Kawzenuk, the most challenging shots were all unplanned, such as when production shifted a scene from night to day and VFX was called upon to assist.
(Photo: Michelle Faye)

Sam Spruell as Ole Munch in "The Tragedy of the Commons," Episode 1, Season 5. The fire scene was handled practically, shooting with the actors, taking the actors out, then doing a bigger burn. EmberGen, fire-and-smoke software, was also employed. (Photos: Michelle Faye)

Another challenge for Taylor and the Mavericks team was finding the right amount of blood to depict to be as realistic as possible. "When Dot cuts the bottom of her feet on glass, we needed to track where she had been," Taylor says. "That one is so closely related to the storytelling. [The scene] needed to feel like she had been taking steps, and the more steps she takes, the more blood comes off, so the next step is going to have a little less blood. It was about finding the balance, but it's so nice to be working with someone like Regis in those situations where you can really just talk about it. We couldn't get the blood prints right. We had gallons of fake blood at the studio. I just poured some into a little painter's pan, put my foot in it and started walking around. I almost fell because the stuff is very slippery. That was one of the ways that we would tackle things. There is another part where one of the characters is in a room and has mud all over his hand, and he puts his hand on the wall; it's blood/mud and he wipes it. The [print] they had on set felt like just a handprint. We couldn't get it right, so instead I put my hand in some of the fake blood. Then I did it a bunch of times and sent a screengrab to Regis. He picked one, and that's the one that's in the show. Shooting the elements ourselves, just small little pieces, is a fun part of the job, and there is a ton of it in the show."

Juno Temple as Dorothy "Dot" Lyon and Lamorne Morris as Witt Farr in "The Tragedy of the Commons," Episode 1, Season 5. Visual Effects Supervisor Brendan Taylor and his Mavericks VFX team worked to find the right amount of blood to use to create as much realism as possible in capturing bloody footprints and handprints. (Photo: Michelle Faye)

"On a TV schedule, you can't be messing around with the look of the fire. That whole scene was around 60 shots, and it was really about getting everyone onboard with what the edit would look like and the story we're trying to tell. Thankfully, we have done so much of this fire in the past that we are actually pretty good at it now, so it was less about the look and more about making it the most impactful scene it can be."
Brendan Taylor, President and VFX Supervisor, Mavericks VFX

It was crucial that all the elements came together when dealing practically with the fire scene. "It is all about the storytelling," Taylor explains. "You need to create a barrier so the bad guys can't come into the room, but you also need to create this ticking time bomb where the family needs to go out of the window, otherwise they'll catch on fire. You always try and do it practically on the day as much as possible, but at the end of the day you don't know how it's going to be cut together, so you could have too much fire in one spot to tell the story. My approach with fire is always 'do it real.' It's really intensive and difficult to get right. On a TV schedule, you can't be messing around with the look of the fire. We try and shoot with the actors, take the actors out, then do a bigger burn there. We did use some EmberGen, which is a good piece of fire-and-smoke software."
"That whole scene was around 60 shots, and it was really about getting everyone onboard with what the edit would look like and the story we're trying to tell. We know we're trying to push them out of the window, but we can't do it too soon. It was all about pushing and pulling. Thankfully, we have done so much of this fire in the past that we are actually pretty good at it now, so it was less about the look and more about making it the most impactful scene it can be."

Juno Temple as Dorothy "Dot" Lyon in "Linda," Episode 7, Season 5. The VFX teams knew that Episode 7 would be VFX heavy, so the large VFX sequences were all broken down in advance. Originally, the sequence was going to be captured as a single shot but proved to be too much of a challenge. (Photo: Michelle Faye)

The car crash sequence in Episode 7 was one of the most arduous aspects of the show. "The car crash was a great example where we shot two versions, one of which would allow for a CG car to hit Dot," Kawzenuk says. "We shot a series of clean plates and did proper reference passes. [Visual Effects Supervisor] Cody Hernandez led his team to create this seamless sequence, and it looks incredible. It's a great example of how the VFX is used in Fargo to complement rather than dominate. The car crash sequence at the diner was necessary for timing and camera position as well, but mainly for the stunt department to see where and how to place the stunt double."

The car crash sequence at the diner in Episode 7, Season 5 was necessary for timing and camera position, but mainly for the stunt department to see where and how to place the stunt double. (Photo: Michelle Faye)

"We figured out well in advance where we were going to put the car and where we would put Dot so that the spins feel real. The kiss of death for us as visual effects artists is when we have to break physical reality. Everyone can spot it right away. We had a dummy car there that was actually hit by an 18-wheeler to give us that moment of impact, and it actually turned a little bit, but it didn't do a full 360. So we knew we would have to do it in CG."
Brendan Taylor, President and VFX Supervisor, Mavericks VFX

Originally, the car crash was going to be captured as a single shot but proved to be too much of a challenge. "You are a slave to physics at that point," Taylor adds. "They didn't end up doing it as a oner because there was too much information in the single shot that we needed to take in, so it did end up being in cuts. We did a lot of previs in Unreal, and it was great because Noah was on the call. It was all about how many spins, how many rotations and how fast it was all going to go. We sort of figured out well in advance where we were going to put the car and where we would put Dot so that the spins [of the car] feel real. The kiss of death for us as visual effects artists is when we have to break physical reality. Everyone can spot it right away. We had a dummy car there that was actually hit by an 18-wheeler to give us that moment of impact, and it actually turned a little bit, but it didn't do a full 360. So we knew we would have to do it in CG. We ended up building the entire model of the Dodge Charger and replacing the one that was hit. Then we did effects simulations for the explosion part to match the practical, and we did all the animation. It took a long time to get it right. The hardest part was not the initial impact or the spin, but the moment when it hits Dot. There were lots of revisions, but finally we got there."
"Another really difficult part of the sequence was the snow, as snow is really hard to get right. It had this clumpiness on the ground, and if you make it too clumpy it feels like foam, and if you make it not clumpy enough it feels like sand. So it is about finding that sweet spot. I think we were overdoing the snow quite a lot because it looked good but didn't feel real. The actual effects simulations that are in there are quite subtle but really effective. That's what really sells the shot. You have a layer of snow on top of the car that goes up in the air as it turns, which is really nice. I think they're the two best shots in the series and some of the best that we have done at the company." (A toy illustration of that clumpiness dial appears below.)

Juno Temple as Dorothy "Dot" Lyon in "Linda," Episode 7, Season 5. The VFX team ended up building the entire model of the Dodge Charger, replacing the one that was hit. Two versions of the car crash were shot, one of which would allow for a CG car to hit Dot. The VFX team did previs in Unreal Engine. (Photo: Michelle Faye)

For Kawzenuk, the most challenging shots were all unplanned. "In Episodes 9 and 10, Dot is sneaking around the Tillman Ranch, trying to move around unseen," Kawzenuk details. "Production made a schedule change that shifted scenes originally scripted as night to take place during the day. Once the sequences were cut together, it was clear the lack of suspense and mystery was hurting the narrative. We looked at what VFX could do to assist and decided that adding fog to each shot would sell the suspense. We shot the whole sequence over a 10-day span, and the weather changed drastically from overcast to full sun, to snow, to sometimes real fog. Getting the consistency without altering the original plate too much was a real challenge. Trying to layer the fog and create a series of large volumes moving across these open plains was difficult. The Editor, Regis Kimble, has such a strong eye and really guided the look of these scenes right from the editing bay. I think that Tom Turnbull and his team at Rocket Science VFX really executed these episodes effectively."

Kawzenuk concludes, "The most enjoyable part of being on Fargo was being a part of this family. This was my first time working in Calgary, and the crew there is phenomenal, a hard-working team that really wants to provide the resources required to make the VFX work effectively. In many situations, the planned VFX shots went away because the practical SFX led by David Benediktson was so strong. I think Fargo continues to be one of the best television series. Having spent over a year working on the show, from pre-production planning right through to post-production, has made me a much stronger VFX supervisor."
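Taylor's "foam versus sand" description of snow clumpiness maps neatly onto a single blend parameter between large-scale noise and per-pixel grain. The sketch below is purely illustrative, a numpy toy rather than anything from the Mavericks simulation pipeline, and all names and thresholds are invented.

```python
import numpy as np

def value_noise(shape, cell, rng):
    # Coarse random lattice, bilinearly interpolated up to full resolution.
    gh, gw = shape[0] // cell + 2, shape[1] // cell + 2
    lattice = rng.random((gh, gw))
    ys, xs = np.arange(shape[0]) / cell, np.arange(shape[1]) / cell
    y0, x0 = ys.astype(int), xs.astype(int)
    fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
    a = lattice[np.ix_(y0, x0)]
    b = lattice[np.ix_(y0, x0 + 1)]
    c = lattice[np.ix_(y0 + 1, x0)]
    d = lattice[np.ix_(y0 + 1, x0 + 1)]
    return (a * (1 - fx) + b * fx) * (1 - fy) + (c * (1 - fx) + d * fx) * fy

def snow_mask(shape=(256, 256), clump_size=24, clumpiness=0.5, seed=7):
    """clumpiness=1.0 gives big connected blobs (reads as foam);
    clumpiness=0.0 gives uncorrelated grain (reads as sand)."""
    rng = np.random.default_rng(seed)
    blobs = value_noise(shape, clump_size, rng)   # large, soft structure
    grain = rng.random(shape)                     # fine, per-pixel grain
    mix = clumpiness * blobs + (1 - clumpiness) * grain
    return (mix > 0.5).astype(np.float32)
```

The "sweet spot" Taylor describes is, in this toy framing, just a mid-range value of the blend, which is why it took iteration to find: the extremes are obvious, the middle is taste.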
-
WWW.VFXVOICE.COM
ALIENS EARN UPGRADE, NEW YORK MOVES TO LONDON FOR A QUIET PLACE: DAY ONE
By OLIVER WEBB
Images courtesy of Paramount Pictures.

A Quiet Place: Day One is the third installment in the Quiet Place franchise and serves as a prequel to the events of A Quiet Place. Day One was directed and co-written by Michael Sarnoski (Pig), while John Krasinski (IF) returns as co-writer and producer for Paramount. Set in New York City, Day One follows Sam (portrayed by Lupita Nyong'o), a terminally ill woman on an outing to Manhattan. After the city comes under attack from an alien invasion, Sam, along with other survivors, must navigate a way to safety. It soon becomes apparent that the aliens are drawn to even the slightest of sounds.

The bulk of the footage was captured in Manhattan. A helicopter unit shot aerial plates around the city to provide more scope and authenticity. Production then shifted to London.

ILM's Malcolm Humphreys (The Batman), who served as Visual Effects Supervisor on Day One, was familiar with the previous two films in the franchise, as both were ILM projects. "They were supervised by Scott Farrar, who is a huge character at ILM, so it was a bit daunting in terms of stepping into that [history]," Humphreys says. "The first couple of films were smaller budget, and there were a smaller number of creature shots, but they grew between the sequels. For Day One, there are aspects of the creatures that we've seen before that are definitely already established. For this film, we see them on a larger scale, and a larger number of them."

"For Day One, there are aspects of the creatures that we've seen before that are definitely already established. For this film, we see them on a larger scale and a larger number of them. [T]here is a new mother-size creature that we developed, which has a whole different scale, size and weight, and the texture of that creature is very different. We also created little baby creatures that eat some of these strange mushrooms that they open up during the film."
Malcolm Humphreys, Visual Effects Supervisor, ILM

The VFX team crafted fresh details to enhance the alien Death Angels and fleshed out how the aliens interacted with buildings and cars, and operated with a hive mentality.

Humphreys and his team inherited some of the creatures from the previous films, but the base-level creature required a revision as well as some additional sculpting and detail work. "There is a scene that's inside a construction site, and there is a new mother-size creature that we developed, which has a whole different scale, size and weight, and the texture of that creature is very different. We also created little baby creatures that eat some of these strange mushrooms that they open up during the film," Humphreys details.

Data capture from Chinatown rooftops provided reference for the built-out bridge. Reference documenting the demolition of a bridge helped capture the huge explosion and displacement of water as the bridge collapsed.

Humphreys and his team were given a lot of creative freedom when it came to the shot design of the film. Updating the creatures' performance was one of the most important goals to achieve. "If you watch the first two films, the creatures sort of exist in these two different states. The first is snatch and grab, which is super-fast and you can't really see what's going on, or they are moving very slowly," Humphreys notes.
"In this film, one area we fleshed out a little bit more was when you are seeing them in daylight for a lot longer, and how they work together in a herd-like manner towards a common goal was quite enjoyable to explore."

"If you watch the first two films, the creatures sort of exist in these two different states: The first is snatch and grab, which is super-fast and you can't really see what's going on, or they are moving very slowly. In this film, one area we fleshed out a little bit more was when you are seeing them in daylight for a lot longer, and how they work together in a herd-like manner towards a common goal was quite enjoyable to explore."
Malcolm Humphreys, Visual Effects Supervisor, ILM

Lupita Nyong'o (Us, 12 Years a Slave) stars as Samira, a terminally ill woman caught up in the alien invasion of Manhattan. An important atmospheric element was the massive dust coating caused by meteorites destroying buildings.

Although Day One takes place throughout New York City, the majority of the film was shot in London. Shooting in New York, especially for the amount of augmentation that was needed, proved to be impractical. "There's a whole load of visual effects work we were doing to make that possible, by doing set extensions, for example," Humphreys explains. "I love doing creature work, but then there's also bits of every other type of visual effect in the film as well. It was shot primarily at the backlot at Warner Bros. Leavesden Studios, just outside of London, and some additional location work around London. We were shooting locations and then augmenting them to look like New York. There was a mixture of some drama photography done in New York, a minimal shoot, and some additional plate photography, then quite a large data capture of different neighborhoods in New York. The story is essentially a journey from the tip of New York in Chinatown all the way up through the neighborhoods and into Harlem. That's quite a large distance, and the architecture and geography change quite dramatically between those locations. Keeping the look of New York is an added challenge on a project like this, and that was particularly difficult to achieve but was ultimately a rewarding process. I was very lucky, as we went and did the data capture on my first day in New York, partly just to immerse myself in the story. I actually walked from Chinatown all the way to Harlem, which is quite a long distance, about a four-hour walk. That helped me really engross myself in the geography and the feel of the city as it changes. That in itself was invaluable research."

The architecture and geography between Chinatown and Harlem change dramatically. There was a large data capture of different neighborhoods in New York.

"It was shot primarily at the backlot at Warner Bros. Leavesden Studios, just outside of London, and shooting locations [around London], then augmenting them to look like New York. I was very lucky, as we went and did the data capture on my first day in New York, partly just to immerse myself in the story. I actually walked from Chinatown all the way to Harlem, which is quite a long distance, about a four-hour walk. That helped me really engross myself in the geography and the feel of the city as it changes. That in itself was invaluable research."
Malcolm Humphreys, Visual Effects Supervisor, ILM

London was the primary hub for the visual effects work. "We had multiple sites working on it," Humphreys adds. "We also had teams in Vancouver, San Francisco and Mumbai."
"Most of the work that we did, and the way I like to work, is to segment the work into logical chunks that we could give to each of the offices so that they have a little bit more autonomy and ownership of the work. It's not always possible, but it does help make it a bit more manageable and enjoyable for everyone."

Meteorites striking Manhattan in the daytime precede the alien invasion. There was a mixture of drama photography done in New York, a minimal shoot and some additional plate photography.

Overall, the film consisted of roughly 450 visual effects shots. "It was a nearly perfect job for me, in the sense that there was a lot of 2D work, creature work, environment work, and there are some really quite large-frame oners," Humphreys details. "They were just an undertaking of finding the right artists and team to work on those shots because they run for so long. They are multiple shots tied together. All the shots we worked on were challenging in their own different ways. The last sequence in the film was quite a technical challenge. That was made up of multiple locations, including an airfield at Bovingdon, a location on the River Thames on a pier, then a boat that was docked, and some additional tank work. Our job was to stitch all those locations together so that Joseph and Lupita's characters could smoothly move from one location to the next. Then we also needed to make it feel like it's all taking place in New York. That was quite a technical challenge, but it was quite enjoyable to do. There were some really enjoyable shots, like one shot on the Upper East Side, for example. The backlot that we had was quite successful for some of the work, but when we got to the shots where we wanted to make it look quite different, it was trying to make the backlot look like a whole different environment or a different part of New York, and that was quite satisfying."

"The last sequence in the film was quite a technical challenge. That was made up of multiple locations, including an airfield at Bovingdon, a location on the River Thames on a pier, then a boat that was docked and some additional tank work. Our job was to stitch all those locations together so that Joseph and Lupita's characters could smoothly move from one location to the next. Then we also needed to make it feel like it's all taking place in New York. That was quite a technical challenge."
Malcolm Humphreys, Visual Effects Supervisor, ILM

A Manhattan crowd scene moments before the aliens attack. Manhattan footage was mixed with the backlot shoots at Warner Bros. Leavesden Studios outside London, where New York sets and extensions were created.

A new mother-size creature was developed with dramatically different scale, size, weight and texture.

Although Day One takes place throughout New York City, the majority of the film was shot in London.

A major action sequence takes place in a submerged subway where a large underwater gimbal allowed the platform to be lowered.

The creatures, their numbers, movements and mentality were updated for Day One, and some of their textures differed from the previous films.

The film was shot primarily at the backlot at Warner Bros. Leavesden Studios outside London and at various locations around London.

Dust and ash float through the streets of Manhattan.
Keeping the look of New York was an added challenge for VFX Supervisor Malcolm Humphreys and his team in London.

A large amount of physical debris was needed, and more belongings like bags and shirts were added in post.

Humphreys and his team were given a lot of creative freedom in terms of shot design, and one of the main elements was fleshing out the creature movements in more detail.

Having the creatures flip over cars in New York City was one of the creative choices that Humphreys and his team employed. Most of the visual effects work consisted of 2D, creature and environment work and some large-frame oners.

Roughly 450 visual effects shots were created for the film. ILM was the primary vendor along with Important Looking Pirates, Proof, Cadence Effects and an in-house team.

Out from the shadows where they lurked in the previous films, the creatures in Day One hunt openly in daytime.

In Day One, the creatures work together in a herd-like manner towards a common goal.

VFX Supervisor Malcolm Humphreys walked from Chinatown all the way to Harlem to immerse himself in the story and to get an accurate feel for the geography of New York City.

Humphreys kept a close, running dialogue with the crew throughout the production, which was key on a project of this scale. "It was a tight-knit group of us discussing how we were going to achieve certain shots," he says. "Then going into post with Michael and building a really strong relationship with him was quite enjoyable. Michael's fantastic to work with because he gives you enough room to offer up ideas and help add to his vision."
-
WWW.VFXVOICE.COM
THE VIEW TURNS 25: A SPECTACULAR CELEBRATION OF VFX MASTERY AND CREATIVITY
By JIM McCULLAUGH
Images courtesy of the VIEW Conference, except where noted.

Mark your calendars for October 14-19, 2024, as the VIEW Conference in Turin, Italy, celebrates its 25th anniversary. This international symposium on Animation, VFX, Games, AI, AR, VR, XR, MR, Virtual Production, Metaverse, Storytelling and Immersive Media promises to be a landmark event.

VIEW CEO and Executive Director Maria Elena Gutierrez and VFX Supervisor Kyle Odermatt (Raya and the Last Dragon, Moana, Wish) onstage at the VIEW.

"Over the past quarter-century, the VIEW Conference has welcomed literally thousands of people into a supportive global family of creatives, innovators and visionaries," says CEO and Executive Director Maria Elena Gutierrez. "We've provided them with opportunities to learn from the industry's best in digital media. Our mission has always been to inspire, educate and innovate, all within an atmosphere of love, support and inclusivity. We're not just a conference; we're a community."

"The incredible quality of our speakers means our content is always of the highest standard."
Maria Elena Gutierrez, CEO and Executive Director, VIEW Conference

Reflecting on the journey, Gutierrez muses, "It seems like only yesterday when it all started; those 25 years went like a dream! In the early days, our biggest challenge was putting the conference on the map. It was tough getting top talent to make the time commitment necessary to visit Italy, a country not known for its animation, VFX and games industry."

German film score composer and music producer Hans Zimmer wows the crowd at the VIEW.

"Today, the landscape has transformed. I regularly get calls from filmmakers working on their latest projects, asking if they can present at VIEW Conference when the movie's finished! When they come to Turin, they stay for the entire week; remember, these are people whose time is so precious that it's usually hard even to get 15 minutes with them. I believe this happens because we've succeeded in creating an experience that people genuinely love and want to prioritize."

"[The VIEW Conference] creates an environment where top-level speakers challenge themselves to deliver more. As a result, the VIEW Conference program is always gold standard, frequently filled with exclusives, and occasionally delivers a stunning surprise."
Maria Elena Gutierrez, CEO and Executive Director, VIEW Conference

Gutierrez attributes the conference's success to its unwavering commitment to quality and its unique approach. "We don't just string presentations together," she explains. "We spend many months curating a meticulous program of keynotes, talks, workshops and masterclasses designed to inspire and teach both professionals and students. This is supported by a year-round program of free online sessions, such as panels that bring together awards contenders in the run-up to the Oscars. The incredible quality of our speakers means our content is always of the highest standard."

American cartoonist ND Stevenson, animator Troy Quane, animator Nick Bruno and Ted Ty of DNEG Animation at the VIEW.

"The VIEW Conference goes beyond delivering top-notch content. It creates an environment where these top-level speakers challenge themselves to deliver more. As a result, the VIEW Conference program is always gold standard, frequently filled with exclusives, and occasionally delivers a stunning surprise. Also, it has to be said that the timing is good."
"VIEW Conference takes place just as the industry is gearing up for awards season, making us the perfect showcase for the leading contenders to present their achievements. And it doesn't hurt that Turin in October is just so beautiful."

Jeff Rowe, American writer and director, brings the VIEW audience into the world of Teenage Mutant Ninja Turtles: Mutant Mayhem.

She adds, "Hans Zimmer also blew me away with a rousing retrospective on his amazing career. Tom McGrath and Mireille Soria moved me deeply with their highly personal talks about the influence their families have had on their work. Peter Sohn and Doug Chiang wowed us with their dazzling creativity while at the same time speaking eloquently about their experiences growing up in immigrant families in the U.S. And I have a special place in my heart for the extraordinary hour I spent at VIEW Conference 2022, when Rob Minkoff, director of the original The Lion King, took to the stage with his ukulele and treated us all to a never-seen-before musical journey through his experience making one of the best-loved animated features of all time. If I'm allowed a shameless plug, you can enjoy that session and more through our extensive online archive of recordings; just visit the VIEW Conference website."

Gretchen Libby, Director of Visual Computing at Amazon Web Services, addresses VIEW attendees.

Shannon Tindle, American animator and storyboard artist, and Peter Ramsey, American illustrator and storyboard artist, share insights with the VIEW audience.

Christina Heller, film producer known for Trolls, Hotel Transylvania 3 and Over the Moon, at the VIEW.

French cartoonist and animator Benjamin Renner.

Peter Ramsey, American illustrator and storyboard artist, and Portuguese-American animator and storyboard artist Joaquim Dos Santos.

Doug Chiang, Senior Vice President and Executive Design Director at Lucasfilm, packs the VIEW auditorium.

Danielle Feinberg, director of photography and visual effects supervisor, at the VIEW.

German-American animator Andreas Deja.

Director Chris Sanders will be at VIEW 2024 to discuss The Wild Robot along with Production Designer Raymond Zibach and VFX Supervisor Jeff Budsberg. (Image courtesy of DreamWorks Animation and Universal Pictures)

Shannon Tindle, director of Ultraman: Rising, will be a featured speaker at VIEW 2024. (Image courtesy of Netflix)

"[W]e've succeeded in creating an experience that people genuinely love and want to prioritize."
Maria Elena Gutierrez, CEO and Executive Director, VIEW Conference

Looking ahead to this October's program highlights, Gutierrez remarks, "There simply isn't enough time to list them all! Suffice it to say, VIEW Conference 2024 promises a unique lineup of award-winning creatives discussing their latest projects alongside top-level Silicon Valley innovators. Live on stage, we'll have the directors of this year's big animated features, Inside Out 2, Ultraman: Rising and The Wild Robot, to name but three, but as always, we go deeper. For example, in addition to speaking about his movie, Josh Cooley, director of Transformers One, will also be teaching a special masterclass on story, designed to enlighten and inspire professionals and students alike while passing on a wealth of practical advice and insight. We dive deep with VFX, too. Our exploration of Dune: Part Two boasts no fewer than five presenters, including triple-Oscar-winning Production VFX Supervisor Paul Lambert."
"We're also proud to welcome the VFX supervisors of big movies like Twisters and Ghostbusters: Frozen Empire and hit streaming series like Fallout and 3 Body Problem."

Inside Out 2's director Kelsey Mann will be a speaker at VIEW 2024. (Image courtesy of Pixar/Disney)

Production VFX Supervisor Paul Lambert and second unit VFX Supervisor Patrick Heinen will be speaking about Dune: Part Two at VIEW 2024. (Image courtesy of Warner Bros. Pictures)

Jay Worth, Overall VFX Supervisor of Fallout, will be a featured speaker at VIEW 2024. (Image courtesy of Amazon Prime Video)

Stefen Fangmeier, Production VFX Supervisor of 3 Body Problem, will be a featured VIEW 2024 speaker. (Image courtesy of Netflix)

Twisters VFX Supervisor Florian Witzel will be a featured speaker at VIEW 2024. (Photo: Melinda Sue Gordon. Courtesy of Universal Pictures)

Jason Greenblum, VFX Supervisor for Ghostbusters: Frozen Empire, will be a featured speaker at VIEW 2024. (Photo: Jaap Buitendijk. Courtesy of Columbia Pictures/Sony)

Josh Cooley, director of Transformers One, will be speaking about the film and also teaching a special masterclass on story at VIEW 2024. (Image courtesy of Paramount Pictures)

Additionally, she continues, "We offer a comprehensive range of sessions on game design and animated short films, not to mention special tracks dedicated to business, generative AI and education, and you'll quickly see that our 25th anniversary edition will be one to remember."

For more information, visit the VIEW website at viewconference.it.
-
WWW.VFXVOICE.COM
FILMMAKER PABLO BERGER MAY NEVER STOP HAVING ROBOT DREAMS
By TREVOR HOGG
Images courtesy of Arcadia Motion Pictures, Lokiz Films, Noodles Production and Les Films du Worso.

It was a pleasant surprise when the Spanish-French co-production Robot Dreams, which deals with themes of loneliness, companionship and people growing apart without a word of dialogue, was nominated for Best Animated Feature at this year's Academy Awards. Masterminding the adaptation of the graphic novel by Sara Varon was live-action filmmaker Pablo Berger (Blancanieves), who decided to try the world of animation and in the process came up with some imaginative dream transitions. One transition in particular occurs when the Coney Island-stranded Robot is covered by snow, falls out of the frame, gathers himself and turns the image around to reveal a Hollywood set that resembles something out of The Wizard of Oz.

A background of Coney Island where a major narrative turning point occurs.

"That is also one of my favorite scenes," Berger laughs. "Jean-Claude Carrière, my favorite writer of all time, once said, 'There's only one rule in cinema. You have to surprise the audience.' I like the idea that whenever I write the script it's almost like a 'Weeha!' and I let myself go. In Robot Dreams, I brought a lot of references and inspirations, from The Wizard of Oz to Buster Keaton and Busby Berkeley. I mix it all together. One of my favorite animators, Osamu Tezuka, is the king of breaking the fourth wall, and it's also a homage to him." The aspect ratio had to be taken into consideration to effectively accomplish the visual gag. "We were thinking about the previous shot so we could trick the audience. It should feel like a surprise, so we had to reduce the aspect ratio to be able to do that surprising effect," Berger remarks.

"I love the idea in animation that with just two circles and one dot you have an eye, and with a single line you have a mouth, and how many things you can express with just one line. And how, with the mouth, Robot can teach the bird to fly; we worked a lot on that one. In the first storyboard, he did it with the eyelids going up and down, but then we realized it wasn't so clear. Suddenly, we decided to do it with the mouth."
Pablo Berger, Director

Storyboards are a major part of the filmmaking process. "Even for my live-action films, I spend a year doing the storyboards. Especially when you make an animated film, the storyboard is the treasure map," Berger explains. "Every single shot, the angle, size and duration are already planned. In a way, before the animation started, we had the film already finished. We made it with a small team, five people working for a year and a half, to make a rough animation. Every shot is clear. Of course, then you can make small changes when you begin to do the animation." The source material broke the fourth wall in a way that worked within the confines of a comic book panel. Notes Berger, "In the graphic novel you just see Robot getting out of the square, but it gave me an idea that I wrote into the script. When I started doing the storyboards, I had to decide where to put the camera for every single shot. Once you get animators and animation directors, they can bring you things. Animators are like actors and can propose certain actions."

Research went into depicting areas of New York City during the 1980s, such as Chinatown.

Grunts and groans replaced actual lines of dialogue by the characters.
I had great support from the voice director, who was also the voice of Dog, Ivan Labanda, Berger explains. All of the big Hollywood films that have young characters are dubbed by him. For us, it was important to have a little giggle or laugh. It was a different kind of acting, but in our case it was pleasurable. I was not alone doing this. Ivan cast a group of his friends to do it, which became a little ensemble. With six actors, we created all of the voices, grunts, laughs, screams and little noises. For example, during the sledge race in the snow, the Anteaters had to make specific noises. It was definitely hard work, but we also had a fun time doing it.

Nuances, such as cheese from the macaroni splattering against the glass window of the microwave, had to be meticulously orchestrated yet feel accidental. You have to realize, I'm a live-action director, Berger notes. If directors have one superpower, we can close our eyes and see our film finished. However, in live-action, there are so many accidents that the film changes completely from what you were thinking about. In animation, the final film comes close to the dream, so I can feel my fingerprints in every single frame. Of course, I'm working with a big team of artists, but I was involved with every single detail that happens because, what you have to realize is, first you do the storyboard followed by layout, rough animation and then final animation; there are so many steps that you can make small adjustments.

There are dream sequences in the film, with this being the background for the Flowerland, which resembles something out of The Wizard of Oz.

An interesting creative choice was not to give Robot mechanical actions. The trending topic nowadays is AI and robots, but we started this film over five years ago, Berger observes. This robot is a metaphor for a friend, lover or companion. It doesn't have any chip inside. There is no software. It's like gears, wheels, bolts and springs. We wanted to do an old-style robot from the 1950s. It was also inspired by Hayao Miyazaki's Laputa: Castle in the Sky. Introducing motion meant that some of the character designs had to be altered. Daniel Fernandez designed the characters, but then it was important when Animation Director Benoît Froumont got involved. Initially, Robot was much rounder in the character design, but then early on Benoît said, We need more rough edges so we can make it more 360. The designs at the end had to be approved by the animation director because he's the one who is going to have to animate.

The interior of a Radio Shack where robot companions are on display.

One of the clever moments is when Robot teaches a chick to fly through the motions of his mouth. I love the idea in animation that with just two circles and one dot you have an eye and with a single line you have a mouth, and how many things you can express with just one line, Berger reveals. And how, with the mouth, Robot can teach the bird to fly; we worked a lot on that one. In the first storyboard, he did it with the eyelids going up and down, but then we realized it wasn't so clear. Suddenly, we decided to do it with the mouth. It's a collaboration. The director gets the credit of the film, A Film by Pablo Berger; however, I had Benoît Froumont, Sylvain Chomet [French writer and animator], Tomm Moore [Co-Founder of Cartoon Saloon] and Art Director José Luis Ágreda, who has made amazing films.
You feel like you have good players next to you, and they helped me in making a lot of the decisions.

[T]here are so few animators in the world nowadays who know how to draw and create in 2D. Most animated films are using 3D. I also believe that when you want to create something new, you have to look to the past. I don't think you always have to look forward. Hand-drawn animation still has so much to offer to animation. We can talk about the Spider-Verse animated movies, which use 3D, but they're doing an old style so that it looks like 2D. Why not?
Pablo Berger, Director

2D animation was chosen over a 3D style. As a kid growing up in the 1960s and 1970s, everything was hand-drawn animation, Berger states. Robot Dreams takes place in the 1980s and is based on a graphic novel. For me, it was important that when people watch the film, they have to feel like it's a graphic novel that is alive. We did not have any doubts thinking about 2D or 3D. Finding a team was definitely much more complex for 2D because there are so few animators in the world nowadays who know how to draw and create in 2D. Most animated films are using 3D. I also believe that when you want to create something new, you have to look to the past. I don't think you always have to look forward. Hand-drawn animation still has so much to offer to animation. We can talk about the Spider-Verse animated movies, which use 3D, but they're doing an old style so that it looks like 2D. Why not?

A color script for the sequences when Dog receives his robot companion delivery and when he and Robot do a rollerblade dance in Central Park.

In the first draft of the script, it was already September [September by Earth, Wind & Fire] because I needed a song that Dog and Robot could roller-dance to in Central Park. The producers were not happy in the beginning because music rights are expensive, and if you want one of the most popular songs in pop history, it is an expensive song. But it was worth every penny that we spent. Definitely, the music was key. When you don't have dialogue, music becomes the voice of the characters.
Pablo Berger, Director

The growing popularity of adult animation has led to a hybrid approach to animation. There is this amazing software called Blender, and you can combine it with 2D, Berger remarks. You can get the best of both worlds. One of my favorite animated films in recent years is I Lost My Body, which has 2D and 3D, but you don't notice it. Whatever works to tell the story. In order to get the television screen to illuminate the face of Dog, some digital compositing was needed. The good thing about doing a 2D hand-drawn film now is you can use digital compositing software and different ways to apply color correction. You use visual effects elements in a 2D world because in the end you're doing compositing as well.

Exploring the different times of day for street exteriors.

Along with making the transition to animation, Berger had to set up a pop-up studio called Lokiz Films, which had 100 animators working in Madrid and Pamplona. We were going to make this film with Cartoon Saloon, and suddenly we were not, Berger recalls. And it's my first animated film. We had to make this big decision. I thought I was Mickey Rooney and Judy Garland, Let's put on a show! Let's make a pop-up animation studio. That was the hardest thing. We looked for animators all over Europe. It was the time of COVID-19. We didn't want to work remotely. It was mandatory that the animators had to come to Madrid or Pamplona.
It was absolutely hell!

Also, a pipeline was being created from scratch. The first time I heard the word pipeline was in animation because we don't use that term in live-action, Berger reveals. Pipeline? What is a pipeline!? Someone explained to me the animation pipeline, and I went, But this doesn't work for me! It was hard work to create a pipeline that worked for me. For me, the scenes had to be truthful, with good acting, and I had to see them in continuity. I couldn't think about a shot; in animation and visual effects we talk about the shot. For me, it had to be beats of 30 seconds or one minute to approve.

Examining what the interior of Dog's apartment will look like under various lighting conditions.

Not having to convert the methodologies of an established animation studio was beneficial. At the beginning we thought it could be suicidal to create a studio, but after three or four months of fighting we realized, This is working. We're doing things our own way, Berger states. We created our own pipeline. Although I love Cartoon Saloon and have been in Kilkenny, I know if we had done Robot Dreams with them, it would have had a Cartoon Saloon-like style, which is great, but this gave it a unique personality; we created our own style of animation. Benoît Froumont worked on The Triplets of Belleville as an animation director, so the spirit of Sylvain Chomet is in the film. And Benoît also worked with Tomm Moore on The Secret of Kells, so he knew Cartoon Saloon as well. Definitely, Benoît brought a lot of knowledge, and his style is in the film.

Central throughout is the song September by Earth, Wind & Fire. In the first draft of the script, it was already September because I needed a song that Dog and Robot could roller-dance to in Central Park, Berger explains. Then I realized, what if this song becomes the theme of the film? It was great to have that decision so early on. The producers were not happy in the beginning because music rights are expensive, and if you want one of the most popular songs in pop history, it is an expensive song. But it was worth every penny that we spent. My producers spoiled me and were able to put it into the budget. Definitely, the music was key. Not only the diegetic music like September, but also Alfonso de Vilallonga [Composer] created a beautiful, jazzy original soundtrack with strong piano melodies. I could say that Robot Dreams is musical. When you don't have dialogue, music becomes the voice of the characters.

Conceptualizing the anatomy and shape language for Dog.

New York City in the 1980s is a principal cast member. Without a doubt, New York is the third protagonist of the film next to Robot and Dog, Berger remarks. But in the graphic novel, New York is not there. It's just an American city. For me, to make New York a protagonist was key. I did a master's in film at NYU, then continued to live in New York for 10 years. This was my love letter to New York, and you cannot have New York without New Yorkers! What better metaphor than to have a jungle of animals representing all New Yorkers. Sometimes with animated films, the characters in the background don't move, but for me every single character in the background had to be doing something and dressed in a specific way. They have to have personality. That's a good thing about animation. You can watch an animated film many times because they're so rich in details.
For people who love Robot Dreams, it's worth seeing it a second time just to look at the characters in the background!

There are some dark moments, with Robot getting a leg chopped off and his head severed. I realized early on in the process that although it was an animated film, this was no different from my live-action films, Berger states. Instead of using a camera, I was using artists who were drawing. The tone connects with my previous films. All of my films are tragic comedies; there is humor but also a lot of drama. I love that my films are filled with emotion. I love to think that Robot Dreams is a roller-coaster ride of emotions. Like in real life, there is drama. I hope that Robot Dreams will move people in many deep ways and they can have an emotional connection with the story. Drama is a big part of Robot Dreams.

Experimenting with a selection of poses for Robot and Dog for a variety of scenarios.

Inspiring the visuals was the source material. The graphic novel by Sara Varon has what is called ligne claire in French, which means clear lines, Berger notes. It's a style that comes from French and Belgian comic books. The colors are solid and the palette is stylized. It was important to get the New York colors, the buildings and streets, and the colors of the cars. It was also inspired by the research to have a 1980s feeling in the film. It had to be like candy. The film had to be very appealing visually, so the color is a big protagonist as well. Close-up shots are part of the visual language. I made a film called Blancanieves, which didn't have dialogue. When you have to tell the story with images, you need numerous close-ups. Robot Dreams is filled with close-ups and dollies, but what you need is a lot of shots to make the film feel alive. It's a combination of how I make films and the needs of a dialogue-free film.

Doing a version of a 2D turntable for Robot.

A critical component in making the shots feel alive is the New York characters that populate the background.

One of the more complex shots is Dog and Robot walking down the street with various New Yorkers going about their daily lives.

Without a doubt, New York is the third protagonist of the film next to Robot and Dog. But in the graphic novel, New York is not there. It's just an American city. For me, to make New York a protagonist was key. I did a master's in film at NYU, then continued to live in New York for 10 years. This was my love letter to New York, and you cannot have New York without New Yorkers! What better metaphor than to have a jungle of animals representing all New Yorkers.
Pablo Berger, Director

One of the most complex shots is the first time that Robot and Dog walk on the street together, Berger states. There is a long wide shot that is almost 10 seconds of traveling, and you see Robot and Dog, the Boar dancing; there are so many characters at the same time, and we have cars. Also, to find how Robot walks was one of the first things we did. Maybe the next time, if I do a live-action film, I would even do an animatic, a moving storyboard.

Originally, the animation was going to be done by Cartoon Saloon, but director Pablo Berger had to create a pop-up studio called Lokiz Films.

Great attention was paid to the timing of the eye movements to achieve the proper comedic and dramatic tone.

Dog takes a break while assembling his newly-acquired robot companion.

Robot Dreams was not made for a specific demographic.
It's for cinephiles, Berger notes, for people who want to go to the movies every weekend and have a fun time, and for kids. It's a film that does not exclude audience members. It's made of different layers, and every type of audience will get a different layer. That is what made this film unique for me. Being an Academy Award nominee was something to treasure. It was a great adventure. It was fantastic to go to the Oscars and meet other fellow directors. I even got to meet the great Steven Spielberg. I chatted with him. That was my Oscar!
-
LIGHTWHIPS, DAGGERS AND SPACESHIPS: REFRESHING THE STAR WARS UNIVERSE FOR THE ACOLYTE
By TREVOR HOGG
Images courtesy of Lucasfilm Ltd. and Disney+.

When Jedi associated with a tragic incident that took place in a witch coven situated on Brendok are assassinated, a Jedi Master is drawn into the mystery. When it is suggested that his former padawan is responsible for the killings, the Jedi Master's investigation uncovers a much larger Sith conspiracy. A force to be reckoned with on the High Republic Era set of The Acolyte, which occurs a century before the Star Wars prequel trilogy, is creator, executive producer, showrunner, director and writer Leslye Headland. Consisting of eight episodes, the Lucasfilm and Disney+ production has Julian Foddy moving from being a visual effects supervisor at ILM to working client side, where he was responsible for digitally augmenting 2,800 shots.

I've done quite a lot of big Marvel films and various other Star Wars projects where they had a lot more reliance on CG content. [Director] Leslye Headland wanted to try to keep everything real and practical. The mantra was, What would George Lucas have done when he was making the original trilogy? Creatures and sets should be real. I had to approach the project with the view to minimizing the number of visual effects in most shots.
Julian Foddy, Production Visual Effects Supervisor

Approximately 2,800 visual effects shots were produced for the eight episodes by ILM, Rising Sun Pictures, Luma Pictures, Hybride, beloFX and Outpost VFX, with previs created by Proof.

This is the first show that I've done where I was the overall Production Visual Effects Supervisor, so straight away it was a different experience, Foddy states. I've spent a lot of time on set for various projects before but have never been involved quite as early as I was on The Acolyte. As the Production Visual Effects Supervisor, you're onboard from day one, so I was party to original drafts of scripts and had a creative voice throughout all of the prep. That was fantastic, and I'm keen to stay in a production role.

The forest world of Khofar, where the Wookiee Jedi Master Kelnacca lives in isolation, is introduced in The Acolyte.

Amandla Stenberg plays twin sisters Mae and Osha Aniseya, and Lee Jung-jae portrays Jedi Master Sol Mogra under the direction of Leslye Headland.

Extensive work was done by beloFX to achieve the proper mood and lighting for shots.

Other differences were prevalent. It was a change from things I've worked on over recent years. I've done quite a lot of big Marvel films and various other Star Wars projects where they had a lot more reliance on CG content, Foddy notes. Leslye Headland wanted to try to keep everything real and practical. The mantra was, What would George Lucas have done when he was making the original trilogy? Creatures and sets should be real. I had to approach the project with the view to minimizing the number of visual effects in most shots. Despite the practical mantra, CG elements were unavoidable. One thing that is all CG are the ships, with the exception of a shot where we used a miniature. Often, we had a practical ramp, or the inside of a compartment was a real set, but whenever you see the whole ship from the outside, whether it's on the ground or out in space, those are always CG. The ships were designed by Kevin Jenkins [Production Designer] and the art department, and they always built physical models. Quite often, we had grayscale or balsawood models.
We always had something there to visualize the form of the ship before getting it into CG; that allowed us to talk about things like the scale of the ship and what movement characteristics they would have when in space.

One thing that is all CG are the ships, with the exception of a shot where we used a miniature. Often, we had a practical ramp, or the inside of a compartment was a real set, but whenever you see the whole ship from the outside, whether it's on the ground or out in space, those are always CG. The ships were designed by Kevin Jenkins [Production Designer] and the art department, and they always built physical models. Quite often, we had grayscale or balsawood models. We always had something there to visualize the form of the ship before getting it into CG.
Julian Foddy, Production Visual Effects Supervisor

The mandate was to rely more on practical locations and sets, with beloFX doing some of the needed digital augmentation to the landscape and sky.

CG also assisted in transitioning a signature pink Lightsaber from The High Republic novels into its cinematic counterpart. Full confession, declares Foddy. While all of the other Lightsabers do have tubes of LED lights, the Lightwhip itself did not have a practical component for safety reasons, as well as for technical and practical ones. The Lightwhip was something that Leslye desperately wanted to get into the show. I tasked the ILM art department with coming up with various concepts for how the Lightwhip might work and engaged in lengthy discussions with Pablo Hidalgo [a creative executive in story development at Lucasfilm] to make sure we got this right. Not only did we want to respect the fans of The High Republic novels but also discuss the technology of a Lightsaber and the physics of how the blade would work. We worked that out with Pablo to make sure that we did the whip so it felt plausible. We explored lots of different looks and pursued the idea that maybe the whip was made from small sections of a chain consisting of short Lightsaber blades, or like a cat-o'-nine-tails that had more than one strand. Ultimately, we came down to the simplicity of the single strand. It's down to the nonchalance of how Vernestra Rwoh uses the whip as well. There is a touch of Indiana Jones in the way she flicks it. Leslye was keen to make sure we see that the Lightwhip is not only a badass weapon, but that Vernestra looks like a total badass when she uses it. It is a normal Lightsaber and has been modified to turn into the whip. There is an instance when Vernestra deploys or retracts the whip, and it snaps into a straight blade before returning into the hilt.

Bluescreen was unavoidable but not heavily relied upon by the production.

Martial arts and ninja tactics such as throwing daggers were incorporated into the lightsaber fight choreography. Wherever we could do it for real, we did it for real, Foddy explains. Obviously, you can't throw real knives, but we did have rubber-bladed ones that could be thrown; however, that is still dangerous. In Episode 101, when Mae [Amandla Stenberg] is fighting Indara [Carrie-Anne Moss], who snatches the knives from her, that was a mixture of cunning switcheroos on the day [of shooting]. Amandla Stenberg would hold the knives in position, we'd cut the camera, she'd drop the knives, then we would run the camera again. There are some CG knives in the instances where they're thrown or when Indara pulls them out of Mae's belt.
Most of the shots where you see Mae flipping the knives into her hands, that's all real. Amandla learned all of the knife skills and the manual dexterity to do those things, like twirling the knives from her palm into her grasp.

For safety and logistical reasons, the daggers thrown by Mae were digitally added.

Practical LEDs were used for the Lightsabers to get the correct interactive light, except for the Lightwhip.

The architecture and spaceships of the High Republic are not significantly different from what is seen in the prequel trilogy, which occurs 100 years later, as technology evolves slowly in the Star Wars universe.

Anything that the cast would have to interact with was physically constructed and the rest extended in CG.

A simple design language was utilized for the UI.

Coruscant of the High Republic Era had fewer levels than what was seen in the prequel trilogy.

Elements like hyperdrives were treated as heavy, bulky attachments for the spacecraft.

The Trade Federation Fallon freighter ship is brought to life by ILM.

Full confession. While all of the other Lightsabers do have tubes of LED lights, the Lightwhip itself did not have a practical component for safety reasons, as well as for technical and practical ones. The Lightwhip was something that [Director] Leslye [Headland] desperately wanted to get into the show. I tasked the ILM art department with coming up with various concepts for how the Lightwhip might work and engaged in lengthy discussions with Pablo Hidalgo [a creative executive in story development at Lucasfilm] to make sure we got this right.
Julian Foddy, Production Visual Effects Supervisor

Twins Leah and Lauren Brady were used for the scenes of Young Mae and Young Osha, while Amandla Stenberg portrays the adult versions of both sisters. You will see more of the adult twins together as the series progresses, Foddy reveals. It has been a mixed approach. Amandla had a performance double named Shanice Archer, who was fantastic, and she spent several months in prep with a movement coach learning to adopt Amandla's physicality. All of the over-the-shoulders are Shanice. Quite a few shots have split screens. Also, we had quite a lot of involvement in the walk-throughs with the directors of photography and talked about how we could shoot scenes in a way that favored over-the-shoulders or seeing the two girls from behind. There are only a few instances where we had to worry about being able to do both of the twins' faces. As well as split screens, we quite often shot switcheroos as well, where I would literally have the girls swap places. In some cases, that gave us a facial element to put on the other actor that was done in exactly the same lighting conditions and camera position.

A breakdown of a shot by ILM from plate to final compositing.

Impacting the world-building was the time period of the High Republic. We are 100 years before the prequels. This is something we consulted with Pablo Hidalgo and Dave Filoni [CCO at Lucasfilm] about, Foddy states. Dave pointed out that the Republic has existed for thousands of years, so technology evolves at a fairly slow rate in the Star Wars universe. What that meant is, for the previous 100 years, things are not going to look radically different in terms of ship design, technology available or weapons. We did change a few little things to help sell that 100 years. If you look at Coruscant, the city is constantly building on top of itself, and the layers keep growing.
A hundred years earlier, the surrounding buildings around the Jedi temple were all a bit lower. You can see that in some of the establishing shots. It's a subtle touch while still keeping it recognizable for the audience. A good example of different technology is the ships. While the ships are capable of hyperspace travel, hyperdrives at this point in time are so big and bulky that they're a separate attachment. We've seen this in the Star Wars prequel films, where ships have a hyperdrive ring that attaches to the ship and allows them to travel at light speed. Every ship in this era has to have an attachment because the drives are too big and bulky. We see that in Episode 101. When the Polan-717 arrives at Karlach, it detaches from its rear section and makes its way down to the planet.

Ultimately, we came down to the simplicity of the single strand [for the whip]. It's down to the nonchalance of how Vernestra Rwoh uses the whip as well. There is a touch of Indiana Jones in the way she flicks it. [Director] Leslye [Headland] was keen to make sure we see that the Lightwhip is not only a badass weapon, but that Vernestra looks like a total badass when she uses it. It is a normal Lightsaber and has been modified to turn into the whip. There is an instance when Vernestra deploys or retracts the whip, and it snaps into a straight blade before returning into the hilt.
Julian Foddy, Production Visual Effects Supervisor

Among the extensive simulations for ILM was the fire involved with burning down the witch coven castle on Brendok.

Full CG elements were among the most complex to design, create and execute. The Coruscant establishers were quite a lot of work to provide all the level of detail of the city and the ships flying in the background, Foddy remarks. For Episode 103, with the big fortress on the top of the mountain [which features fire and destruction], anytime you bring effects simulations and animation into the mix, that's where things get complicated. There were lots of layers going into the composite and multiple departments involved. There is also some complex creature work. In Episode 103, the look development for the translucent butterfly creatures that the girls are playing with at the opening took a long time to get right. There is an almost Jell-O-like feel, and you can see light scattering through them. They had an iridescence to them. That was complex work to get that looking right and feeling natural, in particular when the creature lands on her finger and sits there.

Miniatures of the spaceships were created as points of reference by Production Designer Kevin Jenkins.

The Acolyte forges its own way narratively and aesthetically. The Acolyte is the first live-action series to explore new characters in a completely new story arc outside of the Skywalker Saga. When Disney acquired Star Wars, there was talk about exploring the Star Wars universe, and that is what The Acolyte is doing. It's allowing existing Star Wars fans to see more of what goes on in the other planets and in the history of the galaxy prior to the times that we've seen in the films. Also, for people new to Star Wars, you come at this fresh and don't have to have seen any of the other films or books because it's all new characters. Hopefully, it will open your eyes to the bigger galaxy.
-
WILLIAM SHATNER: HONORING AN ICON
By NAOMI GOLDMAN

Captain James Tiberius Kirk in Star Trek: The Original Series. (Image courtesy of CBS Studios/Paramount)

William Shatner has boldly taken audiences to the final frontier throughout his remarkable seven-decades-long career. As an Emmy- and Golden Globe-winning actor, director, producer, writer and recording artist, Shatner remains one of Hollywood's most recognizable figures. With his portrayal of Captain James T. Kirk in the legendary science fiction television series Star Trek: The Original Series and in seven Star Trek movies, Shatner is the originator of one of the most iconic science fiction characters in history.

William Shatner has been at the center point of compelling stories that use visual effects to enhance unforgettable storytelling for decades, and his work continues to leave an indelible mark on the cultural landscape, said Nancy Ward, VES Executive Director.

For his exceptional work in the epic Star Trek franchise and in recognition of his cinematic legacy that continues to touch new generations of filmmakers, creatives and audiences, the Society recently honored Shatner with the VES Award for Creative Excellence at the 22nd Annual VES Awards.

The Twilight Zone episode Nick of Time. (Image courtesy of CBS Photo Archive)

With James Spader in Boston Legal. (Photo: Carin Baer. Courtesy of ABC)

Shatner became the oldest person to fly into space at age 90 after completing his mission on Blue Origin NS-18 on October 13, 2021. (Image courtesy of Blue Origin/Reuters)

The Twilight Zone episode Nightmare at 20,000 Feet. (Image courtesy of CBS Photo Archive)

T.J. Hooker. (Image courtesy of Sony Pictures Television)

With friend Seth MacFarlane backstage at the 22nd Annual VES Awards after receiving the Society's Creative Excellence Award. (Image courtesy of the VES)

Seth MacFarlane, award-winning actor and creator of Family Guy, The Orville and Ted, gave an epic tribute in presenting the Creative Excellence Award to his longtime friend: Star Trek laid out a vision as relevant then as it is today, of a future where we all aspire to be nobly forward-looking and to improve the human condition. It continues to live large in our collective consciousness and remains relevant for generations of viewers. But Star Trek's center of gravity has always been William Shatner. Bill has done something we all can only hope to do. He has made a permanent mark on this industry that is all his own. His work will endure for as long as there is an entertainment industry. He is a colossal talent, a great performer who has never lost his sense of curiosity or adventure.

In accepting his award, Shatner remarked, This industry is filled with the most creative people; every one of them is an artist who is always thinking ahead. The work is progressing at such a great speed, and it's amazing what visual effects can bring to life. In the beginning [of Star Trek], it was just a flashlight and a cardboard Enterprise! Visual effects have become a truly organic and immersive visual experience, and I accept this award for those great artists, men and women who work beyond the imagination. Thank you to the Visual Effects Society for bestowing me with this honor.
-
JOYCE COX, VES: CELEBRATING A PRODUCTION PUZZLE MASTER
By NAOMI GOLDMAN

Lifetime Achievement Award recipient Joyce Cox, VES with her award.

Acclaimed Producer and VFX Producer Joyce Cox, VES describes her job as a wire walker between creative goals and financial restrictions, one who helps realize a filmmaker's vision to create the best project possible on time and on budget. A self-described lover of puzzles, Cox built a brilliant career out of her innate talent for organizing people and moving parts, a skillset that has branded her a luminary in the world of visual effects producing and one of the most respected producers in our industry.

With credits that include Titanic, The Dark Knight, The Great Gatsby, Men in Black 3, Avatar and The Jungle Book, Cox has been instrumental in shaping popular culture for decades, and her work has put VFX squarely at the center of big box office filmed entertainment. She has produced 13,000 visual effects shots with budgets totaling in excess of $750 million and won three VES Awards for her work on Avatar and The Dark Knight. In 2018, Cox was honored with the title of VES Fellow.

In recognition of her exceptional career as an educator, changemaker and exceptional contributor to the visual effects craft and global industry, the Society honored Cox with the VES Lifetime Achievement Award at the 22nd Annual VES Awards.

After receiving a standing ovation from her peers, Cox shared her appreciation: I'm truly honored to be given this prestigious award from the VES celebrating my career and the opportunities it gave me to facilitate the work of the thousands of artists, technicians and visionaries it took to create these movies. It's been a privilege to work with and learn from so many brilliant, dedicated people who gave life to words on a page, transforming pixels and dreams into worlds that captivate and inspire, and that is nothing short of magic. This award celebrates not just my achievements, but the collective triumphs of a creative community, and shines a light on the value of Visual Effects Producers.

Cox continued, Because being a VFX Producer is still a fairly new position in the film industry, we tend to disappear, with most of the emphasis on how VFX is made falling to the VFX Supervisor. But to produce and succeed in this job, you have to understand every department's role and absorb their demands and restrictions and precisely how VFX can support and achieve the end goal of producing the best movie. So having this role recognized by the VES, and me as a woman in this role, means so much.

Harkening back to her early life, Cox recounted, Unlike many in this industry who set their sights early in life on a career in film, I arrived along a circuitous path of happy accidents. I grew up in a small Kansas community in the '50s and '60s, a time and place where most girls, including me, were not mentored toward careers. Certainly not a career in film. Cox highlighted her parents as her first role models. My mother had a brilliant math mind, and my friends referred to my dad as a metaphysical cowboy, a poet trapped in a laborer's body. They married young and neither had a high school degree. Looking at my mom's trajectory, she riveted nose cones on fighters, taught herself how to do the books in the aircraft industry and went on to become one of the first women executives at Boeing Military.
That focus and drive to grow and achieve was a great source of inspiration.

VES honoree Joyce Cox, VES backstage with VES Chair Kim Davidson, VFX Producer Richard Hollander and VES Executive Director Nancy Ward.

I get to explore the diversity of highly creative and exceptionally smart people and be a part of how those minds take words off the page and realize them through an intense process into a beautiful film experience.
Joyce Cox, Producer and VFX Producer

Cox pursued her education in Kansas, taking classes at Wichita Business School and Kansas City Community College, and got an early exposure to business working in a series of office positions in everything from manufacturing aircraft parts to real estate. Then she was enticed to start her creative career. My brother worked in advertising as an art director in Chicago at hot boutique agencies and his life was really appealing, so I moved to Chicago and started representing artists. It was the mid-'70s when I started my first company, Joyce Cox Has Talent, which was really the window into the creative process and the gateway to my future. I was smitten with the way concepts were realized into images and stories for the funniest person I have ever met. In addition to the value of humor, Jim taught me the value of film professionals and what it takes to execute a production.

VFX producing is difficult on both the vendor and client side. It is just amazing how Joyce was able to carve out the Joyce side by asking both the client and the vendor equally hard questions, sometimes in front of each other. I called it the Joyce quasi-state, a place between the two sides. She was able to walk that thin razor's edge, revealing, with her characteristic humor and wit, the underlying issues, and keep the production on track.
Richard Hollander, VES

VES honoree Joyce Cox, VES shows off her VES Lifetime Achievement Award.

VES honoree Joyce Cox, VES hits the VES Awards red carpet with friends and family.

VES Lifetime Achievement Award recipient Joyce Cox, VES shares a warm moment with VFX Producer Richard Hollander before the gala.

Cox moved to Los Angeles in 1980 and over the next 15 years produced hundreds of commercials, eventually taking a position as the executive producer for Bruce Dorn Films, where she had her first opportunity to work with digital visual effects. In the mid-'90s, a time when digital technology was rapidly evolving into its present role as a creative and technical cornerstone for filmmaking, Cox transitioned from the role of commercial producer to producing visual effects for feature films.

Several years producing commercials, many involving visual effects, from storyboard concepts to final delivery, proved to be the perfect primer for producing visual effects for movies. One day, a dear friend, Lee Berger, asked me to fill in for him on a project at VIFX, a digital facility that had recently been purchased by 20th Century Fox. Always looking for a new challenge, I said sure. That was the beginning of the career VES honored with this award. The timing was perfect. I stepped into this world at the beginning of its rapid growth into the massive industry we have today.

For the next five years, Cox worked as a facility VFX producer on numerous film projects, including Titanic, Pushing Tin, Fantasia 2000, Harry Potter and the Sorcerer's Stone and How the Grinch Stole Christmas.

One of the first projects I worked on at VIFX was Out to Sea with Jack Lemmon and Walter Matthau, then I moved on to Titanic.
I was there to help organize the facility to be more effective at a time when that was really needed. Richard Hollander, VES was the President and Senior Visual Effects Supervisor for VIFX under Fox's ownership and my collaborator on Titanic. I just started asking Richard questions about digital art and how to organize a production workflow. He was a huge influence in providing knowledge and mentorship.

Cox continued about her work on Titanic. It was like jumping into the pit of fire and learning under pressure, all at once. Jim [Cameron] was popular, but not like now, and we were looking at a runaway budget while he had the power to hold onto the reins. Plans constantly evolve. Movies are all theory until you shoot and cut and try to actually make them. This experience was intense and challenging, and coincided with my husband's cancer diagnosis, which actually helped me keep perspective on what matters most in life as I went about my job.

Jim is one of those uncompromising directors who wants to push things to the edge with the use of technology. The drowning scenes were shot in a tank in Mexico, and since it was very hot, you could not get any visible cold breath coming from the actors. At the time, the capacity to render cloud particles to that degree was unreliable, so we built a black cold room and my husband shot it. We had an actor in black read the lines. We captured his breath and had compositors working on Flame roto-ing hands and placing breath. It was one of many shots that called for our best problem-solving to bring the director's vision to life. And it looked cool.

In presenting the VES Lifetime Achievement Award to Cox, Richard Hollander, VES extolled her keen abilities. I began working directly with Joyce as my VFX Producer on several projects, including Titanic and Harry Potter and the Sorcerer's Stone. I was able to experience her skillsets firsthand. She was able to glide through discussions with our clients, portray the situation and tell them the truth, which was not something they always wanted to hear. Even with this frankness, our clients trusted her. There it was. A natural in our VFX workplace. I knew then that her career was only beginning.

Hollander continued, VFX producing is difficult on both the vendor and client side. It is just amazing how Joyce was able to carve out the Joyce side by asking both the client and the vendor equally hard questions, sometimes in front of each other. I called it the Joyce quasi-state, a place between the two sides. She was able to walk that thin razor's edge, revealing, with her characteristic humor and wit, the underlying issues, and keep the production on track.

VES honoree Joyce Cox, VES celebrates with friends at the VES Awards.

Cox was honored with the title of VES Fellow in 2018, presented to her by former VES Board Chair Mike Chambers.

In 2000, Cox moved to the production side. Over the next 20+ years, she worked with some of the world's most talented directors and crews, creating beautiful, powerful and groundbreaking films, including: Superman Returns and X2: X-Men United with Bryan Singer; Avatar with James Cameron; The Dark Knight with Chris Nolan; The Great Gatsby with Baz Luhrmann; Men in Black 3 with Barry Sonnenfeld; and The Jungle Book with Jon Favreau.

My time in digital facilities was instrumental because I now had the ability to understand facilities and be both compassionate and demanding of them.
On the production side, I liked being one of the first hired and one of the last out, so I could participate in and observe the entire creative process.

Looking back at her decades in the film industry, Cox points to her takeaways and what she considers markers of success. I have learned something on every single movie I've ever done because the technology is moving so fast and is antiquated by the time I've jumped to the next project. I get to explore the diversity of highly creative and exceptionally smart people and be a part of how those minds take words off the page and realize them through an intense process into a beautiful film experience.

During the making of the films, I see all the pieces thousands of times, but when all is done and we're in the theater and the audience knows none of the pain it took to birth this project, it feels good. It means we're giving people something that inspires or enriches their lives.

My job is not necessarily the most fun as the one with fiduciary responsibility, but it has also been my love of challenges, of puzzles, that has made this such a rewarding career. Motivating people to the common goal of making the best movie on time and on budget is where I have had the opportunity to excel. When asked how I do it: I maintain altitude. I get my ego out of the way to help the team achieve. And together, we navigate the often-rocky journey and create something that is greater than what we could have achieved without this harmonic convergence.
-
IMAGINATION REIGNS SUPREME FOR KINGDOM OF THE PLANET OF THE APES
By TREVOR HOGG
Images courtesy of Walt Disney Studios Motion Pictures.

Capturing Owen Teague's distinct facial expressions allowed the actor's face to come through in the character Noa.

Where the original Planet of the Apes pushed the boundaries of prosthetic makeup and the prequel trilogy introduced photorealistic CG apes, Kingdom of the Planet of the Apes provided an opportunity to expand upon the digital cast members and their ability to speak without relying heavily on sign language.

The story takes place approximately 300 years after War for the Planet of the Apes as Proximus Caesar attempts to harness long-lost human technology to create his own primate kingdom. This is about apes all the way through. The world is upside down and the humans are now these feral little creatures running around in the background, states director Wes Ball, who was responsible for The Maze Runner franchise and is laying the groundwork for another Planet of the Apes trilogy. We're going to have more talking, and the apes are going to be acting more human-like because this is marching toward the 1968 version where they are full-on walking on two legs.

Continues Ball, In terms of the visual effects of it all, you've got all of these amazing new developments that Wētā FX has done from Avatar: The Way of Water. Rise of the Planet of the Apes came on the heels of the performance capture leap forward. [We looked at] all the tech on Avatar Wētā FX had provided, and then we took it outside, Visual Effects Supervisor Erik Winquist recalls. From a hardware and technology standpoint, one of the improvements is now we're using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor's face. It allows us to get a much better sense of the nuance of what their face was doing.

The beach sequence was inspired in part by the original 1968 movie.

Avatar: The Way of Water used old Machine Vision cameras that straddled the matte box on the main hero camera. We did the same thing here in every instance, and it has allowed us to get a wider field of view and also a stereo mesh of whatever was standing in front of the camera, Winquist explains. If we need to harness that to help reconstruct the body performance of what the actors are doing, we can use that as an aid in terms of reconstructing what their limbs were doing that we couldn't see off-screen from the main camera. Unlike the previous three films, this was shot with Panavision anamorphic lenses, so we no longer had that extra real estate above and below the frame lines like we did when we were shooting spherical, so that came in handy there. The other obvious thing that we took from Avatar: The Way of Water was the water itself. There were literally two shots in War for the Planet of the Apes where Caesar goes over the waterfall and winds up in the river down below. Those shots were definitely a struggle back in 2017 when that was done.
Since then, with all of the additional tech that had to be done for Avatar: The Way of Water to deal with the interaction of hair and fluids, we could leverage that in this movie to great effect.

Kevin Durand had a great time portraying Proximus Caesar, as demonstrated by his vocal performance.

The Eagle Clan was modeled on Mongolian cultures that are deeply tied to eagles.

The most difficult character design to get right was Noa (Owen Teague), who is shown here with Dar (Sara Wiseman).

Clean plates had to be shot without the motion capture performers, which meant that camera operator Ryan Weisen and actress Freya Allan, who plays the human character Mae, had to recreate the camera movement and performance from memory. Ryan has gotten really good at repeating the moves, states Cinematographer Gyula Pados. In the last couple of weeks, Erik came up with the Simulcam system where they can play back live what we shot, overlaid on the camera, so you could see the actors as simple 3D apes. It was equally difficult for the cast. Having to act against air is not an ideal situation, Freya Allan admits. That was probably the hardest part of it, not being able to stare, to have a proper conversation with somebody where you're looking at them, at least. I also had to do some bizarre things, like I had to hug the air. The suits and cameras didn't bother me too much. They embody the apes so well that I was more focused on that than what they were wearing or the camera on their head. Though sometimes they had to take the camera off because if they were too close to me, it would start bashing me in the face. I spent more time making fun of them, especially when they had to wear blue suits to interact with me.

From a hardware and technology standpoint, one of the improvements is now we're using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor's face. It allows us to get a much better sense of the nuance of what their face was doing.
Erik Winquist, Visual Effects Supervisor

The amount of dialogue has been increased, with one of the more talkative characters being the orangutan named Raka, portrayed by Peter Macon.

Different cultures were represented by various ape clans. Originally, we were talking about them having their own coins, but that never became necessary in our narrative, Production Designer Daniel T. Dorrance explains. The Eagle Clan is primitive and lives off of the land. Nothing from the human environment. Everything is organic, made from the earth. They never went beyond the Forbidden Zone because they knew once you're in the human world, there's danger. When we're traveling, for the most part, we did all of these different things along the way. Noa meets Raka, and we're starting to see human elements creep in a little bit. Raka is a picker and has little stashes of things around his place. As we get to the end of the movie with Proximus Caesar, we see that they're living off the human environment. Everything is made of metal that they've taken from the ship, and they have turned it into things that help them to survive. Village scenes were not straightforward. You can only capture five people at a time, Dorrance reveals. Normally, in Maze Runner we have a street full of people, and they're crossing the street doing the things that extras usually do. None of that happened.
You're sitting in front of a whole village set with everything that we dressed in, things that would normally be people chopping wood or whatever it might be. Those things were there, but no one was doing them on the day. All of that was done in post.

Outside of a last-minute production change that saw principal photography take place in Australia rather than New Zealand, the trickiest part of shooting outdoors was the amount of greens required. Part of the fun of this movie is [observing up close how] so much time has passed that our world is slowly erasing, Ball states. There is this great story about these guys when they found all of these ruins in South America that at first looked like a mountain. They didn't realize that it was a giant pyramid until they cleared away thousands of years of overgrowth and trees. I loved that concept for our world, and that's how we get to the 1968 version where there are Forbidden Zones and whole areas of worlds that have been lost to time. That's what we're building in this world. This sense of the Lost World living underneath Noa's nose, and one that he has to uncover and learn about and ultimately be affected and changed by.

Decommissioned coal factories and power plants were photographed and painted over digitally to create ruined buildings overtaken by centuries of vegetation. One of the things that I was looking at early on was the book The World Without Us, which hypothesizes what would happen in the weeks, years, decades and centuries after mankind stopped maintaining our infrastructure, Winquist explains. You start pulling from your imagination what that might look like, and we had concept art to fall back on. We started from the bones of some of the skyscrapers that Wētā FX did for Wes' The Maze Runner films, stripping away all of the glass, turning all of the girders into rust and then going crazy with our plant dressing toolset to essentially cover it up. The great thing is we still had that live-action basis that we could always refer back to. What was the wind doing? How much flutter in the leaves? You have a solid baseline for moving forward.

A daunting task was getting the look of the protagonist right. When we first saw some of the concept art for what Wes had in mind for Noa, I was like, He looks a lot like Caesar in terms of the skin pigmentation and the specific way the groom sat, Winquist acknowledges. Some of that is deliberate, but Noa is his own ape in every way. We learned back on Gollum to incorporate the features of the actor into the character. Everybody has some amount of asymmetry to their face, but Owen Teague has this distinct slight difference in where his eyes sit in his face. What we ended up doing was mimicking a lot of those asymmetries. Often, when Owen would play frustrated or apprehensive, he would do something distinct with his lips. There were some key expressions that we wanted to make sure that we nailed.
When it's working, it's beautiful because you suddenly see the actor's face coming through the character.

Kingdom of the Planet of the Apes benefited from the stereo facial cameras that allowed a 3D mesh to be constructed of what the actor's face was doing.

The actor provides the soul of the character, but it is the animator who needs to figure out what that means in the context of an ape's face.

Concept art was assembled into a book by Wes Ball that was provided to the entire team and updated weekly.

The culture and architecture of the Eagle Clan reflect their mantra of living off the land.

Fortunately, Editor Dan Zimmerman had an established shorthand with Ball after cutting The Maze Runner trilogy, and considering the mammoth task ahead of him, he recruited his former first assistant editor Dirk Westervelt as a co-editor on the project. First of all, it was daunting because I have never done any version of this movie or production before in my life, Zimmerman reveals. I was like, They shoot a scene. I get the scene. I cut the scene. The cores of what you have are what they are. But sometimes you truly have limitless options. You can do what you want, and not only with shot selection and performance. You can choose a word from one performance and put it into a different performance because someone didn't say that word right or flubbed it, or you can stitch performances together to create a performance that then goes into a shot. I had to wrap my mind around that whole aspect of cutting. I would turn the monitors off because what I would try to do is listen to the takes and try to figure out, if I were to watch this scene, what are the best performances of the scene that I want to make work, and the flow of it. I would basically do an audio assembly of all of the different performances and go, That looks good. And then turn the picture back on and ask, What mess am I into now? And figure out from there how to manipulate it, and then after that choose the shot that those performances go into. It was a whole process for me. There was definitely a learning curve.

Scenes and environments were mapped out in Unreal Engine. In terms of set work and set extension work, we used a lot of Unreal Engine in this movie, Dorrance explains. Every set and location that we designed and drew, we would actually plug into Unreal Engine and have it in real-time lighting. Wes likes to work in Unreal Engine so he can play with his camera moves. In doing that, I have to extend it anyway in that environment; otherwise, I'm only dealing with the foreground. We continued to design the world beyond for every set possible. Cinematographer Pados also took advantage of Unreal Engine. There's a big action sequence, which I thought maybe we can do in one shot, and I could build it and show it to everyone. It was like, This is what I thought. What do you think? That part is useful for me because before that you would start to talk and people were nodding, but you see that they can't see it. Sometimes I feel that using Unreal Engine has changed my life over the last couple of years. I can show them scenes, and it's much easier for me to translate, Pados says.

Improvements were made by Wētā FX in profiling cameras. For the first time, we have actually built a device to measure the light transmission through the lenses, in terms of what the lenses are doing spectrally, to work into our whole spectral rendering pipeline, Manuka, Winquist remarks.
That has been one of those elusive pieces of information that we have never had before. It's not a huge thing visually, but it has been an interesting addition to our spectral rendering pipeline.

All of the media in the cutting room was made available online for Wētā FX. We were able to quickly hand over bins of cuts that would then relink to our media, which was the same on the Wētā FX side in New Zealand, Zimmerman states. We call it Wētātorial, and James Meikle [Senior Visual Effects Editor, Wētā FX] was amazing over there. Basically, he and my visual effects editors, Logan Breit and Danny Walker, would communicate and say, He changed this. We're going to send you a bin, but then we're going to send the paperwork with it. James could then open up that bin, and we could tag it in a way that he could see what the change was, or if it was a performance swap or something like that. James could then easily relay to Animation Supervisor Paul Story what the change was and when to expect the change and all of the data to make the change happen.

Freya Allan is one of two human characters, with the rest of the principal cast being CG primates.

Nature reclaiming areas once inhabited by ancient civilizations was a major visual motif.

Proximus Caesar believes that ancient human technology is the key to being able to establish a kingdom.

Some shots had to be done entirely in CG.

The hardest part for Ball has been the sheer process of making Kingdom of the Planet of the Apes. To shoot something that isn't really the image, from the clean plates all the way to the end of making choices about shots, and looking at storyboards and not seeing that for six months until the last two weeks when you can't change it [is frustrating and difficult], Ball notes. And it all has to come back together. I talk about this idea of the cliché movie scene of the waiter with a whole bunch of stuff on a tray. He falls, and it all goes up in the air, and somehow it all comes back down and lands. That's what we're doing. How do you make something that feels organic, real, spontaneous and alive, but is so slowly pieced together by choices made over years? That has been a hell of a learning experience for me and a fun one. I enjoy a good challenge.
EMMY VFX HOPEFULS RISE TO THE CHALLENGE TO SERVE THE STORYTELLERS

By CHRIS McGOWAN

One of the biggest VFX achievements on Masters of the Air was the focus on historical detail, which carried over to the depth of the effects, highlighted by a complex choreography of hundreds of planes in battle, diving, zooming past and breaking contrails. (Images courtesy of DNEG © 2024 Apple Inc.)

Big-screen VFX continues to stretch the small screen, and the effects keep getting more essential and cinematic, as evidenced by the shows eligible in the Outstanding Special Visual Effects categories (in a Season or a Movie, or in a Single Episode) of the 76th Primetime Emmy Awards on September 15. The contenders represent a rich segment of the VFX work that has become the backbone of high-end episodic television, including world-building, digi-doubles, face replacements, de-aging, simulations, environmental and invisible VFX. Following is a look at some impressive VFX-infused TV/streaming shows poised for an Emmy nomination, with VFX supervisors revisiting their VFX highlights.

When it comes to world-building, no canvas is broader or more complex than sci-fi. "The largest VFX challenge on Season 2 of Foundation was executing all of the different types of visual effects required on the show while maintaining the quality we had established on Season 1," states Visual Effects Supervisor Chris MacLean. "The variety of visual effects was daunting given we had to complete complex CG environments, CG creatures, giant CG mechas, CG destruction, water simulation, CG vehicles and holograms, just to name a few. If we are talking about a challenging sequence, I would have to say that the escape from Synnax in Episode 202 was one of the most difficult. There were a lot of practical sets and stunt work that had to seamlessly integrate with CG water simulation and stunt work. Beki, our domesticated Bishop's Claw, was a huge win for us. Knowing how difficult it is to make a ridable CG animal feel grounded in reality, the team planned and executed this flawlessly."

Jason Zimmerman, VFX Supervisor on Star Trek: Strange New Worlds for Paramount+, embraces the opportunity to add to the Star Trek legacy. "A big challenge with any Star Trek show is always working with canon on some of the fans' most beloved characters, ships and effects," he says. "In the case of Star Trek: Strange New Worlds Season 2, we saw more of the Gorn than had been seen in many years. In addition to the stellar work from Legacy Effects in creating the practical Gorn, we worked to augment it with additional facial animation, drool, breath, etc. We also did entirely CG shots of the Gorn, or CG Gorn with practical actor interaction, to help tell the story in the final episode of the season. It was crucial to our showrunners that we seamlessly integrate the CG moments with the practical to aid in the storytelling."

Season 4 of For All Mankind saw a huge expansion of the Happy Valley Mars Base, which included designing and creating dozens of new modules, landing pads, roads, terra-forming, vehicles and connecting infrastructure to show the huge growth over roughly a decade. (Images courtesy of Apple TV+)

Zimmerman points to the Gorn fight and the conclusion of the last episode of the season as peak experiences. "In addition to the full-CG Gorn, facial performance enhancements and fight sequence, we had the scene take place on a backdrop of a partially destroyed ship hull destined to crash.
Combining the practical and CG fight action and set extensions inside, with the exterior full-CG beats as the ship begins to enter the nearby planet's atmosphere, was both challenging and fun to play with as a team and with our vendors. The full-CG exterior shots and destroyed ship assets were massive, requiring quite a bit of simulation in the debris field, re-entry fire and smoke, etc. Cut together with the interior fight scene with the CG Gorn, along with the eventual escape of our heroes to the exterior in what became a full-CG shot with digi-doubles, it was quite challenging but ended up as one of our favorite shots during our tenure on the show."

Complex CG environments, CG creatures, giant CG mechas, CG destruction, water simulation, CG vehicles and holograms were just a few of the many VFX tasks required on Season 2 of Foundation. (Images courtesy of Apple TV+)

Beki, the domesticated Bishop's Claw, was a big success for VFX on Season 2 of Foundation, knowing how difficult it was to make a ridable CG animal grounded in reality. (Image courtesy of Apple TV+)

Orchestrating the wide variety and different types of VFX work necessary to bring the story to life was a daunting task for Jay Worth, Visual Effects Supervisor on Fallout for Amazon Prime Video. "I have worked on shows in the past where they were primarily a set extension show or genre or world-building," Worth says. "However, in this one we had everything. We needed to help create the overall look and feel of the world we were inhabiting. We needed to develop multiple real-time environments for use in Unreal for shooting on a volume. We had multiple creatures with various skins and textures. We had human characters that needed photorealistic replacements to portions of their face. We had characters we were de-aging using new and cutting-edge methods, and unique hard-surface vehicles that needed to match 1:1 to practical production vehicles, as well as a lead character that needed to have a CG nose replacement throughout the entire series."

Worth and his team fell in love with the Cyclops. "I remember [Executive Producer] Graham [Wagner] calling me to say they wanted to do a cyclops," Worth recalls. "When we started to talk about it, I realized how crucial this character was and how nuanced his performance would be. So, we tested a few methodologies: a pure compositing approach and an AI-generated approach. However, both of those had limitations in terms of the variety of environments we were shooting in, along with the performance flexibility we knew we would need. Chris Parnell's performance was the primary thing we knew we needed to nail if we were going to pull this off, so we partnered with our long-time collaborators at Important Looking Pirates in Sweden and were ecstatic with the results. We were able to capture Chris's performance, the humor, heart and nuance, while creating a full CG effect. I feel like we were able to push past the uncanny valley."

Visual Effects Supervisor Douglas Purver adhered to an extraordinary level of detail for Season 2 of HBO's The Gilded Age. "With a show this opulent in its set design, costumes and more, there's been a fine line to walk with the effects work. We don't ever want to call attention to ourselves while maintaining the level of detail that seamlessly blends in with what's captured on camera. Most of the time we can use elements from what's practically there. I'm constantly taking stills of textures and architectural details, or we bring in a team to get high-resolution scans.
But often we are creating things from scratch, and finding real-world reference is challenging, involving a deep dive into historical texts and postcards or a significant collaboration with our production designer and locations department to find elements that can fit into our world."

The season's climax found Purver and his team at the opening of the Metropolitan Opera House, which was filmed in three different locations, weeks apart from one another, "at a stage in Albany, New York, the main Opera House in Philadelphia and a set of five opera boxes built on our film stages," Purver details. "Getting them all to sit together, especially when the camera wraps around Bertha as she enters her box for the first time, was extremely satisfying. Building the CG crowd to blend with our tiled plates filled the entire venue and gave it the grand opening it deserved. Being able to collaborate with the production designer on how much to build and where, with the cinematographer on light placements and how and when to move the camera, and the director on which story pieces to shoot where, along with the amazing VFX team who contributed countless hours to allow the viewer to stay in the moment and marvel at this climactic, cinematic moment in our story, was just a fantastic experience."

Adding to the legacy was an opportunity and a challenge for the VFX team on Season 2 of Star Trek: Strange New Worlds, as they explored new worlds and new ways to combine the CG moments with the practical to aid in the storytelling. (Image courtesy of Paramount+)

Orchestrating the wide variety and different types of VFX work necessary to bring the story to life was a daunting task for the VFX team on Fallout, including the development of multiple real-time environments for use in Unreal Engine for shooting on a volume. (Image courtesy of Amazon Prime Video)

Tim Crosbie, Visual Effects Supervisor on Season 3 of The Witcher for Netflix, called out Episode 6, where almost every shot required some form of VFX, from the spells cast during the battle to the many set extensions throughout all of the exterior fights, to the destruction of Tor Lara and Aretuza towards the end. "Our on-set teams had their work cut out. We knew we needed to provide very accurate lighting and LiDAR data to ensure that post-production ran as smoothly as possible because the schedule was going to be tight to get all the shots through. All of our vendors came to the party and produced really beautiful work to help tell the story. This show was one of the more collaborative ones I've worked on, with everyone pulling in the same direction. I think the most satisfying accomplishment for us in VFX was how much value we were able to bring to the story."

Charlie Lehmer, Visual Effects Supervisor on All the Light We Cannot See for Netflix, cited the rampart run sequence in Episode 4 as the most challenging. "Early on in pre-production, director Shawn Levy emphasized the need for its immense scale, leading us to explore numerous shooting solutions. However, as ambition grew, on-location filming became unfeasible. Constructing a fully-CG, period-accurate St. Malo [a historic port city in France] environment demanded rigorous research and planning. The core difficulty lay in marrying historical fidelity with our artistic narrative vision," Lehmer says. "We dedicated a week to thoroughly scanning and photographing St. Malo. Archival footage, both pre- and post-bombing, was further analyzed to ensure as much accuracy as possible.
ILM did an amazing job digitally transforming the modern town into its 1944 counterpart. The result was a full CG city of St. Malo, procedurally built to allow for extensive bombing and collapse of various buildings."

Crafting the detailed destruction of St. Malo was a great source of pride for the VFX team. "Much of our ground-based filming was set in the charming Villefranche-de-Rouergue," Lehmer reveals. "Digitally transforming it into a ravaged postwar St. Malo presented a considerable yet rewarding challenge. We aimed to surpass conventional depictions of bombed-out cities, which focus solely on brick and mortar, instead prioritizing granular detail for profound visual impact. Our rubble wasn't mere debris; it was imbued with poignant elements: teddy bears, pianos and intricate paintings lending an unsettlingly personal dimension."

John Haley, Visual Effects Supervisor on Marvel Studios' Echo for Disney+, was heavily focused on the main action sequences in Episode 2. "Bushto and the train heist presented challenges due to their scope and complexity. Recreating the Choctaw Bushto environment for the stickball sequence, which takes place in the year 1200 AD in what is now Alabama, required careful research. The production and VFX teams worked with historians and cultural consultants to ensure that the sensitive historical details were correct. All the shots in the sequence were augmented with visual effects to make the game and scene as realistic as possible. The team at ILM thoughtfully created the environment and CG background characters to portray life in 1200 AD before the arrival of Europeans."

An extraordinary level of detail was required for Season 2 of The Gilded Age. Working in close collaboration with the production designer, the VFX team referenced historical texts and postcards to produce effects that seamlessly blended in with what was captured on camera. (Images courtesy of HBO)

Almost every shot required some form of VFX on Episode 6, Season 3 of The Witcher, from the spells cast during battle to the many set extensions throughout the exterior fights, to the destruction of Tor Lara and Aretuza. (Images courtesy of Netflix)

Continues Haley, "[For the train heist], combining live-action acting, stunt performances, digital doubles, face replacements, practical train cars, CG train cars and effects in photographed and digital environments seamlessly into the action-packed train heist sequence was no easy feat. When Maya Lopez plunges off the highway overpass onto the speeding train below, the VFX team used all of those resources to achieve the shot, transitioning from the plate photography of Alaqua Cox to stunt photography on bluescreen to a full Maya digital double, back to Alaqua on a bluescreen, all in a photorealistic all-CG environment. Whew! We wanted the train heist sequence to feel grounded and gritty, choosing camera positions as if we were shooting the scene on a fast-moving train or from a pursuit vehicle. Day-for-night train array plates were shot and color-graded, then used as a basis for the environment. Then, the nighttime environments were modeled and designed to give a sense of speed, danger and depth. Each shot was balanced and composited so it appeared as though it was photographed using only available moonlight and artificial practical light sources."

Haley adds, "Orchestrating the collaboration between ILM and Digital Domain to create and bring the photoreal Biskinik bird to life [was also an accomplishment].
We were very pleased with the look, animation and attention to detail of the final shots. With the Biskinik bird being such a big part of Choctaw tradition, and Maya's story, it was important that the bird be completely believable."

The audience's emotional reaction, particularly from fans of the book, to a couple of moments in Episode 8, the season finale, stood out to Andy Scrase, Visual Effects Supervisor on Season 2 of The Wheel of Time for Amazon Prime Video. "One was the death of Hopper. The performance of our Czech Wolfdog, Ka Lupinka, was fantastic! We supplemented that performance with an animated color flash to the eyes and added a pool of blood from Hopper's fatal neck wound forming around the head. I think that little addition emotionally pushed everyone over the edge! It was straightforward VFX work, but it gave such a payoff because it complemented the performance and initiated sadness and horror among those watching."

Not long after Hopper's death in the episode is Mat Cauthon blowing the Horn of Valere. "Again," Scrase notes, "this seemed to get a big reaction with the book fans, but at the other end [of] the emotional scale. Seeing the Heroes of the Horn form and emerge from a localized mystical fog brought a certain degree of euphoria. The fog moment features in the second book [The Great Hunt], and so it felt important to keep that component. We then used influences from the Hindu festival of Holi, fireworks exploding in thick smoke, and some beautiful photography I found of dancers holding poses in clouds of powdered paint to inspire the heroes appearing in our CG fog. The low of Hopper's death almost immediately followed by the excitement of Mat blowing the Horn heightened the emotional reaction from those in the audience. For me, it showed how our work in the industry is not just about flashy effects or seamless additions; it can emotionally contribute to a scene and an audience's reaction."

Christopher Townsend, VFX Supervisor on Season 2 of Marvel Studios' Loki for Disney+, had many loose threads and strands to tie up for the finale of the limited series. "Creating some of the CG environments so they still fit in with the lo-fi, analog visual style of the whole show was challenging, particularly when outside the TVA, with swirling prismatic flares, a disintegrating spaceman-like suit and a massive floating loom weaving threads of time. The unique and original spaghettification and time-slipping effects were designed to fit within the visual motif of time represented as lines, threads and strands. The final tree-like Yggdrasil galactic image, showing the transformed timelines with Loki at its heart, felt like a beautiful and epic moment to end the show."

Constructing a fully-CG, period-accurate city of St. Malo in France for All the Light We Cannot See, ILM digitally transformed the modern town into its 1944 counterpart. The fully CG St. Malo was procedurally built to allow for extensive bombing and collapse of buildings. (Images courtesy of Netflix)

The production and VFX teams on Echo worked with historians and cultural consultants to ensure the accuracy of sensitive historical details. The Creation Pools sequence was rooted in Choctaw lore. Digi-doubles were made for the main Choctaw characters. VFX enhanced the realism. (Images courtesy of Marvel Studios)
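Haley's day-for-night train plates reflect a classic trick: shoot in daylight, then grade toward night. As a generic illustration of that kind of grade (not Echo's actual pipeline; the exposure, tint and desaturation values below are invented), one minimal version on a linear-light image might look like this:

```python
import numpy as np

def day_for_night(rgb: np.ndarray,
                  stops_down: float = 4.0,
                  moonlight_tint=(0.85, 0.95, 1.25),
                  desaturation: float = 0.5) -> np.ndarray:
    """Minimal day-for-night grade on a linear-light RGB image (H, W, 3).

    Classic recipe: pull exposure down several stops, bias toward blue
    "moonlight," and desaturate to mimic low-light vision. All values here
    are placeholders, not the grade actually used on the show.
    """
    out = rgb * (2.0 ** -stops_down)                      # underexpose
    out = out * np.asarray(moonlight_tint)                # cool moonlight bias
    luma = out @ np.array([0.2126, 0.7152, 0.0722])       # Rec. 709 luminance
    out = out * (1 - desaturation) + luma[..., None] * desaturation
    return np.clip(out, 0.0, None)

# Example: grade a synthetic daylight plate.
plate = np.random.default_rng(1).random((4, 4, 3))
night = day_for_night(plate)
print(night.mean() / plate.mean())  # roughly a four-stop reduction overall
```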
For Jay Redd, VFX Supervisor on Season 4 of For All Mankind for Apple TV+, the biggest challenge was "the sheer variety of VFX work for Season 4 and for every season of the show, and keeping our feet firmly planted in real physics and science while bending the rules here and there to serve the storytelling," he says. "While we are an alternate-timeline show, our approach is hard science. We put a major effort into making sure things feel real in space, on Mars and on Earth. A lot of this work comes early in the previs stages, me working with The Third Floor in designing shots and sequences, working on physics, pacing and scale. We work hand-in-hand with our astronaut and technical consultants to keep things as realistic and scientifically accurate as we can, knowing there are times when drama and story call for changing the pace and timings of certain events, like ships docking, landings, etc."

Redd continues, "This year, we had two big challenges: the huge expansion of the Happy Valley Mars Base and the asteroid captures. Happy Valley was a 50-fold expansion from Season 3, so there was a massive amount of work in designing and creating the dozens of new modules, landing pads, roads, terra-forming, vehicles and connecting infrastructure to show the huge growth over roughly a decade. We worked very closely with production design to make sure we integrated our look from Season 3 while also showing the epic scale of growth in Season 4. The DNEG Montreal team, led by VFX Supervisor Mo Sobhy, did an amazing job in hitting a massive amount of detail across the base and multiple landscapes under varying lighting and atmospheric conditions."

The asteroid captures at the beginning and the end posed major challenges for Redd and his team. "Once again, we needed to make things as plausible and scientifically accurate as we could while serving a dramatic and emotional story," he states. "Working with very limited set pieces on small stages, we had dozens of shots that are fully CG, partial live-action and hybrid/mid-shot blends utilizing extensions, digi-doubles, face replacements and big simulations for asteroid pebbles, rock and dust. The designs of the asteroids are based on real existing asteroids, and capture ships and mechanisms come from real-world examples and future-looking potential endeavors. We had conceptual challenges in showing ships firing engines but appearing to be moving backwards, and slowing asteroids to enter Mars orbit. The Ghost VFX team in Copenhagen, Denmark, led by VFX Supervisor Martin Gårdeler, did an incredible job in working with me on a ton of scope and detail in the models and simulations, and very specific lighting cues to show the scale and reality of these scenes." (A back-of-the-envelope sketch of that capture arithmetic appears at the end of this article.)

The uniquely original spaghettification and time-slipping effects for Loki were designed to fit within the visual motif of time represented as lines, threads and strands. The final tree-like Yggdrasil galactic image, showing the transformed timelines with Loki at its heart, dramatically closed out the show. (Image courtesy of Marvel Studios)

Daniel Rauchwerger, Visual Effects Supervisor on Silo for Apple TV+, found that his biggest VFX challenge was the open-space, curved megastructure of the silo, "where in every shot we see, continuous to the plate, a crowd that behaves naturally and actively reacts to the actions of our characters and tensions in the silo," he says.
"We had to make the natural feel of a living, breathing underground city where 10,000 people live, and make sure that we get the organic texture of movement and life combined with the mechanics and inner workings of the silo seamlessly. We are very proud that we managed to bring the character of the silo to life in an invisible way and become something the audience does not think about, and instead accepts the silo and its residents as real, hopefully not thinking about VFX."

Most challenging for the VFX team on Silo was the open-spaced, curved megastructure and creating the natural feel of a living, breathing underground city where 10,000 people dwell. (Image courtesy of Apple TV+)

For Ben Turner, Visual Effects Supervisor for Season 6 of The Crown for Netflix, fidelity to character and story was paramount, and going unnoticed was an achievement. "The story of Princess Diana's death brought with it perhaps the most expectations, and the greatest burden of responsibility, of any subject we tackled in the preceding 52 episodes. It was clear from the beginning that the subject would have to be handled sensitively, and our VFX team was at the heart of achieving this."

Explains Turner, "One of our biggest VFX challenges of the final series came in Episode 3 [Dis-Moi Oui]. A central location to the scenes in this episode was the famous Ritz Hotel, located in Place Vendôme in Paris. The art department built a partial set [for the doorway of The Ritz] on the backlot at Elstree Studios in London. Our team created the rest of the enormous square in 3D, using extensive LiDAR scanning and photography of the real location in Paris. We then tweaked the CG to better match the art department build in order to create a seamless environment. The scenes required a building sense of frantic claustrophobia; we helped to heighten this by adding crowds and additional photographers to the square surrounding the characters and their cars."

"The low of Hopper's death almost immediately followed by the excitement of Mat blowing the Horn [of Valere] just heightened the emotional reaction from those in the audience. For me, it showed how our work in the industry is not just about flashy effects or seamless additions, but it can emotionally contribute to a scene and an audience's reaction."
Andy Scrase, Visual Effects Supervisor, The Wheel of Time

The VFX for Season 2 of The Wheel of Time demonstrated that the work isn't only about flashy effects and seamless additions, but about contributing to the story to evoke an emotional response from the audience. (Image courtesy of Amazon Prime Video)

The VFX for Episode 8, the season finale of The Wheel of Time, was designed to complement performance. (Images courtesy of Amazon Prime Video)

A VFX highlight for Turner occurred in the same episode. "It sees a teenage Prince William shoot his first stag in the Highlands of Scotland. We were tasked with creating the animal fully in CG, together with tweaks to the environment, for a scene in which our work was literally in the crosshairs. We also had to make the CG creature match a real stag used on location for close-up shots of the animal. This required sculpting and grooming the model to have an exact match for the antlers and fur coloring. It was a short sequence but very satisfying, as I don't think people will question it for a moment. These invisible effects sequences typify the VFX work on The Crown.
We help bring the writers' and directors' visions to life but aim to maintain a quality which means that the viewer would have no idea of the enormous amount of work that's gone into our shots."

Working on a high-flying, high-profile project like Masters of the Air for Apple TV+ was technically and creatively challenging for DNEG VFX Supervisor Xavier Bernasconi. "There were months spent on virtual production, featuring air battles with hundreds of planes in a war theatre on a scale never done before. DNEG's VFX work covered thousands of shots taking place over thousands of kilometers, including accurate 1940s 3D landscapes and cloudscapes from Greenland and Algeria to Norway and the South of France, all with hundreds of plane models, liveries and damaged variations performing in extremely complex choreography while being truthful to every historical detail," he explains.

Masters of the Air was the biggest launch ever for Apple TV+, and viewership climbed after the premiere. Bernasconi notes, "This meant that with DNEG's work we were able to engage the viewers and tell a believable and compelling story, while wrangling thousands of people across the globe to deliver incredibly complex work. Historians, air pilots and veterans alike have praised the attention to historical details in the VFX work."

One of the show's biggest achievements was keeping that laser focus on historical detail, which carries over to the depth of the effects. "The show has so many incredibly stunning shots," Bernasconi says. "Every one was crafted with the highest level of detail. If I had to pick [one outstanding shot], I'd say the wide shots with hundreds of planes raging in battle, each crewed with digi-doubles, fighters zooming past at 600 mph breaking the stillness of contrails, and with realistic choreography of the events, are a visual testament to the incredible work that our DNEG team produced."

Netflix's 3 Body Problem, One Piece and Avatar: The Last Airbender, Amazon Prime's Gen V, FX's Shōgun, AMC's The Walking Dead: The Ones Who Live and Disney's Percy Jackson and the Olympians are among the other series eligible to be nominated.
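Redd mentions the conceptual challenge of slowing an asteroid enough for it to enter Mars orbit. For readers curious about the arithmetic the previs team would be reasoning about, here is a back-of-the-envelope sketch using the standard vis-viva relation. The approach speed and target orbit are invented, illustrative numbers; this is not the show's tooling.

```python
import math

# Standard gravitational parameter of Mars (m^3/s^2), a known constant.
MU_MARS = 4.2828e13

def capture_delta_v(v_infinity: float, periapsis: float, apoapsis: float) -> float:
    """Delta-v (m/s) to brake from a hyperbolic approach into an elliptical
    Mars orbit with a single burn at periapsis. Vis-viva in both regimes.
    """
    # Speed at periapsis on the incoming hyperbola: v^2 = v_inf^2 + 2*mu/r.
    v_hyperbolic = math.sqrt(v_infinity**2 + 2 * MU_MARS / periapsis)
    # Speed at periapsis of the target ellipse: v^2 = mu * (2/r - 1/a).
    a = (periapsis + apoapsis) / 2
    v_ellipse = math.sqrt(MU_MARS * (2 / periapsis - 1 / a))
    return v_hyperbolic - v_ellipse

# Illustrative numbers only: a slow 1 km/s approach, braking into a
# 500 km x 5,000 km orbit above Mars' roughly 3,390 km radius.
r_mars = 3.39e6
dv = capture_delta_v(v_infinity=1_000.0,
                     periapsis=r_mars + 500e3,
                     apoapsis=r_mars + 5_000e3)
print(f"capture burn: {dv:.0f} m/s")  # on the order of 900 m/s here
```

Even this toy version makes the dramatic problem concrete: shedding nearly a kilometer per second from a rock of any real mass is an enormous burn, which is why the capture ships read as straining against the physics rather than simply parking.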
STUNTS, SFX, PRACTICAL AND VFX TEAM UP TO IGNITE THE FALL GUY

By CHRIS McGOWAN

Images courtesy of Universal Studios, except where noted.

In The Fall Guy, Colt Seavers (Ryan Gosling), a stunt double for action star Tom Ryder (Aaron Taylor-Johnson), suffers a horrific accident in a 12-story stunt fall in a building atrium. While recovering, Colt withdraws from both the film business and camera operator Jody Moreno (Emily Blunt), with whom he had been having an on-the-set romance. Eighteen months later, Colt re-emerges when Tom's producer Gail Meyer (Hannah Waddingham) invites him to do stunt work on Metalstorm, an epic sci-fi movie directed by Jody, whose career is taking off, and starring Tom. However, when Colt arrives at the shoot, he learns that Tom has gone missing. Gail tasks Colt with finding him, or the studio will pull the plug on Metalstorm and, with it, Jody's ambitions. Meanwhile, Colt is doing all that he can to win back an aggrieved Jody.

Stuntman Colt Seavers (Ryan Gosling) is suspended from a camera crane attached to a pickup driven by director Jody Moreno (Emily Blunt) in The Fall Guy, directed by former stuntman David Leitch.

The plot of Universal's The Fall Guy allows for a smorgasbord of dangerous feats, including a 150-foot fall out of a helicopter, a 225-foot truck jump and a world record of eight car cannon rolls. It is a spectacular ode to stunt work inspired by the '80s TV series of the same name. The frenetic action is directed by former stuntman David Leitch (Atomic Blonde, Deadpool 2, Bullet Train). While The Fall Guy is a tribute to old-school stunts, it is also proof that stunts, practical effects and visual effects can live harmoniously together.

The over-the-top VFX for the Metalstorm movie-within-a-movie included fighting aliens, explosions and spacecraft. Flying through the air, leaping and sword-wielding were made possible by rigs and wires that were later painted out.

"David [Leitch] knows how to use VFX in his films to help the story. We had one rule when it came to the VFX on The Fall Guy: Don't. Touch. The. Stunt. Everything else in the frame was pretty much fair game if it helped tell the story. The movie is stunt driven, a movie about a stuntman directed by a stuntman. The stunts are real. That's the whole heart and soul of the film."
Matt Sloan, Production VFX Supervisor

Production VFX Supervisor Matt Sloan comments, "Working with David [Leitch], he is hugely inclusive and knows how to use VFX in his films to help the story. We had one rule when it came to the VFX on The Fall Guy: Don't. Touch. The. Stunt. Everything else in the frame was pretty much fair game if it helped tell the story. The movie is stunt driven, a movie about a stuntman directed by a stuntman. The stunts are real. That's the whole heart and soul of the film."

"To work on a movie inspired by a cult classic like The Fall Guy was a great experience," says Cinesite VFX Supervisor Jennifer Meire. "The original series was known for its practical stunts and effects, so there was a delicate balance to strike between honoring the nostalgic elements and incorporating the more modern visual effects." She adds, "Director David Leitch's background as a stunt performer and action designer brought a valuable perspective to the creative process, which led to some seriously mind-blowing action sequences.
He and Matt Sloan had a clear vision for how the visual effects could enhance the storytelling and action sequences, and their expectations were focused on achieving a seamless integration of practical and digital effects to create great shots."

The Fall Guy focused on executing difficult old-school stunts and practical effects that were cleaned up and amplified with the help of VFX.

The Fall Guy was written by Drew Pearce and produced by Kelly McCormick, Guymon Casady, Gosling and Leitch. Jonathan Sela was Director of Photography, David Scheunemann helmed Production Design, Dan Oliver was Special Effects Supervisor and Chris McClintock served as Production VFX Producer. Contributing visual effects studios included Framestore (VFX Supervisor Nicolas Chevallier), Rising Sun Pictures (VFX Supervisor Matt Greig), Crafty Apes (VFX Supervisor Jordan Schilling), Opsis (VFX Supervisor Tefft Smith II) and Cinesite (Meire).

Colt Seavers (Ryan Gosling) surfs with sparks flying on a tailgate across the Sydney Harbour Bridge in Sydney, Australia.

"[For the 225-foot truck jump] Logan Holladay jumped that car across the canyon on set. Utterly terrifying to watch. Naturally, there were a ton of cameras on that shot. For The Fall Guy movie [not Metalstorm] we cleaned a bunch of cameras out along with some crew vehicles and some distracting power lines in the BG. The Jody double got such a shock when the car suddenly roared over her that she did not pan her prop camera, so we fixed that as well. It was a perfect example of VFX being used to support a 100% real stunt."
Matt Sloan, Production VFX Supervisor

For his work on The Fall Guy, Chris O'Hara received the first-ever Stunt Designer credit from the Screen Actors Guild and Directors Guild of America. The recognition is well overdue, according to Sloan. "If you think VFX has it bad when it comes to under-recognition, it's nothing compared to the stunt department. These guys work hard. Like really hard. The planning and prep that went into the stunts on The Fall Guy were insane. It's a long and meticulous process, and it should be; that shit is really dangerous. On top of that, the daily discipline required to maintain the levels of fitness and agility for each stunt performer's skill set is incredible."

Trucks on the beach readying for a cannon roll, which set a world record at eight vehicle rolls. (Photo: Eric Laciste)

Sloan continues, "Of course the math part isn't sexy, so you'll never really see that stuff in any of the behind-the-scenes footage. You just see a guy flying through the air, and you think, 'That looks cool.' But in the end, the designer takes responsibility for a series of decisions and calculations that, if incorrect, could result in a member of their team, and usually a friend, being seriously injured or worse. Designing stunts is a high-stress, serious job and deserves any and all recognition that comes its way."

Director David Leitch, left, readies a helicopter stunt with Ryan Gosling onboard for a Metalstorm sequence. (Photo: Eric Laciste)

The Fall Guy had several death-defying stunts. "The 225-foot truck jump was practical," Sloan says. "Logan Holladay jumped that car across the canyon on set. Utterly terrifying to watch. Naturally, there were a ton of cameras on that shot. For The Fall Guy movie [not Metalstorm] we cleaned a bunch of cameras out along with some crew vehicles and some distracting power lines in the BG. The Jody double got such a shock when the car suddenly roared over her that she did not pan her prop camera, so we fixed that as well.
It was a perfect example of VFX being used to support a 100% real stunt."

Because the original Fall Guy TV series was known for its practical stunts and effects, Cinesite VFX Supervisor Jennifer Meire aimed for a balance between nostalgic elements and modern VFX. (Images courtesy of Cinesite and Universal Pictures)

"The high back-fall was, again, real," Sloan explains. "Troy Brown back-flipped from a helicopter and fell 150 feet into an airbag. Watching that happen in front of you is incredibly unnerving. I did not realize I was holding my breath until the thumbs up came from the bag. Pure spectacle. VFX-wise, for safety reasons, we used a static helicopter buck as the platform for him to jump off. We replaced the helicopter buck so it could still be spinning in continuity with the rest of the sequence and added smoke from the gunshot damage. We did not touch the stunt itself!"

"The Metalstorm shots were the antithesis of The Fall Guy. Massive, overwhelming VFX silliness. We threw everything in there. Fighting aliens, explosions, spacecraft, and we even blew up the moon. Why not! A lot of the Metalstorm shots were shot deliberately as Metalstorm shots, but some were not. We had a VFX team getting data with cams and ref on pretty much every shot on the film because we really did not know on a shot-to-shot basis where we would have to step in."
Matt Sloan, Production VFX Supervisor

Stunt double Ben Jenkin in flames on the set of The Fall Guy. Backup with a fire extinguisher is at the ready. (Photo: Eric Laciste)

"Each sequence is a series of different shots, each with its own VFX or stunt considerations," Sloan remarks. "At the beginning of production, I told Chris [O'Hara] that I did not care how many wires/support equipment we needed to paint out as long as it was safe. He was great at minimizing the work we needed to do, but there were instances where we had a huge amount of rigging to remove, especially if it was Ryan and not one of the doubles. We opted for bluescreen for a lot of the interior car work; to shoot that practically would be too costly in terms of time. Time is the most expensive asset you have when you are shooting. David is super aware of this, and shooting that material on stage was just economically sensible. The work is perfectly serviceable to the story, and we could utilize that time-saving for larger, more complex shots/sequences." There was other bluescreen work on and off throughout the movie. Sloan notes, "Our grip team had a bunch of fly-in screens that we could run onto the set if something came up that required them. In saying that, this movie used less bluescreen than any other I've been involved in. [Yet it] also used more roto than any I've been involved in!"

The stunt structure with wires for Ryan Gosling's 12-story fall inside a building atrium.

"When it came to post-production [on the Metalstorm shots] and we started fleshing out the shots, it just kept going bigger and bigger. I always loved the telling silence from Nicolas Chevallier, our [VFX] Supe at Framestore, on our calls each time we would increase the scope in these shots. 'More explosions!', 'More aliens!', 'More lasers!' and, of course, 'The moon should explode on frame 67.' They took it in stride and produced some very fun and technically spectacular shots."
Matt Sloan, Production VFX Supervisor

The use of visual effects was extensive for the movie within the movie. "The Metalstorm shots were the antithesis of The Fall Guy," Sloan explains. "Massive, overwhelming VFX silliness. We threw everything in there.
Fighting aliens, explosions, spacecraft, and we even blew up the moon. Why not! A lot of the Metalstorm shots were shot deliberately as Metalstorm shots, but some were not. We had a VFX team getting data with cams and ref on pretty much every shot on the film because we really did not know on a shot-to-shot basis where we would have to step in. When it came to post-production and we started fleshing out the shots, it just kept going bigger and bigger." He continues, "I always loved the telling silence from Nicolas Chevallier, our [VFX] Supe at Framestore, on our calls each time we would increase the scope in these shots. 'More explosions!', 'More aliens!', 'More lasers!' and, of course, 'The moon should explode on frame 67.' They took it in stride and produced some very fun and technically spectacular shots."

One of the most dangerous stunts in the film is a 150-foot fall out of a helicopter. Here, Gosling hangs from a helicopter buck to capture the shot, though it was stuntman Troy Brown who back-flipped from a helicopter and fell 150 feet into an airbag.

For Sloan, the bin truck chase was the most challenging sequence involving VFX. "It was an incredibly complex, fast-moving sequence that made us utilize pretty much every trick we had," he says. "We had multiple driving rigs, vehicles, doubles, locations and, of course, massive stunts. Face replacements, CG vehicles, background changes, adding sparks, huge rig removals. Added to that, we were shooting multiple cameras, so you never knew which angle was going to land in the cut. Some were great, and some were absolute monsters of roto and prep. There was a huge amount of work in that sequence, and the Framestore team did an amazing job."

The third act, with the helicopter work, was a close second for most challenging, Sloan says. "We utilized [a] helicopter buck for the majority of the sequence, and Rising Sun Pictures absolutely rose to the challenge, adding rotors, downwash, rotor flicker, smoke, explosions and backgrounds. Matt Greig and his team in Australia did some stellar work there."

Face down in the sand for a Metalstorm sequence, Gosling gets flames added to his back courtesy of VFX, for safety's sake. (Images courtesy of Cinesite and Universal Pictures)

Cinesite delivered over 350 visual effects shots for The Fall Guy, according to Meire. She notes, "Fire was an important theme for much of our work, including a fire-breathing shot, a sequence where Colt is set on fire in a movie-set stunt and a dramatic boat-chase explosion. Throughout our fire-related VFX, we utilized advanced computational fluid dynamics solvers and combustion models to simulate the turbulence of the flow of gasoline or the relevant gaseous fuel. In addition, we generated realistic turbulence and heat release."

Although the boat chase next to the Sydney Harbour Bridge was largely captured on location and in-camera, Cinesite contributed some almost entirely digital shots. Meire says, "The pontoon explosion was one of these. Colt speeds into a pontoon in the middle of the Parramatta River. The massive ensuing impact was created with a full CG boat, river, detonation, debris and smoke."

Another sequence, in a nightclub, involved the creation of far more visible visual effects. Colt's drink has been spiked, and he is attacked by some bad guys. A magical look was added to the subsequent fight, inspired by anime, with sparkles, lens flares and color effects to show the effect of the psychedelic substances Colt has unknowingly consumed.
"These effects were created by shifting the RGB channels blended with time warps. The visuals were made to be as fluid and organic as possible, with the addition of chromatic aberration, lens flares and the kind of optical effects we often add in typical visual effects," Meire comments. (A toy sketch of that channel-shifting treatment appears at the end of this article.)

Colt (Ryan Gosling) in a Vietnam War scene in the Metalstorm movie being made in The Fall Guy. Cinesite's contribution included the addition of multiple bullet hits, explosions, impact sparks and various composites. (Image courtesy of Cinesite and Universal Pictures)

"If it had been possible for the film to have been completed entirely using SFX rather than digital, I think that would have been David Leitch's preference. But there were many instances where SFX could take you most of the way there, but they needed help with the final 10%-20%. There were also instances where safety was a factor; for example, the shot where we added flames to Ryan Gosling. They could not have achieved that shot seamlessly with the actual actor, close up, any other way."
Jennifer Meire, Visual Effects Supervisor, Cinesite

Meire notes, "Other work that Cinesite contributed included the addition of multiple bullet hits, explosions, impact sparks and various composites. With regards to the action and stunt scenes, it's important to emphasize that most of what the audience sees was generally captured in-camera, the result of fantastic stunts and SFX."

Meire observes, "If it had been possible for the film to have been completed entirely using SFX rather than digital, I think that would have been David Leitch's preference. But there were many instances where SFX could take you most of the way there, but they needed help with the final 10%-20%. There were also instances where safety was a factor; for example, the shot where we added flames to Ryan Gosling. They could not have achieved that shot seamlessly with the actual actor, close up, any other way. Wherever possible, SFX were used, but digital effects are [now] an essential part of every film, action or otherwise."

The Fall Guy is a spectacular ode to stunt work inspired by the '80s TV series of the same name. On set from left: Ryan Gosling, Aaron Taylor-Johnson, stuntmen Ben Jenkin, Logan Holladay and Justin Eaton, and director David Leitch.

Sloan points out the hard work that so often goes into invisible effects. "There is one shot I'd like to call out for special mention. It was, on paper, a super simple VFX shot: a stitch between two plates. Due to circumstances beyond our control, we could not get the camera on the B-side to match the angle of the A-side. Not even close. To the point that we added a cut initially because we had almost zero confidence that we could make it work. It had huge parallax issues, and the lighting was noticeably different. It took months of brute force, meticulous roto, warping, relighting and CG replacements of parts of the set and props, but it worked. It's the definition of invisible VFX. But if you're reading this and worked on that shot, know that the work was appreciated. VFX is a form of magic. Expensive magic, sure, but still magic."
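To make Meire's channel-shifting description concrete, here is a schematic NumPy sketch, not Cinesite's actual compositing setup: each output channel is pulled from a slightly different frame time (a crude time warp) and rolled spatially (a crude chromatic aberration) before the channels are recombined. The offsets are arbitrary, invented values.

```python
import numpy as np

def psychedelic_grade(frames: np.ndarray, t: int,
                      time_offsets=(0, 2, 4),
                      pixel_shifts=((0, 0), (1, 3), (-2, -1))) -> np.ndarray:
    """Toy channel-shift effect over a clip.

    `frames` is a float array of shape (T, H, W, 3) in 0..1. Each output
    channel is sampled from a slightly later frame (time warp) and shifted
    spatially (chromatic-aberration-like fringing), then recombined.
    """
    T = frames.shape[0]
    out = np.empty_like(frames[t])
    for c, (dt, (dy, dx)) in enumerate(zip(time_offsets, pixel_shifts)):
        src = frames[min(t + dt, T - 1), :, :, c]           # time-warped channel
        out[:, :, c] = np.roll(src, (dy, dx), axis=(0, 1))  # spatial shift
    return out

# Example on synthetic footage: 10 frames of random noise stand in for plates.
rng = np.random.default_rng(7)
clip = rng.random((10, 64, 64, 3)).astype(np.float32)
frame = psychedelic_grade(clip, t=3)
print(frame.shape, float(frame.min()), float(frame.max()))
```

In a real composite the per-channel offsets would be animated and blended with the untreated plate so the effect pulses with the fight choreography rather than sitting at a constant strength.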
FORGING A PARTNERSHIP AND AN ADULT ANIMATED SERIES THAT'S INVINCIBLE

By TREVOR HOGG

Images courtesy of Skybound Entertainment and Prime Video.

Subverting genres is something that American comic book writer, screenwriter and producer Robert Kirkman has done so well, whether it be zombies in The Walking Dead or superheroes in Invincible. The latter, which has been turned into an animated series for Prime Video, revolves around teenager Mark Grayson coming into his supernatural powers and having to deal with the revelation that his extraterrestrial father was sent not to protect but to conquer Earth. The second season, consisting of eight episodes, was divided in half, with Part 1 released in November 2023 and Part 2 in March and April 2024.

"To a certain extent, I admit it is somewhat a detriment to my career that everything I do is different, because a fanbase cannot go, 'I like A from that guy, so I'm going to like all of this other stuff,'" states Kirkman, Co-Creator, Co-Showrunner and Executive Producer. "But I would be bored if I was doing things that were so similar to each other."

Exploring the poses and colors for Angstrom Levy before and after his accident.

"We have the benefit of the series being completed and a 144-issue roadmap. We're able to say, 'This story is more important because it comes into play in seven different issues.' We can spend a little time on that scene and put more foreshadowing to certain things that are going to come. I didn't have that when I was writing the comic book series. There were far-reaching plans, but for the most part it was put together as I went along. It has turned the show into a second draft."
Robert Kirkman, Co-Creator, Co-Showrunner, Executive Producer

Invincible is not the typical production in terms of size and scope. "I've never worked on a project that is this big," remarks Marge Dean, Head of Animation for Skybound Entertainment. "It's 45- to 50-minute episodes as opposed to a 22-minute episode, so it's twice the amount of content. But also, the nature of Invincible is that in every episode there is a big fight scene, lots of destruction, a high body count; even the main characters are constantly in flux in how they look because they get beat up or shot. What that translates into is an awful lot of pencil mileage. We have to draw all of those things." A major lesson was learned by Dean. "With the scope and the way the story is, it's more like three times the [average] episode."

Outsourcing animation to Korea has been going on since the 1980s, so the industry pipeline has been refined. "We have 70 people working on the show, and that's not counting the folks in Korea," Dean explains. "You figure out the whole season and decide how each episode fits in there. Then you have those written, and that's given to the team. The directors are responsible for keeping the whole thing intact and on track. Once it gets to Korea, they're replicating what we created. We have done a refined [version] and worked out a map for them of what we want the main animation to be, and this is in the form of our storyboards, animatics and model packs. All of the information that they need is there. Then we have conversations with them. When they send us the show, if there are things that do not fit the continuity or story, or are not what we instructed them to do, we call retakes and make them do it again.
When we make a mistake and do a creative retake, they're given extra compensation."

Experimenting with various facial expressions for Debbie Grayson.

What has changed is the acceptance and popularity of adult animation around the world, which has made shows like Invincible possible. "In North America and Europe, people believed that animation was for comedy or kids," Dean notes. "Then anime started working its way out of Japan, and we have Millennials and Generation Zs who have grown up with 24/7 animation on Cartoon Network and Nickelodeon and have a broader understanding of animation as a medium, not a genre. Then Adult Swim launched Toonami, which introduced anime to the general public, definitely in the U.S. and other parts of the world, and that blew the lid off, because in Japan they totally get that you can do animation for all of the different forms of audiences that you have. It's okay to make stuff for people who have a specific interest. There can be mature themes and stakes, and the characters don't have to be cute little kids but can be grungier older people or people starting their professional life."

A mouth chart for Debbie Grayson to illustrate how various spoken letters are to be animated.

"Quite a lot [of CG animation] was used in Season 1. We still use an element of CG, such as to build our backgrounds that are sent to the overseas studio, but, ultimately, the background is drawn into the animation. The overseas studio usually has a CG team, and they may decide on their own to turn some vehicles into CG and blend them into the scene. We allow that to happen. That's the extent of the CG."
Marge Dean, Head of Animation, Skybound Entertainment

Given the strengths and needs of animation, the series is not an exact replica of the comic book source material. "Visually, everything has to be animated. We're a hand-drawn animated program," Kirkman states. "With comics, everybody is drawing those panels one or two times, so it's easy to put crazy detail and little flourishes on everything. But there is a streamlining process for animation. Then, with the story, most of the changes come from hindsight. We have the benefit of the series being completed and a 144-issue roadmap. We're able to say, 'This story is more important because it comes into play in seven different issues.' We can spend a little time on that scene and put more foreshadowing to certain things that are going to come. I didn't have that when I was writing the comic book series. There were far-reaching plans, but for the most part it was put together as I went along. It has turned the show into a second draft."

Great attention to detail was paid to Mark Grayson, down to his eyes.

The sensibilities of Kirkman were different when he began writing Invincible. "There are a few off-handed jokes in the old issues where I was like, 'That doesn't play so great these days,'" Kirkman notes. "We were able to broom those things out and make everything more modern and updated. There are things that are antiquated as well, like technology, because the comic book series is 20 years old. Another thing is I feel that I've improved as a writer, and so there's a lot of, 'I don't like how this dialogue goes.' Or, 'I can make this sequence flow better.' There is also this notion of trying to top myself, so you'll notice that a lot of the big memorable moments in the Invincible television series are enhanced.
An action sequence will go on longer, or some sort of gut-wrenching sequence will have an element added to it that makes it more off-putting or nerve-racking, or the tension is heightened more. Some of that comes from the fact that the scenes in the comics weren't moving and didn't have sound, and the show has that, so there is more latitude you can take in the individual scenes to amp up the emotion, scares and intensity."

Determining the look of a bruise on the face of Omni-Man.

Lessons were learned from turning the comic Super Dinosaur into an animated series for Teletoon in 2018. "That show was so unsuccessful people might not be aware that it even exists, but we did 26 episodes, a half hour, and it was CGI," Kirkman recalls. "It takes a lot to build CGI assets because every single character is constructed, and there are a lot of things that go into that. A lot of the budget is adding new characters into the show. I knew that Invincible could never be limited in how many characters I could introduce per episode, because this is such a vast world, and there are sprawling aspects that would be hampered tremendously by a 3D pipeline. There is a rich history of superhero programs having a 2D animation look, so we're playing off of that." There is CG animation in Invincible, but the amount was purposely reduced for Season 2. "Quite a lot was used in Season 1," Dean states. "We still use an element of CG, such as to build our backgrounds that are sent to the overseas studio, but, ultimately, the background is drawn into the animation. The overseas studio usually has a CG team, and they may decide on their own to turn some vehicles into CG and blend them into the scene. We allow that to happen. That's the extent of the CG."

"That whole episode [Episode 208] was difficult, if you think about every single trip into the multiverse as almost a completely different show. They're going to completely different environments, characters, color palettes, and it's a lot to ask of a team. I don't remember the actual number of dimensions, but that was monumental. It is great to have a team that is willing to push things and take those risks and put the extra work into accomplishing those kinds of things."
Robert Kirkman, Co-Creator, Co-Showrunner, Executive Producer

One of the characters who has a unique set of powers is Atom Eve, as she can manipulate matter and energy at the sub-atomic level.

The personalities of the Immortal and Bulletproof are what separate them, as they essentially have the same abilities in being strong and durable.

The heart of the storytelling is dealing with the trials and tribulations that the various characters experience, in particular Mark Grayson, who is trying to balance his personal life with being a superhero.

Violence is depicted within reason. "I want to stay on a narrow path," Kirkman states. "If you go too far into different things, it can distract from your story. The cool thing is that we have never had any content restrictions placed upon us by Prime Video. Each season needs to be bigger than the previous one, not only in action but also in emotion, character arcs and the cast interactions with each other. We have to make sure we're not repeating ourselves and hitting those same notes. We're shifting the storytelling and trying to make sure that different things are happening, so when we do huge moments of graphic violence, as we do frequently, we need to have in mind how to maintain that sense of momentum through the show.
If someone is watching this show in Season 5 and goes, 'They're not reaching the heights of Season 3,' we've failed. We're always trying to have something new to spring on the viewer to shock and excite them and keep them engaged. In Season 1, the violence that Mark Grayson experiences is at the hands of his father, so you want to show that. In Season 2, it looks at the violence that Mark is capable of. The fact that Angstrom Levy is receiving it isn't as important as the emotions that Mark is feeling having done it."

Character designs were simplified to streamline the animation process.

"There is also this notion of trying to top myself, so you'll notice that a lot of the big memorable moments in the Invincible television series are enhanced. An action sequence will go on longer, or some sort of gut-wrenching sequence will have an element added to it that makes it more off-putting or nerve-racking, or the tension is heightened more. Some of that comes from the fact that the scenes in the comics weren't moving and didn't have sound, and the show has that, so there is more latitude you can take in the individual scenes to amp up the emotion, scares and intensity."
Robert Kirkman, Co-Creator, Co-Showrunner, Executive Producer

Co-Creator/Co-Showrunner/Executive Producer Robert Kirkman has a preference for 2D animation, which is the style of choice for Invincible.

Superpowers do not reflect personality traits. "We're trying to make these characters feel real and three-dimensional; they have hopes, dreams and aspirations," Kirkman explains. "The superpowers serve the story and keep things moving. If you're really paying attention, you'll notice that a lot of characters have a similar powerset. Every now and then you have an Atom Eve, Rex Splode, Duplicate or Robot who have a unique powerset. But Bulletproof, Invincible, Immortal and Omni-Man, these guys have the same powers! What differentiates them is their personalities, and that's what makes them interesting." A particular character has benefited from the animated series. "Debbie Grayson [Mark's mother and wife of Omni-Man] was a major part of the comics and had a lot of big storylines, but there were long periods where the storylines were dealing with superpower, not human-level stuff, so she would recede into the background. There is a tremendous opportunity in the show to find ways to keep her present. Then you also know that Sandra Oh is there to embody whatever we do. If we try to put some emotion into something, she takes that nugget and expands it. It's a good example of an actor's performance taking over and guiding the character at the writing stage."

The animation is outsourced to Korea, which is a trend that has been happening in the animation industry since the 1980s.

One of the reasons that 2D animation was chosen over CG is that Kirkman did not want to be limited by the number of characters he could introduce.

Serving as the main antagonist for Season 2 is Angstrom Levy, who has the ability to open portals to multiple dimensions.

A character who has received more story time in the animated series is Debbie Grayson.

The growing popularity of adult animation, due in part to the streaming services, means that Invincible does not have to pull punches when depicting violence.

Invincible delves into the multiverse, which runs the danger of being the answer for every narrative problem, thereby negating any sense of peril.
"The sweet spot of multiverse storytelling is witnessing other aspects of what could have been or seeing what happened in other dimensions and wringing as much emotion out of it as possible," Kirkman notes. "We're also doing our best to keep it as simple as possible. There is one character who accesses the multiverse in Angstrom Levy, and everything that we experience with the multiverse is through that character." A riddle to solve was the portal sequence in Episode 208. "That whole episode was difficult, if you think about every single trip into the multiverse as almost a completely different show. They're going to completely different environments, characters, color palettes, and it's a lot to ask of a team. I don't remember the actual number of dimensions, but that was monumental. It is great to have a team that is willing to push things and take those risks and put the extra work into accomplishing those kinds of things. For most shows it would be like, 'You only get six of those.' Oftentimes, there are no corners to cut, and we have to knuckle down and do it."
-
WWW.VFXVOICE.COM

THE FINELY-CRAFTED LOOK OF RIPLEY IS A BLACK-AND-WHITE AFFAIR

By OLIVER WEBB

Images courtesy of Netflix.

Based on Patricia Highsmith's 1955 novel The Talented Mr. Ripley, and stunningly told in black and white, Steven Zaillian's eight-part Netflix limited series Ripley stars Andrew Scott as a grifter living in New York during the 1960s who is hired by a wealthy man to bring his vagabond son home from Italy.

Andrew Scott as Tom Ripley in Episode 101 of Ripley. Creator/director Steven Zaillian envisioned from day one that the show was going to be in black and white.

Zaillian envisioned from day one that the show was going to be in black and white, so it was essential to develop a unique but efficient workflow for the internal VFX team and VFX vendors. "Early on, we even considered a workflow in which vendors would deliver two versions of every single shot submission, one in color and one in black and white," VFX Producer Joseph Servodio notes. "However, since we had over 2,000 shots, you can only imagine how much media management that would create for the vendors, VFX team and Editorial. Ultimately, what made the most sense was to work on the shots in color; then our team presented them to Steve, predominantly within cut context in our black-and-white look. Occasionally during reviews, we would flip back to the color look just as a quality check. Since black and white tended to be more forgiving, we would sometimes be able to see flaws in the color that we otherwise didn't catch in the black-and-white viewing."

"[W]hat made the most sense was to work on the shots in color; then our team presented them to Steve [director Steve Zaillian], predominantly within cut context in our black-and-white look. Occasionally during reviews, we would flip back to the color look just as a quality check. Since black and white tended to be more forgiving, we would sometimes be able to see flaws in the color that we otherwise didn't catch in the black-and-white viewing."
Joseph Servodio, VFX Producer
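For readers curious what a review toggle like the one Servodio describes might look like in practice, here is a minimal sketch, assuming float RGB frames held in NumPy arrays. The Rec. 709 luma weights below are a generic stand-in for the show's actual black-and-white look, which would have lived in a production LUT; none of these names come from the production.

import numpy as np

# Generic Rec. 709 luma weights -- a stand-in for the show's real
# black-and-white look, which would live in a production LUT.
REC709 = np.array([0.2126, 0.7152, 0.0722])

def to_review_bw(frame_rgb: np.ndarray) -> np.ndarray:
    """Collapse a float RGB frame (H, W, 3) to a three-channel B&W image."""
    luma = frame_rgb @ REC709              # weighted sum per pixel
    return np.repeat(luma[..., None], 3, axis=-1)

def review_frame(frame_rgb: np.ndarray, qc_in_color: bool = False) -> np.ndarray:
    """Present shots in B&W by default; flip to the color master for QC."""
    return frame_rgb if qc_in_color else to_review_bw(frame_rgb)

The point of such a toggle is that the color media stays the single source of truth; the black-and-white presentation is only a view applied at review time.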
VFX Supervisor John Bowers joined the show during the post-production stages. "I was brought in by Executive Producer Ben Rosenblatt, who I have worked with for 11 years now," Bowers says. "The show needed someone who could collaborate hands-on in New York with our director, Steven Zaillian, to take sequences to final, and someone who could translate his notes about overall look and feel into concrete visual terms."

There were 2,146 visual effects shots in Ripley, most of which consisted of train windows, bus windows and apartment windows, as well as environment extension work.

The VFX team worked on shots in color, then presented them to Zaillian in cut context in a black-and-white look. Above: In Ripley, Tom (Andrew Scott) is sent to Europe to bring back a wealthy man's wayward son.

EDI, led by VFX Supervisor Gaia Bussolati, was in charge of the part of Ripley set in northern Italy. Since everything was based on previous footage or existing locations, it was important for EDI to find documentation about Sanremo in the '60s and the paintings of late 16th- and early 17th-century Italian artist Caravaggio.

Since Bowers was joining a show that was already in progress, he was able to watch rough assembled cuts. "The visual effects when I joined were in varying states of completion," Bowers notes. "Our director had actually been working on the show for years at that point, having started writing Ripley in 2019. My job was to come in and ask him as many questions as I could to understand his vision, both in general terms and for specific sequences and shots. We also always knew that Steve's vision was to present the show in black and white. We did VFX work in color throughout, but editorial was cutting in black and white, so we were always designing shots with an eye towards how clearly elements and compositions would appear in their final form. It was coming in and understanding the conversations that had already happened between Steve and his editors and figuring out from that point how to carry it forward to final."

"We also always knew that Steve's vision was to present the show in black and white. We did VFX work in color throughout, but editorial was cutting in black and white, so we were always designing shots with an eye towards how clearly elements and compositions would appear in their final form."
John Bowers, VFX Supervisor

Bowers focused on period photography from the time, as well as black-and-white films from the '60s. "Steve would frequently make reference to films from the period, for example La Dolce Vita, which was an important cultural touchstone for him," Bowers explains. "A recurring visual motif throughout the show is Caravaggio paintings, which create visual drama through their use of light and dark. We took inspiration from that as well, but honestly, for shots where we were designing camerawork from scratch, like our big CG sequence in Episode 3, the most important reference point was Robert Elswit's cinematography from the rest of the show."

Andrew Scott, who plays Tom Ripley, is in frame for almost every shot of Ripley. Shots were always designed with an eye towards how clearly elements and compositions would appear in their final black-and-white form.

Fellini's lauded Italian film La Dolce Vita (1960) was an important cultural touchstone for Zaillian. From left: Andrew Scott as Tom Ripley, Johnny Flynn as Dickie Greenleaf and Eliot Sumner as Freddie Miles in Episode 102. (Photo: Stefano Cristiano Montesi)

There were 2,146 visual effects shots in total, most of which consisted of train windows, bus windows and apartment windows, as well as environment extension work. "Episode 3 had 400 shots in it, with as many shots in one 16-minute sequence as existed in some entire episodes," Bowers says. "We had teams working all over the world: EDI in Italy, Wētā FX in New Zealand and Redefine in Canada, India and Europe. In order to coordinate a team on that scale, you really have to have a consistent vision and clear communication. We always wanted to make sure that we were prioritizing and making progress on our most challenging shots, especially in Episode 3. With different vendors working on different things, we often had to work in batches to present the director with coherent sequences that were nearly complete before soliciting his feedback. It was risky, but for some sequences, it was the only path to success."

The use of light and dark in Caravaggio's paintings influenced the look of the show, as did Robert Elswit's cinematography, the key reference for shots whose camerawork was designed from scratch. (Photo: Philippe Antonello)

Chris White was Wētā FX's VFX Supervisor on the show. "We were given a rough cut of the show when we began work," White explains. "You could quickly tell it was a beautiful show with solid compositions and lighting. The visual effects would also need to support this aesthetic with attention to the subtle details of highlights, shadows, environment and composition.
Creative reference is always an instrumental part of the process and gives us a clear target. Because the series had so many beautiful shots, we went through the season cut and requested footage of other shots to reference. We used the series' cinematography as our first inspiration. I studied chiaroscuro at university many years ago, so I drew on my memories of those studies, along with references from the old masters and noir photography."

"Steve would frequently make reference to films from the period, for example La Dolce Vita, which was an important cultural touchstone for him. A recurring visual motif throughout the show is Caravaggio paintings, which create visual drama through their use of light and darkness. We took inspiration from that as well, but honestly, for shots where we were designing camerawork from scratch, like our big CG sequence in Episode 3, the most important reference point was Robert Elswit's cinematography from the rest of the show."
John Bowers, VFX Supervisor

Wētā FX's work for the show included two types of shots: the boat action sequence, where each shot had a unique action, and a series of shots in a static location on the water. "For the action shots, our animation team crafted dynamic boats that could simulate motion through the water," White says. "For the static shots, we invested in an automatic setup. Once the look of the sequence was established, we could run multiple shots through a setup and tweak each shot to taste. The setup was crafted as a primary lighting and comp scene, establishing the base look. The digi-double shots were the most challenging, but they also brought the most satisfaction. They needed to be spot-on likenesses of the actors, with the complexities of underwater clothing simulation, bubbles, hair motion and lighting, all while preserving detail. The way light exits skin underwater differs from in the air, so particular attention had to be paid to underwater digi-double rendering."

The overall look of the series and its classic black-and-white aesthetic was one of the highlights of the show for Wētā FX VFX Supervisor Chris White. (Photo: Philippe Antonello)

For Bowers, the 16-minute boat sequence in Episode 3 was the most challenging sequence in the entirety of the show. "It's the most important story point of the show. Everything in the first two episodes leads up to it, and everything that happens in the remaining five episodes happens because of it," Bowers explains. "If people know just one thing about The Talented Mr. Ripley, it's usually this murder scene. The success or failure of the show as an artistic endeavor really depended on the realism and the artistry of this one scene; that was the challenge for us, and those were the stakes. On the technical side, every shot of either Tom or Dickie underwater was a digi-double, and those were quite close-up angles. Andrew Scott, who plays Tom Ripley, is in frame for almost every shot of Ripley, so the audience would be intimately familiar with his face and expressions by this point in the show. To then hand off to a digi-double? In a close-up of his face, underwater? We had to get that exactly right."

A main concern of production was related to the locations. All the required elements couldn't be found in any one location, so EDI focused on designing locations by assembling different components from different sources, like a collage, with great attention to detail.
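White's description of pushing many static water shots through one established lighting and comp scene suggests a template-plus-overrides pattern. The sketch below is speculative, written in Python only to illustrate the idea; every parameter and shot name is invented and stands in for whatever controls the real setup exposed.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CompSetup:
    # All parameters are hypothetical stand-ins for the controls a
    # real lighting/comp template might expose.
    sun_elevation: float = 35.0   # degrees
    swell_height: float = 0.4     # metres
    haze_density: float = 0.12
    grade: str = "show_bw_look"

# One approved base look, established on a hero shot...
BASE = CompSetup()

# ...then each static shot runs through the same setup with small
# per-shot tweaks "to taste", rather than being built from scratch.
SHOTS = {
    "boat_static_010": replace(BASE, swell_height=0.6),
    "boat_static_020": replace(BASE, haze_density=0.2),
    "boat_static_030": BASE,
}

for name, setup in SHOTS.items():
    print(name, setup)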
Continues Bowers, "Wētā FX did fantastic work throughout that sequence. The underwater effects they created to make the environment feel literally immersive were really outstanding work. There was one full-CG shot in particular near the end of the scene where the camera is pointing straight down into the water, and Dickie's body is sinking down into the depths. That was definitely the most challenging visual effects shot to create in the entire series. It needed to feel still and quiet and sort of melancholy at the end, but we still wanted to have the feeling of the camera being 15 feet underwater and really present. The first version of that shot that we presented for creative approval was version 223, and the final comp was version 557, so it went through quite a lot of iteration."

EDI was responsible for all of the Caravaggio shots, working on around a hundred, spread over six of the eight episodes of the series.

A recurring visual motif throughout the show is Caravaggio's paintings, which create visual drama through their strong contrast of light and dark.

Gaia Bussolati served as EDI's VFX Supervisor on the show. "We started work on this project in mid-2021. The inspection and pre-production began in June. We were involved from the very beginning," Bussolati says. "Thanks to our well-established relationship with American supervisors and studios, the production's head of post-production visited us and asked us what kind of structure we had, and if we had supervisors who could stay on set and follow their director, considering that there would be other main suppliers involved in the project. We were then told that we would take care of the part set in northern Italy."

"We always wanted to make sure that we were prioritizing and making progress on our most challenging shots, especially in Episode 3. With different vendors working on different things, we often had to work in batches to present the director with coherent sequences that were nearly complete before soliciting his feedback. It was risky, but for some sequences, it was the only path to success."
John Bowers, VFX Supervisor

Wētā FX's work for the show included two types of shots: the boat action sequence, where each shot had a unique action, and a series of shots in a static location on the water. (Image courtesy of Wētā FX and Netflix)

When Bussolati and EDI came onboard in the pre-production stages, they received the scripts for all the episodes, which they then read in order to understand the mood of the series. "The main concern from the very beginning seemed to be related to the locations. Even the simplest, Atrani, which was the best preserved, did not seem to have all the elements that could reveal the director's vision. So we realized that the work ahead would be designing locations by assembling different components from different sources. Like a collage. The director's vision was very clear; he knew what he wanted, and this was translated into work with great attention to the smallest detail."

Since everything was based on previous footage or existing locations, it was important for Bussolati and EDI to find documentation about Sanremo in the '60s and Caravaggio's paintings. "Our aim was to create the most realistic output for the story and the context, so any reference was based on real references. This meant a lot of historical and iconographic research on our end," says Bussolati. EDI was responsible for all of the Caravaggio shots and worked on around a hundred, spread over six of the series' eight episodes.
"We were also responsible for the St. Louis of the French Chapel in Rome, The Nativity with St. Francis and St. Lawrence in Palermo and David with the Head of Goliath at the Villa Borghese in Rome," Bussolati adds.

For the action shots, Wētā FX's animation team crafted dynamic boats that could simulate motion through the water. For the static boat shots, Wētā invested in an automatic setup. Once the look of the sequence was established, they could run multiple shots through the setup and tweak each shot to taste. (Image courtesy of Wētā FX and Netflix)

If audiences remember one thing about The Talented Mr. Ripley, it's the murder scene. According to VFX Supervisor John Bowers, the success or failure of the show as an artistic endeavor depended on the realism and artistry of this one scene.

"We were given a rough cut of the show when we began work. You could quickly tell it was a beautiful show with solid compositions and lighting. The visual effects would also need to support this aesthetic with attention to the subtle details of highlights, shadows, environment and composition. Creative reference is always an instrumental part of the process and gives us a clear target. Because the series had so many beautiful shots, we went through the season cut and requested footage of other shots to reference. We used the series' cinematography as our first inspiration. I studied chiaroscuro at university many years ago, so I drew on my memories of those studies, along with references from the old masters and noir photography."
Chris White, VFX Supervisor, Wētā FX

"In Episode 3," Bussolati continues, "we worked on recreating Sanremo, since the original scene was shot in Anzio. We also worked on other paintings, including Picasso's, which can be seen both in Atrani and Venice. Sanremo's boat scenes were shot with blue skies and sunshine, but the narrative was intended to start with a slightly cloudy sky and end with an almost stormy one, to match in continuity with another vendor's following sequence. This implied very careful work on each and every nuance. A couple of other interesting scenes were related to Caravaggio's paintings. The one in the St. Louis of the French Chapel in Rome was shot in Naples with a greenscreen, and then we rebuilt the chapel based on photographic surveys and database images. Then The Nativity with St. Francis and St. Lawrence in Palermo was quite challenging, since the original painting was stolen in 1969; it is on the world list of the 10 most important stolen masterpieces. According to the story, the original was still there, so we had to reproduce the brushstroke, the cracks, the texture of the canvas, and this was done thanks to an accurate analysis of documents and historical references."

Dickie's body sinking down into the depths was the most challenging visual effects shot to create in the series, according to VFX Supervisor John Bowers. Underwater effects were created to make the environment feel literally immersive. (Image courtesy of Wētā FX and Netflix)

Bowers enjoyed collaborating with the vendor-side VFX supervisors at all the various companies that were working on the show. "Working with Gaia at EDI, Chris White and Francois Sugny at Wētā FX, Tehmina Beg and Eric Sibley at Crafty Apes and Teddy Wirtz at Powerhouse: just a great group of creative problem-solvers who were able to take sometimes vague, or seemingly contradictory, direction and work together to come up with solutions. That was, for me, the great pleasure of the show."
Every shot of either Tom or Dickie underwater was a digi-double. The digi-doubles needed to be spot-on likenesses of the actors, with the complexities of underwater clothing simulation, bubbles, hair motion and lighting, all while preserving detail.

For White, the overall look of the series and its classic black-and-white aesthetic was one of the many joys of the show. "As someone who used to shoot black-and-white photography, I found the most enjoyable part of the show to be crafting images that reflected that aesthetic," White says.

For Bussolati, the joy was in the depth of detail and the quality of the result. "We really loved the passion for the detail and the story, and we loved working on historical documents about Italy, in particular about the landscape, the art and buildings such as the churches. We're really glad that this series is getting the great worldwide success it deserves."
-
WWW.VFXVOICE.COM

SUMMONING CREATIVE VFX TO HEIGHTEN REALITY IN THE SYMPATHIZER

By TREVOR HOGG

Images courtesy of HBO.

Given the dark, satirical nature of the Pulitzer Prize-winning novel by Viet Thanh Nguyen, in which a police captain in Saigon who is a communist spy comes to America as a refugee at the end of the Vietnam War, Park Chan-wook was an ideal choice as co-showrunner, director and writer to create a seven-episode adaptation of The Sympathizer for HBO.

Split screens and table markers were critical in enabling the Captain to interact with his four dinner companions, all played by Robert Downey Jr.

Park has a reputation for redefining genres, whether it is the vengeful Oldboy or the psychological thriller The Handmaiden. "Creatively, there is so much going on with him," notes Visual Effects Supervisor Chad Wanstreet (Dollface, S.W.A.T.). "Director Park storyboards almost everything, and for what isn't storyboarded he still comes in with a specific shot list every day. There is very little deviation from that. Every once in a while he'll cut one and add something different."

"After we see Bon's legs running down an airport runway while carrying his dead wife, the scene transitions into the Captain's car driving on an American highway. At this moment, Bon's legs are still visible as a reflection on the car body. This is a scene transition that brilliantly connects two timelines/two countries through the visual connection of running objects. The idea to show Bon's legs on the car was entirely from Chad. It's not often that the visual effects team spontaneously brings new ideas, as it is hard enough to fulfill the requests of the director. However, Chad has done so for this scene and many others."
Park Chan-wook, Director

The Congressman portrayed by Robert Downey Jr. was partially inspired by Ronald Reagan.

Visual effects are seen as a way of elevating the dramatic or comedic aspect of a scene. "There is a lot of punctuation, and he uses visual effects in that way to either set up a scene or environment," Wanstreet notes. "We had lots of flies and rats, different small creatures that normally are throwaways, but he uses them in a manner to heighten the reality of whatever scene you're in. We had one shot where the Captain and the General are in the latrine talking back and forth, and flies are everywhere. The General is irritated; a fly comes in, lands on his cheek and then flies off. There are tons of subtleties and little things that director Park does where most people wouldn't think, 'Let's do a visual effects fly for comedy relief in this tense situation.' But he is constantly doing these things that are contradictory to what is being established."

Atmospherics, such as enhancing cigarette smoke, were important. "There were moments where we match cut one person blowing out smoke to another blowing out smoke," Wanstreet reveals. "There is one where Sofia Mori is in the Vietnamese restaurant and the Captain is outside talking to Bon. We transition from Sofia blowing out smoke to the Captain, in a match cut, blowing out the exact same level of smoke; these are specific details where director Park wants the color and amount [of smoke] to match back and forth. Later, when Sofia Mori and the Captain are lying in bed together and smoking together, he wanted to have the exact same cigarette smoke across the two of them, so there's enhancement of cigarette smoke in there to make everything exact between the two."

The rotating Happy Burger sign was achieved practically.

Scene transitions are meticulously planned, with a particular one standing out.
"After we see Bon's legs running down an airport runway while carrying his dead wife, the scene transitions into the Captain's car driving on an American highway," Park explains. "At this moment, Bon's legs are still visible as a reflection on the car body. This is a scene transition that brilliantly connects two timelines/two countries through the visual connection of running objects. The idea to show Bon's legs on the car was entirely from Chad. It's not often that the visual effects team spontaneously brings new ideas, as it is hard enough to fulfill the requests of the director. However, Chad has done so for this scene and many others."

"There is a lot of punctuation, and [director Park] uses visual effects in that way to either set up a scene or environment. We had lots of flies and rats, different small creatures that normally are throwaways, but he uses them in a manner to heighten the reality of whatever scene you're in."
Chad Wanstreet, Visual Effects Supervisor

Many skies were digitally augmented to match the photographic vision of director Park Chan-wook.

Digital assistance was provided for some of the scene transitions, such as when the Captain is crossing names off of the evacuation list. "We definitely helped out with that," Wanstreet notes. "As you know, director Park is specific as far as lines being in parallel or how they converge. Composition is important to him. We took the footage and skewed it, realigned it and did some stuff to make it lay out better on the page. There are also elements of that ruler that are visual effects. When the Captain pulls it across, that wipe, we're carrying that and rotoing his hand, so as it comes in, the background is there, but then the hand brings the rest of it in. That was one of the things he and I worked back and forth on to get exactly the way he wanted, because it was funky in the beginning."

Fireworks proved to be complicated to execute. "This is one of those [situations] where a lot of people go into a scene like that [and say], 'We're going to put fireworks in the sky,'" Wanstreet remarks. "But director Park wanted the texture of the fireworks reflecting all throughout that scene, which is visually amazing, but, as you can imagine, from a budgetary standpoint we're like, 'Okay.' We had big practical lights on the night. Those were great for interactive light on the people and in some of the environments, but then they made nasty reflections on some of the metallics and the wet-down. We had to paint all of that out and then add fireworks reflections back in, so it was more organic and broken up. But at least we had something to start with. We took whatever was going on in the panels in the sky, then matched the colors of our fireworks to those, so we had a match between the practical interactive light and the visual effects fireworks."

A personal favorite of Visual Effects Supervisor Chad Wanstreet was the theater explosion.
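As a rough illustration of the last step Wanstreet describes above, matching the color of the CG fireworks reflections to the practical panel light, here is a hedged sketch that simply transfers the mean color of a sampled plate region onto an element. The function and array names are hypothetical, and a production comp would do far more than apply a single gain.

import numpy as np

def match_to_practical(element: np.ndarray, plate_region: np.ndarray) -> np.ndarray:
    # element: float RGB fireworks element, shape (H, W, 3)
    # plate_region: pixels sampled from the practical interactive light
    eps = 1e-6
    # Per-channel gain that moves the element's mean color onto the
    # mean color of the practical panels, so CG and practical agree.
    gain = plate_region.reshape(-1, 3).mean(axis=0) / (
        element.reshape(-1, 3).mean(axis=0) + eps
    )
    return element * gain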
Having Robert Downey Jr. play the roles of the priest, the auteur, Claude (a CIA agent), the professor and the Congressman harkens back to when Peter Sellers portrayed three characters in Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. "The team that came on to do the makeup had done Perry Mason previously, and we had an excellent prosthetic artist as well," Wanstreet explains. "They were upfront when they thought there would be an issue, and we went in and touched stuff up, but for the most part what you see is what they did. Robert in the steakhouse and lounge were thoroughly planned-out scenes, because getting him into prosthetics took three hours for each one of the characters, depending on which character it was. In order to shoot that, we wound up breaking it into multiple days; there were two Roberts on one day and two Roberts on the next day. He would do all of his lines, and we had markers all over the table for where the cameras were, so we could replay how we shot one for when the next one came in. When Robert went away to do the prosthetics for another character, we would film backgrounds, like crosses of waiters from the different camera angles, so when we did the split screens, there was always movement in the background."

"We had one shot where the Captain and the General are in the latrine talking back and forth, and flies are everywhere. The General is irritated; a fly comes in, lands on his cheek and then flies off. There are tons of subtleties and little things that director Park does where most people wouldn't think, 'Let's do a visual effects fly for comedy relief in this tense situation.' But he is constantly doing these things that are contradictory to what is being established."
Chad Wanstreet, Visual Effects Supervisor

Buildings had to be altered and removed in order to recreate Saigon in 1975.

When the time came to shoot the lounge scene, face replacements were necessary, which led to some unique challenges. "When we were scanning Robert, it came down from director Park that he wanted the Congressman to lick whipped cream off this escort, with a face replacement," Wanstreet recalls. "Because we're doing FACS and all sorts of expressions, I go, 'Alright, Robert, I have one more expression for you. I need you to give me a face like you're going to lick whipped cream off of a naked woman's breasts.' He looked at me as if to say, 'What?' I said, 'Trust me. You're going to lick whipped cream off of this woman.' Robert sits there for a second and goes, 'Alright. I've got it.' He makes this crazy expression, we scanned him, and he goes, 'I've got a different one.' We did that one as well. Then Robert comes out, and I go, 'You're probably one of the most scanned actors in the world. Have you ever had to pose for licking whipped cream off of a naked woman?' He goes, 'No. You've got the first, Chad.'"

A personal first for Wanstreet involved a certain cactus. "Because director Park has this specific visual style, there were a lot of little things that popped up all over the place," Wanstreet explains. "After Xuande gets blown up in Episode 105, he brought a cactus, and there was a line about a prickly dick. The cactus that we had on set wasn't cactus enough and didn't have the right needles; that became a thing. It was like, 'Wait a minute. What?' We had to add more needles to this cactus, which is something I've never had to do before." Another fun shot featured an alligator in a swimming pool. "We had a remote-control alligator head on the day to give an eyeline for Robert. We would rehearse and take it away. Robert did wonderfully, as far as acting as if there's an alligator in the water when there clearly is not. Those were complicated shots. Just getting the right number of bubbles and the cavitation correct."

Modern elements like electronic billboards had to be removed to make the imagery period accurate.
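The locked-off split screens that the table markers made possible can be sketched as a soft blend between two plates. This toy version, with invented parameter names, assumes perfectly repeatable framing and float RGB plates of identical size; it shows the idea, not the production comp.

import numpy as np

def split_screen(plate_a: np.ndarray, plate_b: np.ndarray,
                 seam: float = 0.5, softness: float = 24.0) -> np.ndarray:
    # plate_a, plate_b: float RGB plates of identical shape (H, W, 3)
    # seam: horizontal position of the join, as a fraction of width
    # softness: width of the blend ramp in pixels
    h, w = plate_a.shape[:2]
    x = np.arange(w, dtype=np.float32)
    # 0 -> plate_a, 1 -> plate_b, with a soft ramp around the seam so
    # the join hides in the background action filmed between setups.
    matte = np.clip((x - seam * w) / max(softness, 1.0) + 0.5, 0.0, 1.0)
    matte = matte[None, :, None]   # broadcast over rows and channels
    return plate_a * (1.0 - matte) + plate_b * matte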
"Robert [Downey Jr.] in the steakhouse and lounge were thoroughly planned-out scenes, because getting him into prosthetics took three hours for each one of the characters, depending on which character it was. We wound up breaking it into multiple days; there were two Roberts on one day and two Roberts on the next day. He would do all of his lines, and we had markers all over the table for where the cameras were, so we could replay how we shot one for when the next one came in."
Chad Wanstreet, Visual Effects Supervisor

Locations were the primary focus. "We filmed in Los Angeles, and when we went to Thailand it was all location shooting," Wanstreet states. "We did one day of bluescreen, and that's where we have Xuande going through the air. We had found a little road in Phuket that looked more French and had similar tones to what you might see in Vietnam. We spent about a month and a half going in and around the Bangkok area, then ended up in Phuket, Phang Nga, Nakhon Si Thammarat and Ban Phai. We were moving all around the country for bespoke one-off spots that we would augment with visual effects in post by adding monuments and things that were particular to that time. In Episode 101, when they do the farewell tour and drive through downtown Saigon, specific landmarks were wanted. There's the opera house, the ARVN monument and the palace. I had gone to Vietnam and taken some photography, but all of those buildings had changed a ton between 1975 and the present day."

Bluescreens were pivotal in being able to achieve the desired environmental scope.

Around 1,200 visual effects shots were created for the miniseries, with the main vendors being Ingenuity Studios and Barnstorm VFX, along with Zoic Studios, Yannix, Incessant Rain Studios, BOT VFX, Mr. Wolf, Van Dyke VFX and Pidantic VFX. "I work differently than a lot of people in that I have all of my shots pre-tracked on my show," Wanstreet says. "Yannix did all the tracking and touched almost everything in every episode. We handed that off to whoever the vendor was, and Yannix also did paintwork. There are a lot of shots where one person is painting or tracking, and then somebody else is adding maybe the set extension. There were also elements like the flares that we were passing back and forth, because that's something that showed up a lot and was a thematic element in Episode 101, and we see it again in Episode 107, tying together those two anchor points in those episodes. The helicopter was also a shared asset that moved around with different vendors; the same thing with the jet."

For safety and art direction reasons, explosions close to the actors were achieved digitally.

Elaborate re-timed split screens were incorporated into a couple of the 4,000-frame shots. "They don't sound gnarly, but one of them in particular was when we filmed in the interrogation room where the Watchman was, and there were metal grates all over the place," Wanstreet recalls. "When you start to do a re-time in a room like that, it warbles and distorts like crazy. We cleaned up all that stuff over 4,000 frames, and there was some stuff on the floor that we needed to remove. There were reflections of the camera guy and Kim Ji-yong [Cinematographer] in the glass; over 4,000 frames, you're removing him over Robert's face. We're painting all of that out and trying to keep all of the metal grates exactly in line."
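The "warble" Wanstreet mentions is characteristic of retimes that synthesize in-between frames. The naive frame-blending retime sketched below, which assumes a float frame stack and is illustrative only, shows where such artifacts come from: blended neighbors shimmer on fine structure such as metal grates, which is exactly what then has to be cleaned up by hand.

import numpy as np

def retime(frames: np.ndarray, speed: float) -> np.ndarray:
    # frames: float array of shape (N, H, W, 3); speed: 0.5 = half speed
    n = frames.shape[0]
    out_len = int(round((n - 1) / speed)) + 1
    out = np.empty((out_len,) + frames.shape[1:], dtype=frames.dtype)
    for i in range(out_len):
        t = min(i * speed, n - 1)   # fractional source position
        lo = int(np.floor(t))
        hi = min(lo + 1, n - 1)
        a = t - lo                  # blend weight between neighbors
        # Linear blends like this are what shimmer ("warble") on fine
        # detail; optical-flow retimes do better but still distort.
        out[i] = (1.0 - a) * frames[lo] + a * frames[hi]
    return out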
Lens aberrations were important. "Obviously, if you're at nighttime and an explosion goes off, as a DP you have two decisions: you can try to expose for the fire or for your talent. We were constantly playing with getting this perfect blend where we would see the fire detail for a moment and our actors would go out and everything is black. You would get flares and then pull back so you're seeing the talent again, and the explosion goes out and they're brighter."

It was impossible to wet down the entire tarmac, so the responsibility was given to the visual effects team.

The actual filming had to be done with a fake airplane on which only the tail of the C-130 was reproduced.

The most complex visual effects sequence was the tarmac escape, which occurs in Episode 101.

Insects are an essential part of the visual language of director Park Chan-wook.

Getting the proper reflections for the fireworks was a complex task.

A remote-control alligator head was replaced by a CG version created by Ingenuity Studios.

As for the experience of collaborating with Park Chan-wook and Robert Downey Jr., Wanstreet remarks, "Working with director Park and Team Downey, who have a clear vision, you could imagine that there is this potential for a clash. That's not what happened at all. Team Downey loved director Park and gave him a lot of latitude to do what he wanted to do. The two of them worked hand-in-hand." When it comes to naming his favorite moments, Wanstreet answers, "I love the explosion of the theater and the finale as well. The ocean at the end is pretty and beautiful. There is a lot of stuff that is great in Episode 107. The moment when the Captain releases the alligator into the auteur's pool is wonderful work."

Practical explosions were incorporated into digital ones.

Chad Wanstreet and the entire Sympathizer team are saddened by the passing of friend and colleague Dan Lombardo. Wanstreet remarks, "Dan was an exceptional friend and a bright beacon on the VFX team for the show. His loving disposition and kind heart always made even the hardest days enjoyable, and his dedication to the team was infectious. Dan's hilarious stories about his many years in the VFX industry will be missed, and we will all proudly cherish working with him."

Watch dramatic VFX breakdown reels on the making of The Sympathizer from Ingenuity Studios and Chad Wanstreet. Click here: https://vimeo.com/949982071/ff40298c9a?share=copy. And here: https://vimeo.com/946362668
-
WWW.VFXVOICE.COM

CHECKING INTO HAZBIN HOTEL TO CHECK OUT THE ANIMATION

By TREVOR HOGG

Images courtesy of Prime Video and A24.

Collaborating with a group of freelance animators and aided by financial support provided through Patreon, American animator, writer, director and producer Vivienne Medrano released a pilot episode of Hazbin Hotel via her VivziePop YouTube channel. The show revolves around Charlie Morningstar, the Princess of Hell, who sets up a rehabilitation establishment for demons to avoid the yearly extermination imposed by Heaven. Contributing to the 109 million views over the past four years was A24, the independent entertainment company responsible for the Oscar-winning Everything Everywhere All at Once, which in turn got Amazon MGM Studios interested in producing a new pilot and seven more episodes to stream on Prime Video.

A challenging aspect of getting Vaggie to emote properly is the X placed over the left eye.

"Hazbin Hotel is different in the sense that it came from a proof of concept that went viral and also had the benefit of a company like A24 that is risk-taking. It's definitely not easy to put into a box that exists in the adult animation world, and I'm excited because the adult animation world is starting to bloom into something different, and we're seeing more diversity in the shows."
Vivienne Medrano, Creator, Executive Producer, Showrunner and Writer, Hazbin Hotel

"I had spent most of my career as a freelancer, so I haven't experienced the nitty-gritty of productions," states Vivienne Medrano, Creator, Executive Producer, Showrunner and Writer of Hazbin Hotel. "I've mostly done visual development and things that never got to see the light of day. Hazbin Hotel is different in the sense that it came from a proof of concept that went viral and also had the benefit of a company like A24 that is risk-taking. It's definitely not easy to put into a box that exists in the adult animation world, and I'm excited because the adult animation world is starting to bloom into something different, and we're seeing more diversity in the shows." Putting together a new pilot was complicated. "We only had eight episodes and a tight time to tell the story. I didn't want to waste an episode on redoing the original pilot. We had to get the same information across, re-establish the world and characters, but also tell a new story with new characters and villains," Medrano says.

Catering to Creator/Showrunner Vivienne Medrano's attachment to zanier, sillier characters is Niffty.

How Bruce Timm draws women, like Harley Quinn and Poison Ivy, the exaggerated shape language of Tim Burton, cartoonist Jhonen Vasquez and classic Warner Bros. Animation and Disney were major influences. "It was fun to explore how to find an identity for the show and the style," Medrano remarks. "The other aspect of my style is that it's detailed, with a lot of stripes; that's a Tim Burtonism! I like specific outfits and accessories. It's catered to my sensibilities." Angel Dust was simplified, while a different idea for Charlie was introduced, especially with her braids. "Because it's a hand-drawn show, I wanted to take off some of the superfluous details that were in the [original] pilot. However, I wanted to maintain those iconic, striking designs. Also, there were changes I had always wanted to make from the [original] pilot, but we were too far in to do that." Black and red dominate the color palette. "The biggest challenge of the show for my art director, Sam Miller, is that it's a lot of red characters on numerous red backgrounds. It was a challenge to find the right balance and the tones of the reds that we use. I'm trying to lean more into contrast in the second season," Medrano explains.
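One crude way to reason about the red-on-red problem Medrano raises is to compare the luma of a character red against its background red, since colors that read as different hues can sit almost on top of each other in brightness. The sketch below uses standard Rec. 709 weights and made-up color picks purely for illustration; it is not part of the show's pipeline.

import numpy as np

def luma(rgb) -> float:
    # Rec. 709 luma: how bright a color reads once hue is set aside.
    return float(np.dot(rgb, [0.2126, 0.7152, 0.0722]))

character_red  = np.array([0.72, 0.10, 0.12])   # hypothetical picks
background_red = np.array([0.45, 0.05, 0.08])

separation = abs(luma(character_red) - luma(background_red))
print(f"luma separation: {separation:.3f}")     # bigger = easier read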
Given that Vox is, in essence, a TV screen, side profiles of his head were avoided.

Concept art for a billboard promoting politeness.

A Morningstar family portrait painting.

"I would red-line the change that I want; that's usually the easiest, because they can take that and finalize it. I do that with character designs and every so often with backgrounds. For storyboards, I also direct, so I do thumbnails. I use a Wacom Cintiq tablet that you can draw directly onto. I have a bigger one that I've worked on since college and a smaller version that I use often. You can have it on your lap and work. It's much more mobile."
Vivienne Medrano, Creator, Executive Producer, Showrunner and Writer, Hazbin Hotel

Revisions are articulated by drawing over the work of artists. "I would red-line the change that I want; that's usually the easiest, because they can take that and finalize it," Medrano states. "I do that with character designs and every so often with backgrounds. For storyboards, I also direct, so I do thumbnails. I use a Wacom Cintiq tablet that you can draw directly onto. I have a bigger one that I've worked on since college and a smaller version that I use often. You can have it on your lap and work. It's much more mobile." Adult animation has provided more of an opportunity to blend 2D and 3D techniques. "The combination of 3D and 2D is cool. A great example is Arcane, where they could have the 3D style, but it still has an artistic layering of texture. However, all of the effects are done in 2D, like fire and smoke. 2D effects are the coolest looking. For Hazbin Hotel, we don't utilize a ton of 3D because it's outside the process and pipeline of being a 2D show. On my other series, Helluva Boss, we utilized 3D a couple of times, and it's fun to stylize the 3D to match the 2D world."

Logo designed for 666 News.

Collaborating on her third project with Medrano is Skye Henwood. "It's an across-the-ocean story of an online relationship working out," states Henwood, Animation Director at SpindleHorse Toons. "I needed a job, she had a job opening, and I joined her studio. We ended up in a studio chatroom, and it was like an instant click. Now, it's an Amazon show, and everybody knows the name. It's crazy!" SpindleHorse Toons animated Episode 108, while Princess Bento was responsible for the rest of the series. "It's all done digitally. Even the storyboards. We use Wacoms and work in Toon Boom Harmony. I've made like a million guides on how to do a basic shot. The stylization is easier in a way, because the policy that Viv, Sam Miller and I have is, as long as it looks cool or appealing, that's it. It doesn't necessarily have to look like the model sheet. You can stretch Charlie, make her eyes big, and if it's cute or funny enough, it's getting in. There is a lot of creative freedom. This show is for artists by artists," Henwood states.

A poster of Charlie with her showman father Lucifer performing in the background.

A color key for the Pentious airship.

A color concept for the rebuild of the Hazbin Hotel.

"It's all done digitally. Even the storyboards. We use Wacoms and work in Toon Boom Harmony. I've made like a million guides on how to do a basic shot. The stylization is easier in a way, because the policy that Viv, Sam Miller and I have is, as long as it looks cool or appealing, that's it. It doesn't necessarily have to look like the model sheet.
You can stretch Charlie, make her eyes big, and if it's cute or funny enough, it's getting in. There is a lot of creative freedom. This show is for artists by artists."
Skye Henwood, Animation Director, SpindleHorse Toons

Storyboards and animatics are an essential part of the creative process. "The strength of the boards is the strength of the series," Medrano notes. "I'm excited, because for the second season we have some incredible artists joining and have more action, so there's more of a uniqueness to the technique of the boards. The poses of the characters are figured out in the storyboards. We don't usually invent many new ones from the boards. Thankfully, for Season 1 we had some fantastic board artists, in particular for the song sequences. Then my studio [SpindleHorse Toons] was able to do a little of the animation plus the boards. I'm glad that we got to do that, because it heightened those moments." The songs are vital during the writing stage. "The challenge for Season 1 was that I wanted the songs to be the length that they needed to be. Because we have a tight 22-minute run time, we had to work closely with the songwriters and figure out what part of the script was going to be music and what genre of music. What I love about musicals is that they go hand-in-hand with animation. The characters get to be really expressive, and the songs get to be bombastic and out there," Medrano observes.

A pentagram city map of Hell.

"We have the music from the start, so we don't have to go back and change it," Henwood remarks. "Dealing with a musical in animation is hard, because musicals are beautiful and so out there and over-the-top. We have to get that 'Wow!' feeling, and there are reasons for why things are done in live-action and in animation. We have to compromise and find how we can fit the wonders of a musical into 2D TV animation. We find a way to make the reason why the character sings believable. It's important for it to feel seamless when starting those musical moments." Henwood did the first shots animated for Season 1. "I wanted to make it simple enough that any studio could do it, yet also capture Viv's emotions and style of drawing, which is the hardest thing, because we draw every frame ourselves," he says.

Deciding upon the color palette for the Gates of Heaven.

One of the many key props that had to be designed was Valentino's favorite gun.

Maintaining the proper poses is more difficult for some characters compared to others. "There is a spider demon, and he has four arms, so when you're having someone cross their arms, what are his bottom arms doing?" Henwood comments. "You have to come up with that. If he is angry or sassy, maybe they're on his hips. You have to make sure that looks appealing. We don't want to hide his arms behind him. It's thought out. My favorite trivia is Vox. We have a rule that he can't be seen sideways, because he's a flat-screen TV. You'll notice that even if Vox should turn around to leave a shot, his head will stay forward. The one who has changed the most and for the better is Angel Dust. In older iterations, he was more monstrous and had a poison skull on his chest. But as Viv was realizing this character, Angel Dust became this cutesy guy. My favorite part of all of the design changes is that, right before we started animating, Viv let me have my opinion on how to simplify them." Jeremy Jordan has made clever vocal choices for Lucifer Morningstar. "Jeremy gets into that booth and does the silliest little takes we have ever heard. My favorite thing he does is what we call the Lucifer wheeze.
In Episode 105, Jeremy is coughing the name Charlie. It's really fun, because that wasn't in the script," Henwood reveals.

A lot of detail went into not only the characters but also the background elements.

A significant visual challenge was having red characters placed against red backgrounds.

"The strength of the boards is the strength of the series. I'm excited, because for the second season we have some incredible artists joining and have more action, so there's more of a uniqueness to the technique of the boards."
Vivienne Medrano, Creator, Executive Producer, Showrunner and Writer, Hazbin Hotel

The world-building for Hazbin Hotel has been a natural evolution. "Something that I've started working on is a bible of all the information that has been established about this world and the rules that need to be maintained," Medrano says. "It's challenging, because I have my other series, Helluva Boss, which is a different side of this expansive world. We have to make sure that there is a consistency, because our audience will notice and care." Adding life to the world is the background action. "It's easier for the animators and less distracting to have a silhouetted character in the background; that is something stylistically I do a lot," Medrano adds. One of the changes from the original pilot is having an entirely different voice cast. "I have the characters figured out and need to find the right voice. I have specific voices in mind. With Hazbin Hotel, we had the pilot, where we established some voices that the audience was attached to. It was important to maintain a sense of cohesion between the two casts. When it came to casting for the new series, it was a re-audition for the original cast and new people. The actors we ended up going with were good at maintaining that original sound and vibe of the characters that I wanted and was attached to, but they also brought the musicality and singing talent that was needed for the show. Once we had the final cast, everything locked together, and it was exactly what I had envisioned. But I am picky about casting."

A pastel color palette was devised for Heaven.

A number of characters are exaggerated representations of Medrano. "Angel Dust is nothing like me, but I put a lot of personal trauma and experiences into him," Medrano states. "Charlie is a lot like me in the sense that she is a driven and determined person in a world that can be hard and naysaying. There is part of me in Vaggie when it comes to being more practical and reserved; she's more of a worrier and feisty. Like a lot of creators, I try to put in a little piece of the real me, but it also came from things that I feel add to the story or character tropes that I enjoy. Niffty and Lucifer are catered to my sensibilities, because I like the zanier, sillier characters as well."
Showcasing the most action and characters was Episode 108.

It was important to seamlessly integrate the musical numbers into the narrative.

There is no shortage of characters and action in the battle sequence that takes place in Episode 108.

Jeremy Jordan makes clever vocal choices for Lucifer Morningstar, such as having him wheeze.

The character designs were influenced by Bruce Timm's portrayal of women and the exaggerated shapes of Tim Burton.

A film that greatly influenced the animation of Hazbin Hotel was Cats Don't Dance.

A character that needed to be constantly checked by Animation Director Skye Henwood was Vox.

Niffty stars in A24's first venture into adult animation with Hazbin Hotel, which started off as a proof of concept on YouTube.

"There is a shot where the camera is following characters through the battle, and it was comprised of two different shots that seamlessly come together as one," Medrano remarks. "We start with Angel Dust and Cherri Bomb, who jumps up and throws a bomb. Then the bomb explosion transitions to another character. It was a challenge in the sense that the hookup had to be specific. There are a lot of characters. There is a giant shield that had effects on it, so we had to make sure that the background was tracking and the camera was working with it while the effect was going on. That was a technical shot, but it turned out fantastic."