Befores & Afters
A brand new visual effects and animation publication from Ian Failes.
Recent Updates
  • How the motion capture worked on Better Man
    beforesandafters.com
    Go behind the scenes of the on-set performance capture for the film.
    Today on the befores & afters podcast, we've got a fun conversation with Wētā FX visual effects supervisor Luke Millar and actor Jonno Davies about the film Better Man. Jonno, of course, played Robbie Williams in the film, and for the most part he wore different kinds of motion capture gear on set, with his performance then translated into a Robbie ape by Wētā FX.
    There's already a fun Q&A with Jonno and Luke at befores & afters, but we go much further in this chat, including breaking down the "Rock DJ" Regent Street scene, the final "My Way" sequence, and also a more frenetic and intimate moment when Robbie is in his living room with the family. I really enjoyed getting the actor perspective on mocap here, and hearing how Luke and Jonno interacted in terms of VFX.
    The post How the motion capture worked on Better Man appeared first on befores & afters.
  • The river and 1.2 petabytes of disk space
    beforesandafters.com
    How the emotional Raka scene from Kingdom of the Planet of the Apes was made. An excerpt from befores & afters magazine.
    Raka, Noa and Mae are confronted by Proximus' muscle at a river crossing in Wes Ball's Kingdom of the Planet of the Apes. The FX simulations in that sequence would require 1.2 petabytes (1.2 billion megabytes) of disk space. To build up that scene, animation started with the original performance capture, some of which was carried out in partial sets with flowing water.
    "Our work was a lot of back and forth with what shots retained plate elements for the water and what would ultimately be CG," relates animation supervisor Paul Story. "We would be trying to keep as much of what the plate performance was there."
    "A great example of that is a close-up shot where Raka pushes Mae up out of the water," advises VFX supervisor Erik Winquist. "That's the performance take. Special effects provided us with a small river tank that had this current flowing that they could control. Freya and Peter are there in that river flow. We were able to use Freya, paint out Peter, and replace him with CG, of course, but the water that was actually pushing up against Peter's chest in his wetsuit that he was wearing is in the movie. It was great to actually take advantage of the plate water."
    Although Wētā FX had gone through a major R&D phase for Avatar: The Way of Water to develop its Loki state machine for coupled fluid simulations, the river sequence in Kingdom presented some different challenges, in particular that the water was of a nasty, sediment-filled, dirt-and-debris type, with much surface foam. "For the purposes of efficiency and flexibility, we were leaning less on the state machine approach and bringing back in some of the older ways of working on water sims," notes Winquist. "One thing we'd do is run a primary sim at low-res first that would give Paul and his team a really low-res mesh that they could at least animate to."
    "So for example, they'd translate Peter as Raka, who was sitting in an office chair getting pulled around on a mocap stage, and work that into a very low-res sim." In the meantime, the FX team would direct the flow and get a rough version of that in front of Wes. This essentially involved art directing the current and camera, specifies Winquist. "Does the camera dip under for a moment and come back over? What does that mean for having to play water sheeting down the lens? Once we had that art-directed river in low-res, animation could go off and start animating apes against that current. Then also our FX team could go in and start looking at up-resing that into a much higher resolution sim."
    The next steps were a back and forth of animation animated to a low-res mesh. The benefit is that the animation done using the low-res mesh matches well to, and integrates with, the subsequent high-res fluid simulations, although tweaking is always required. Once these steps occurred, the creatures team would take the flow fields of the simulation to affect the hair of the apes.
    "For that we're using Loki for the hair of the creatures and water to all interact," says Winquist. "Then we take the creature bakes, bring them back into the sim, and then FX has to go in and do a super high-resolution, thin-film simulation against the hairs, because now we need to make sure that we're taking into account volume preservation of the water. If they jump out of the water, we also need to show that the water is now starting to drain out of their hair."
    To help with simulating Raka's complex fur in its wet state, visual effects supervisor Stephen Unterfranz pitched the idea of placing the digital character in the water, running a simulation and seeing what would happen to the fur, and then sculpting a bespoke groom just for use when he is in the water.
    It helped with establishing a characteristic parted-hair look on the fur of Raka's arm and body, owing to the pressure of the rushing water.
    issue #27: Kingdom of the Planet of the Apes
    Interestingly, Raka's fall into the raging rapids of the river was something Wētā FX had to revisit a couple of times, owing to a change in the line the character delivers. It was originally filmed on the Sydney backlot set with the scripted line of "The work continues," said to Noa. "Wes came to the realization that that was the wrong thing for that character to say as his last words," relates Winquist. "The switch to 'Together. Strong.' essentially echoes the 'Apes. Together. Strong.' line from Caesar. It lands so much harder as the last thing we're going to hear from this mentor character."
    With only around a month before delivery, Macon re-delivered the line by simply recording himself on video on his iPhone. "The audio from that is actually what's in the movie," reveals Winquist. "It really came down to an animator having to look at what we were seeing from that iPhone footage and put that in there. It was a 'My God, this is kind of devastating' moment."
  • Mocap actor and VFX supervisor: the Better Man Q&A
    beforesandafters.com
    Wētā FX visual effects supervisor Luke Millar and motion capture performer Jonno Davies discuss bringing Robbie Williams to life.
    Wētā FX is well-known for its close collaborations on projects with motion capture performers: think Andy Serkis, Toby Kebbell and a long line of other actors who don motion capture suits and HMCs for a role, with the VFX translating that performance into some kind of CG creature.
    It's a task Wētā FX carried out once again for Michael Gracey's Better Man, this time taking the original on-set performance of Jonno Davies through to an ape version of Robbie Williams.
    Here, visual effects supervisor Luke Millar and Jonno Davies tell befores & afters what that partnership was like, the toughest scenes from on-set and in post, and how they crafted the more intimate scenes in the movie.
    b&a: Luke, certainly Wētā FX has such a vast experience in performance capture, but what kinds of conversations did you have early on about the best way on set to bring Robbie to life as a digital ape?
    Luke Millar: Before we started shooting principal photography, I arranged to sit down with Michael Gracey and all of the film's department heads to provide an overview of what would be involved in the VFX process and how it might influence everyone else's job. Wētā FX has very robust systems that we can set up pretty much anywhere and capture performance data on set, on location, even during live concerts! However, most of the other departments on this film had never worked with it before. From hair and makeup applying dots to Jonno's face each day, to costumes providing proxy mesh clothing that Jonno could interact with but that we could still capture through. We don't work in isolation, and so having the collaboration of all involved really helped with bringing Robbie to life!
    b&a: Jonno, how did you actually come on board Better Man?
    And, what was your first memory of seeing what you had done on set be translated to a digital Robbie ape, even if something very early?
    Jonno Davies: Kate Mulvany, who plays my Mum in the film, recommended me to Michael. We'd worked together on the Amazon series Hunters a few years back; no actual scenes together, but we just got on really well. Production were struggling to find their Rob, and she showed them my Instagram, which had some videos from when I played Alexander DeLarge in A Clockwork Orange on stage in New York. It was a vastly different interpretation to the Kubrick film, sort of physical theatre meets gladiatorial peacocking, which thankfully piqued Michael's interest. From there, MG pitched the film and showed me some previs, including "Feel," "Let Me Entertain You" and "My Way," and even from those basic renderings I knew that he and Wētā FX were onto something special. I then auditioned over Zoom with him and co-writer Simon Gleeson over the next few days, basically workshopping ideas, and thankfully the role ended up being mine!
    Cut to about a year later when I first saw a digital ape standing in my place and I was blown away. It was actually quite an emotional moment. I think part of me always worried that I'd just be used as a reference and my performance would get lost in the wonderment of it all, but seeing myself in that chimp, my expressions, ad-libs etc., and then combining that with such artistry from Wētā, it was very special.
    b&a: Luke, what was the Wētā FX footprint for capturing Jonno on set? In terms of cameras, mocap gear, other measurements/survey etc?
    Luke Millar: VFX had, by some margin, the largest department on set! We had a VFX team of six for capturing LIDAR, set reference, wrangler notes and HDRIs. Five witness camera operators, three PAs, and then a team of eight purely to manage the mocap work.
    I would shoot GoPro videos from tech scouts and then brief the team on the scenes so that they could rig sets/locations the day before to ensure we had full coverage of the space. The system has a 3m x 3m scaling volume and then around 4-5 carts that we would have to bring everywhere with us! It's funny because all we are doing is collecting data at that point. By the time shooting wrapped, everyone was celebrating finishing the show and we were only just starting!
    b&a: Jonno, what was your prep process like for this? Can you break down how you got into a Robbie mindset in terms of consulting reference and then actual conversations with Robbie and Michael?
    Jonno Davies: I was brought on really last minute. I think I landed in Melbourne about eight days before we started shooting, so you can imagine that week was crazy: from rehearsals to choreography, as well as tech prep with Wētā FX like facial scans etc.
    There's an acting technique I use for a lot of my work called Laban. It's a brilliant way to explore how a person's character influences how they move and vice-versa, so I started from there and then added Rob's idiosyncrasies.
    I think I kept YouTube afloat during that time, just cramming in as many of Rob's performances and interviews as I could, studying how his voice (his accent and pitch really shifted between 15 and 30), his physicality and his energy changed over time. But it was really important for me to see what he's like when the cameras aren't rolling, and luckily Rob was really giving with his time and allowed me to see that difference between Robert the human and Robbie the popstar.
    b&a: Can you both talk about the Regent Street "Rock DJ" sequence? The energy in that sequence is just amazing. What was that experience like with such a long rehearsal time, and also limited time each night for the shoot?
    Jonno Davies: Absolutely wild.
    What I loved about "Rock DJ" was that it was one of the rare musical numbers where Robbie isn't plagued by his demons, so I was allowed to really enjoy myself and properly soak in the spectacle of what we were collectively trying to achieve.
    As you say, there was a limited time each night, plus it's not like we could just add another day at the end of the shoot if we didn't have everything we needed. That's why rehearsals were so extensive; the muscle memory needed to be second nature by the time we reached set, and that's not just for main cast and dancers, it includes the camera department too. They had their own choreo to stick to.
    That sort of militant prep really instilled a confidence in us though, and allowed us to let rip on every take.
    Luke Millar: So much prep went into "Rock DJ"! We previs'd, techvis'd, shootvis'd, re-techvis'd and then rehearsed. By the time we were on that street, I had never felt more ready, but you never know what will happen. We wanted as few wipes as possible and never an obvious extra walking closely past camera, so that required many takes to get things as tight as possible. The tricky thing was we had to shoot it in order, as we needed to join onto the previous night's work. The downside to a oner is, if we didn't manage to nail one piece then none of it would work! Having an on-set editor was essential, as we could capture takes live and cut them over the previs to ensure that our timing and camera work was spot on. That said, we still had to use pieces from 36 plates to stitch the whole thing together! If anyone is contemplating trying to shoot a musical number with five synchronized mobility scooters, DON'T! They are the most temperamental things ever!
    b&a: The concert and dance moments are incredible, but I also love the more intimate scenes, such as Robbie's time with Nan.
    Can you both discuss how making these types of moments differed from the much larger ones?
    Jonno Davies: Yeah, these moments are so important; they're what make the ape feel innately human. Plus it's those sorts of cherished relationships that people can relate to. We had a lot more stillness in these types of scenes, and when you pair that with the fact there's no microphone or grand performance to hide behind, it suddenly becomes very vulnerable and exposed. That's when Michael and the camera get properly up-close to Rob, and you can really appreciate not just the fragility of what's going on in his mind, but also the incredibly nuanced work the artists at Wētā FX have achieved.
    Luke Millar: I was always acutely aware of the intimacy and sensitivity behind some of the scenes, and so for me, my biggest concern was whether any of our gear would affect those moments. Jonno wore a dual-mounted face cam and helmet, but if he needed to get close, it would be in the way. Robbie is the only digital character in the shot, so we couldn't compromise any other performance in the frame. This meant a lot more work from the animation team to replicate the subtlety and nuance in Jonno's performance; however, once it clicks into place, everything works.
    b&a: Jonno, do you have any specific advice you'd give Luke about his own motion capture appearances in the film, i.e. things he did well or could even do better?
    Jonno Davies: If this all goes tits up, Luke would make an excellent bus driver. I feel like he really committed to the character.
    b&a: What was the hardest scene for both of you to perform and execute?
    Jonno Davies: Probably "Land of a Thousand Dances," which is the montage sequence that follows Robbie's meteoric rise to solo stardom. There's a specific section where we show a duet that he did with Tom Jones at the Brit Awards, and Ashley Wallen (choreographer) wanted us to go like-for-like with the movement.
    You can tell that Rob was absolutely wired during this performance, so I obviously had to recreate that take after take.
    I remember this very specific moment during that shoot when the dynamics shifted: I went from this adrenal glee of entertaining our hundreds of extras, feasting off the buzz of the crowd, to suddenly hitting around take 15 and realising that the adrenaline was wearing off, and I was running on fumes knowing we had probably another 15 angles to shoot. It brought a sort of fight-or-flight sensation and gave me a greater understanding of, and respect for, what Rob went through back then.
    Luke Millar: Definitely "She's the One." Close interaction with Robbie is by far the hardest work, and the dance in "She's the One" is nothing but close interaction! Robbie has longer arms than a human, and so all of those contact points have to be reworked to fit. We needed an accurate 3D representation of Nicole so that when she touches Robbie, his hair and clothing move in sync with the plate, and there is no other way to do this but a lot of back and forth between animation and simulation.
    We also had some complex match cuts and transitions which needed massaging together, as well as some insanely fluid camera moves that required parts of the boat set to be removed and then replaced with digital versions in post. It was also our only real bluescreen scene in the film, so we had to extend the boat, create a digital environment and then blend that into a cyclorama that I shot on the Côte d'Azur. Even the neighboring boats have CG dancing New Year's partygoers on them! The amount of detail is really incredible.
  • Dune: Part Two wins BAFTA for Special Visual Effects
    beforesandafters.com
    The BAFTA for Special Visual Effects was awarded to Paul Lambert, Stephen James, Gerd Nefzer and Rhys Salcombe for Dune: Part Two.
    Check out all the coverage of Dune: Part Two at befores & afters here.
    issue #23: Dune: Part Two
  • The making of Chistery and the monkey guards from Wicked
    beforesandafters.com
    An excerpt from befores & afters print magazine.
    In Jon M. Chu's Wicked, the Wizard's monkey guards were CG creatures created by ILM. "Jon wanted a powerful-looking creature," outlines ILM animation supervisor David Shirk, "so art exploration led us to combine elements primarily from larger apes like chimpanzees, baboons and orangutans, with a characteristic monkey tail. Rather than waddle upright on two legs, a more powerful quadruped walk was developed and was the principal locomotion, along with a physical size that made them feel intimidating next to the human characters."
    "Early ILM animation testing explored an orangutan-based walk," says Shirk. "But the characteristic balancing on the sides of their feet was traded for a more grounded and much heavier soldier-like feeling. From our main hero monkey, we developed multiple variations to populate the army of monkeys featured heavily in the film's third act."
    ILM concept art.
    On set, stand-in performers rehearsed and worked on-camera with the principal actors to aid with interaction, eyelines and framing. Shirk notes that any extreme stunt performance was left to animation. "For acting beats," he says, "particularly in the case of Chistery, who is captain of the monkey guards, the on-set team gave us a starting point for physical performance and placement, but acting choices were left to post-production and grew organically from the edit as it developed."
    "We used an unusual approach to arrive at the acting beats," continues Shirk, who notes that a proprietary Face Select toolset was used by ILM. "In collaboration with the director, I worked with the animation team to create close-up live performances, delivering multiple options per shot that were used in editorial to define Chistery's acting performance, then used that as the template for animation. Final animation consisted of hero keyframed action."
    At one point, Elphaba reads from the sacred Grimmerie spellbook, resulting in the monkey guards transforming to sprout blue wings.
    "There was a lot of talk about transformation because it was obviously something that was very painful," recalls visual effects supervisor Pablo Helman. "Jon directed the animators to do certain things in Zoom sessions, working with David. In fact, for one of the shots, Jon kept saying, 'I can see David Shirk right there!'"
    "The transformation scenes were a bit of a tightrope," weighs in Shirk. "The filmmakers wanted the effect to be visceral and scary but not excessively grotesque or too horrific. The on-set performers gave us a strong starting point for blocking, especially in defining how Chistery would travel through the space as he scuttled, rolled and writhed. As we had many monkeys to depict in this process, an exploratory mocap session was also invaluable to try out many types of actions quickly."
    "We learned that playing up confusion, fear and bewilderment but being judicious in depicting pain in the crowd reactions helped to soften the edge," adds Shirk. "It was a rule that carried over to any close-up facial performances throughout the scene. As always, for key beats involving emotional performance, delivering multiple vid-ref takes helped us to home in on what the filmmakers wanted from the characters."
    For the wings, ILM animated these to emerge from under costumes, bursting out as they unfold, rather than showing them emerging directly from the body. Over 5,100 feathers per monkey had to be groomed. Shirk notes that staging was handled carefully so feathers grew and multiplied across bodies while never being shown emerging from skin.
    issue #26: Wicked
    For shots of the monkeys taking flight, ILM first collected reference. "Eagles and owls were primary sources of flight and takeoff/landing inspiration," advises Shirk. "A major obstacle was that rather than the wings growing from shoulders as they do with birds, ours grew from the middle of the back, creating an especially tricky challenge in making natural-looking flight movement."
    "Many motion tests were produced to refine the look of their flight, and even though our monkeys had full heavy limbs, and, eventually, cumbersome armor as well, the director wanted their entire body to feel engaged during flight, so limbs never hung or dragged. When in full flight, the legs are played lightly and have a strong secondary dynamic reminiscent of a tail, while the arms have a sort of pump, staying engaged and feeling like the shoulders are helping to motivate the wing action."
  • The Better Man VFX Notes show is here
    beforesandafters.com
    Hugo and Ian discuss the film, the VFX, and Robbie's eyebrows.
    This week on VFX Notes, a new entry in our season on the 2025 VFX Oscar nominees. Hugo and Ian discuss Michael Gracey's Better Man, the biopic in which Robbie Williams is played by a CGI chimpanzee. We discuss the film, talk about the cinematography, the VFX, compositing and simulations from Wētā FX, the tech, and some of our favorite sequences.
    You can help support VFX Notes at the dedicated Patreon, too.
  • See Rodeo FX's breakdown for Sonic The Hedgehog 3
    beforesandafters.com
    Go behind the scenes.
  • Here are all the winners from the 23rd Annual VES Awards
    beforesandafters.com
    Major prizes went to Kingdom of the Planet of the Apes, Civil War, The Wild Robot, Shōgun and The Penguin.
    The winners for the 23rd Annual VES Awards in 25 categories are as follows:
    OUTSTANDING VISUAL EFFECTS IN A PHOTOREAL FEATURE
    Kingdom of the Planet of the Apes: Erik Winquist, Julia Neighly, Paul Story, Danielle Immerman, Rodney Burke
    OUTSTANDING SUPPORTING VISUAL EFFECTS IN A PHOTOREAL FEATURE
    Civil War: David Simpson, Michelle Rose, Freddy Salazar, Chris Zeh, J.D. Schwalm
    OUTSTANDING VISUAL EFFECTS IN AN ANIMATED FEATURE
    The Wild Robot: Chris Sanders, Jeff Hermann, Jeff Budsberg, Jakob Hjort Jensen
    OUTSTANDING VISUAL EFFECTS IN A PHOTOREAL EPISODE
    Shōgun; "Anjin": Michael Cliett, Melody Mead, Philip Engström, Ed Bruce, Cameron Waldbauer
    OUTSTANDING SUPPORTING VISUAL EFFECTS IN A PHOTOREAL EPISODE
    The Penguin; "Bliss": Johnny Han, Michelle Rose, Goran Pavles, Ed Bruce, Devin Maggio
    OUTSTANDING VISUAL EFFECTS IN A REAL-TIME PROJECT
    Star Wars Outlaws: Stephen Hawes, Lionel Le Dain, Benedikt Podlesnigg, Andi-Bogdan Draghici
    OUTSTANDING VISUAL EFFECTS IN A COMMERCIAL
    Coca-Cola; "The Heroes": Greg McKneally, Antonia Vlasto, Ryan Knowles, Fabrice Fiteni
    OUTSTANDING VISUAL EFFECTS IN A SPECIAL VENUE PROJECT
    D23; "Real-Time Rocket": Evan Goldberg, Alyssa Finley, Jason Breneman, Alice Taylor
    OUTSTANDING CHARACTER IN A PHOTOREAL FEATURE
    Better Man; Robbie Williams: Milton Ramirez, Andrea Merlo, Seoungseok Charlie Kim, Eteuati Tema
    OUTSTANDING CHARACTER IN AN ANIMATED FEATURE
    The Wild Robot; Roz: Fabio Lignini, Yukinori Inagaki, Owen Demers, Hyun Huh
    OUTSTANDING CHARACTER IN AN EPISODE, COMMERCIAL, GAME CINEMATIC, OR REAL-TIME PROJECT
    Ronja the Robber's Daughter; Vildvittran the Queen Harpy: Nicklas Andersson, David Allan, Gustav hren, Niklas Walln
    OUTSTANDING ENVIRONMENT IN A PHOTOREAL FEATURE
    Dune: Part Two; The Arrakeen Basin: Daniel Rhein, Daniel Anton Fernandez, Marc James Austin, Christopher Anciaume
    OUTSTANDING ENVIRONMENT IN AN ANIMATED FEATURE
    The Wild Robot; The Forest: John Wake, He Jung Park, Woojin Choi, Shane Glading
    OUTSTANDING ENVIRONMENT IN AN EPISODE, COMMERCIAL, GAME CINEMATIC, OR REAL-TIME PROJECT
    Shōgun; Osaka: Manuel Martinez, Phil Hannigan, Keith Malone, Francesco Corvino
    OUTSTANDING CG CINEMATOGRAPHY
    Dune: Part Two; Arrakis: Greig Fraser, Xin Steve Guo, Sandra Murta, Ben Wiggs
    OUTSTANDING MODEL IN A PHOTOREAL OR ANIMATED PROJECT
    Alien: Romulus; Renaissance Space Station: Waldemar Bartkowiak, Trevor Wide, Matt Middleton, Ben Shearman
    OUTSTANDING EFFECTS SIMULATIONS IN A PHOTOREAL FEATURE
    Dune: Part Two; Atomic Explosions and Wormriding: Nicholas Papworth, Sandy la Tourelle, Lisa Nolan, Christopher Phillips
    OUTSTANDING EFFECTS SIMULATIONS IN AN ANIMATED FEATURE
    The Wild Robot: Derek Cheung, Michael Losure, David Chow, Nyoung Kim
    OUTSTANDING EFFECTS SIMULATIONS IN AN EPISODE, COMMERCIAL, GAME CINEMATIC, OR REAL-TIME PROJECT
    Shōgun; "Broken to the Fist"; Landslide: Dominic Tiedeken, Heinrich Löwe, Charles Guerton, Timmy Lundin
    OUTSTANDING COMPOSITING & LIGHTING IN A FEATURE
    Dune: Part Two; Wormriding, Geidi Prime, and the Final Battle: Christopher Rickard, Francesco Dell'Anna, Paul Chapman, Ryan Wing
    OUTSTANDING COMPOSITING & LIGHTING IN AN EPISODE
    The Penguin; "After Hours": Jonas Stuckenbrock, Karen Cheng, Eugene Bondar, Miky Girn
    OUTSTANDING COMPOSITING & LIGHTING IN A COMMERCIAL
    Coca-Cola; "The Heroes": Ryan Knowles, Alex Gabucci, Jack Powell, Dan Yargici
    OUTSTANDING SPECIAL (PRACTICAL) EFFECTS IN A PHOTOREAL PROJECT
    The Penguin; "Safe Guns": Devin Maggio, Johnny Han, Cory Candrilli, Alexandre Prodhomme
    EMERGING TECHNOLOGY AWARD
    Here; Neural Performance Toolset: Jo Plaete, Oriel Frigo, Tomas Koutsky, Matteo Olivieri-Dancey
    OUTSTANDING VISUAL EFFECTS IN A STUDENT PROJECT
    Pittura (entry from ARTFX Schools of Digital Arts, France): Adam Lauriol, Titouan Lassre, Rémi Vivenza, Hellos Marre
  • On The Set Pic: Sonic The Hedgehog 3
    beforesandafters.com
    Director Jeff Fowler on the set of Sonic The Hedgehog 3.
  • Watch Union VFX's breakdown for The Tattooist of Auschwitz
    beforesandafters.com
    Full of invisible effects.
  • Whoa, Wētā FX's full VFX breakdown for Kingdom of the Planet of the Apes is now available
    beforesandafters.com
    Behind the scenes of the film.
    issue #27: Kingdom of the Planet of the Apes
  • The directors of Wallace & Gromit: Vengeance Most Fowl break down the art of stop-motion
    beforesandafters.com
    Go behind the scenes of the Aardman film in this befores & afters video interview.
    In this new video interview, befores & afters' Ian Failes chats to Nick Park and Merlin Crossingham about their stop-motion animated film, Wallace & Gromit: Vengeance Most Fowl. Learn about the craft, including puppets, stop-motion animation, live action videos (LAVs) and timelapses. The Aardman film is now streaming on Netflix.
  • Go behind the scenes of The Falcon and The Winter Soldier
    beforesandafters.com
    Marvel has made the entire Marvel Studios Assembled: The Making of The Falcon and The Winter Soldier featurette available to watch on YouTube.
  • VFX Notes breaks down Kingdom of the Planet of the Apes
    beforesandafters.com
    Hugo and Ian continue the 2025 VFX Oscar nominees season of VFX Notes.
    This week, Hugo and Ian continue with our season dedicated to the five nominees for this year's Oscar for Best Visual Effects. In this episode, we discuss Wes Ball's Kingdom of the Planet of the Apes. We review the film, talk about the franchise, discuss the Wētā FX tech and pipeline, and go in-depth into the biggest sequences of the film.
    Tons of fun detail in the ep. Get even more insights into the films and projects we cover at VFX Notes by supporting us at the dedicated VFX Notes Patreon.
  • Framestore breaks down Dr. Dillamond VFX from Wicked
    beforesandafters.com
    Scanning, the build, lip sync and performance.
    Be sure to catch the full coverage of Wicked in the befores & afters print mag.
    issue #26: Wicked
  • How ILM made the audience-favorite character Dulcibear for Wicked
    beforesandafters.com
    An excerpt from befores & afters print magazine.
    In Wicked, the audience learns of the birth of Elphaba and her unnaturally green skin, and her subsequent upbringing by the nanny, Dulcibear (a talking bear voiced by Sharon D. Clarke). ILM was responsible for this character. Looking to maintain a sense of anthropomorphism to Dulcibear, ILM kept her locomotion on all fours as much as possible, discusses ILM animation supervisor David Shirk.
    When motion studies showed that cradling a baby in one arm resulted in a three-legged motion that resembled a limp, she was allowed to rise to her hind legs, but real-world bears are quite adept at walking on two legs, resulting in a strong "person in a suit" vibe. This was remedied by intentionally giving her a slightly exaggerated, lumbering gait.
    "Dulcibear had a quick screen introduction and needed to quickly establish as a nurturing and appealing character," adds Shirk. "To support this, subtle but important tweaks were made to her physical characteristics: her face and snout rounded, teeth reduced by varying degrees from shot to shot, and her claws rounded, smoothed and shortened, and carefully posed to avoid direct contact with the baby in shots. In addition, we gave her the ability to twist her wrist to cradle, and a small separation in her toes that allowed the impression of a thumb."
    The facial animation for Dulcibear was tailored for delivering musical dialogue, with custom shapes designed to allow rapid phrasing weighted to the front of her snout. Says Shirk: "This avoided excessive movement from her very large mouth, and always with an overlay of characteristic bear nuances and movement in her facial animation to ground her as an animal. It was very helpful that bear nose movement is also quite appealing!" Her non-singing acting was heavily based on human reference for overall timing and nuance.
"In Dulcibear's interactions with the children, we always tried to balance realistic handling of her size and weight with a gentle style of moving." ILM's creature work extended further to the wolf doctor in Elphaba's birth scene, as Shirk discusses: "For our wolf doctor, we made a creative adjustment, as natural wolf movement tends to strongly indicate their predatory nature. She was instead mainly based on domesticated dogs like huskies, featuring a relaxed and less controlled way of moving."
  • Go deep into Kingdom of the Planet of the Apes in latest print mag
    beforesandafters.com
    Full of before/after photos and tech info on Wētā FX's visual effects. Issue #27 of befores & afters magazine (which is now out!) covers the visual effects of Wes Ball's Kingdom of the Planet of the Apes. You're the first to catch a look inside and be able to grab the magazine. It goes in-depth on Wētā FX's process for handling the performance capture of actors on set, through to the translation into CG apes and the build, animation, simulation and final steps involved. This issue features a multitude of before and after images, progressions and on-set behind-the-scenes photos breaking down how key scenes were made. Find issue #27 at your local Amazon store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands.
  • How that crazy car crash oner in Carry-On was made
    beforesandafters.com
    Including filming actors on a bluescreen rotisserie rig. It's the scene everyone is talking about. Director Jaume Collet-Serra's Carry-On features an intense action moment when LAPD detective Elena Cole (Danielle Deadwyler) is in a car en route to LAX airport with DHS agent John Alcott (Logan Marshall-Green), when she realizes the agent is an imposter. The scuffle that follows leads to gunfire, near misses with many vehicles and eventually the car crashing and tumbling over, before Cole is able to shoot Alcott dead while they remain upside down, with the entire action playing out from inside the car, all in one long oner. The visual effects for this dramatic scene were orchestrated by Wētā FX, under visual effects supervisor Sheldon Stopsack. The studio and Stopsack had previously worked with Collet-Serra on Black Adam. "When we were doing some additional shooting on that movie," recounts Stopsack, "Jaume mentioned Carry-On and that he wanted to try something new, which was this car chase sequence. He described it to me on set and he was wondering how one would go about doing it."
    Planning the oner
    The car crash would ultimately be filmed with the actors in a partial car buck set against bluescreen, with Wētā FX building the world, the vehicles and even the inside parts of the car around the characters. To get there, Day For Nite started by previsualizing the oner. "If you go back now and look at it," shares Stopsack, "you can literally take the shot, take the previs and play it side by side and there are so many similarities. There are so many things that were established really early on." Early on, too, stunt coordinator and second unit supervisor Dave Macomber helped plan out the scene. "He and his team jumped into Xsens suits and into a car mock-up and captured their motion," outlines Stopsack.
"Then he put that into Unreal Engine and they previsualized this themselves." These efforts aided Wētā FX in realizing that there would be a great deal of action happening outside of the car, as well as everything going on inside it, with moments of cross-over. To help establish what would be outside the vehicle (i.e. the roadway, landscape, other vehicles), the team began a scouting phase. "The scene takes place on the I-105 Interstate Highway in Los Angeles," says Stopsack. "We said, 'Let's go on Google Maps and Street View and let's see what takes us down the road.'" "It was also about working out speeds," adds Wētā FX visual effects sequence supervisor Ben Warner. "We had to work out how much freeway you actually travel in that amount of time, and how speed indicates the travel. For as much as it feels like you're traveling a long distance, they actually don't travel that far in the end, only about a 2.5 kilometer stretch."
Shooting an array in LA
That scouting process was also done in preparation for a camera array shoot of the highway, the results of which would inform the backgrounds created by Wētā FX. "We would literally have a car go down that interstate to capture panoramic footage," explains Stopsack. "Production hired a car with a video team that had Blackmagic cameras mounted on top of the car, capturing panoramic footage for us to stitch together." When the array footage was being captured, a front-mounted GPS unit was also installed. "It meant you could see what they were seeing at the front of the car," notes Warner. "You could match it up with the array, but you could also match up the GPS locations. We could use that to always work out where we lived on the freeway and where we wanted to end up." "It was somewhat important to have that GPS information," continues Stopsack, "because when we shot the array footage, we traveled down at a reasonably consistent speed to have a reasonably steady capture of the stretch of the highway."
"But with the action beats of the actual Dodge Charger going down, it would've gone at a different speed. So we roughly needed to know how to re-time that footage in order to get to the geography we were in, based on the speed of travel of the car in the film. It was an exercise of knowing exactly where it was captured and at what speed it was captured." One interesting aspect of the array shoot was how to manage the traffic on the highway during the shoot. "You can't just lock down a 2.5 kilometer stretch leading up to LAX," points out Stopsack. "But the highway patrol was kind enough to allow us to drive down it and keep the traffic as far away as possible. They couldn't stop the traffic and lock it off altogether, which made for some interesting footage, because you had the on and off ramps, and sometimes there were cars coming from the on ramp and the highway patrol blocking them off."
Filming the actors
Principal photography of the actors on bluescreen made use of a specially designed car rig, and a Technocrane for shooting, advises Stopsack. "We broke down the whole shot into four beats, effectively." It's worth noting that the original plan was for the oner to be a lot longer, with a whole back section where Cole lowers herself out of the seatbelt, crawls out of the car, stops another car and drives off, again, all in the oner. Things were shortened for editorial reasons. The original four beats were: everything leading up to the crash, the crash itself, the post-crash moment with them upside down, and then Elena crawling out. "Breaking it down into these manageable chunks allowed us to target how we were going to shoot it." "Everything leading up to the crash was a very well orchestrated stunt performance, with Dave Macomber taking the lead there and really hitting the beats," adds Stopsack.
"They shot this on a bluescreen stage with what we called the rotisserie, which was basically a skeleton of a Dodge Charger mounted onto an absolute beast of a steel frame, which weighed a ton. It was mounted in a way that the car could effectively roll over. We had control over it to suggest leaning angles and which side of the car would be swerving. For the crash itself, we could flip it upside down. Indeed, we shot a whole portion of the crash with them literally being rolled over two and a half times, with our actors in there flailing their arms." During the crash, the car comes incredibly close to other vehicles; at one point another car's side mirror smashes into the side of Cole and Alcott's car, a gag achieved on set with a bluescreen-covered mirror prop that would later be replaced by a CG one. "That coordination on set was all Dave Macomber and his stunt team," says Stopsack. "The actors had to rehearse all this a few times. Dave would be sitting on a ladder in front of the car on the rotisserie rig, slightly elevated, yelling the beats. It was actually quite the dance, because the camera team needed to do its thing, the actors needed to do their thing, and the stunt team was yelling the beats at them."
Assembling the shot
Editorial brought together the different takes from the bluescreen shoot, with Wētā FX then responsible for figuring out how to seam it all together. "We took on the footage and tried to figure this puzzle out exactly ourselves," says Stopsack. "Takeover points were a big one to consider. We also started to figure out how to orchestrate the beats that then affect the world around them. Again, there's the inside world and the outside world, and for this we had an approach called a local space workflow. We tracked everything that happened on stage, including the movement of the buck and the movement of the actors."
"We were supported by our on-set team here, who came along for the shoot and provided us with a stereo rig that was mounted to the hero principal camera. It was a similar approach to what was used on Avatar: The Way of Water, where an additional stereo pair was shot." "This was very useful for us," continues Stopsack, "because it allowed us to not only get effectively a camera track from it, but we could also get some spatial awareness of what was happening in front of us. We had our actors in the car and we could effectively generate a mesh from that. We utilized that on our end to determine, say, when is the arm in the right spot for a takeover, or, is the car leaning left and what does it do exactly?" Once everything had been stitched and blended together, artists lined up the local space to what was needed for the highway. "We still needed to cater for the world outside of the car," says Stopsack. "This is where our animation team kicked in, who did a lot of the orchestration of the traffic outside. They needed to work towards all those timings we'd already established. So, again, the dance continued, because a lot of these pieces just needed to continuously be orchestrated." "We spent a lot of time taking that local space camera and putting it into the world space," notes Warner. "There's a moment where they cross over onto the other side of the road. Geographically, we knew where they needed to end up. When we backed it up to where they crossed, that crossing point was where the roads split onto two bridges. There were a lot of timing considerations to make sure that when they did cross, they were actually crossing where the roads would make sense in terms of the geography we were using. There was a lot of time spent upfront just making sure all those beats, both internal and external, were all hitting the same thing.
The action inside the car
With the camera moving around a great deal, and with the car buck being only a portion of the vehicle, Wētā FX would ultimately replace much of the car interior with a digital version. "The brief was, the camera had to stay inside the car," states Warner. "Jaume really didn't want the notion of a camera coming in through the window and out the other side. He wanted to keep it internal. One of the big things we had to do was replace the roof. Now, when people are in cars, they're shadowed from the top, and that was something we had to adjust." "The seats are replaced almost the entire time, too," says Stopsack. "Then there's the dashboard. The interior of the car was probably the most advanced interior that has ever been built here at Wētā. A little Easter egg is, try to see if you can spot what radio channel they're listening to; it's playing Wētā FM." For the roll-over crash moment, not only was the car interior digital, so too were the characters inside the car. "That helped us transition from the first part of the crash to go into an all-digital portion with all-digital characters," describes Stopsack. "Then when it ends with them upside down, we transition to the live-action actors hanging in the rotisserie rig upside down. For our digital doubles, we even adapted the facial rigs and facial system used on The Way of Water, which is our Anatomically Plausible Facial System (APFS)." Breaking windshield and window glass, and then small tumbling pieces of glass, were a feature of the roll-over. Wētā FX began with a physics-based simulation, says Stopsack. "However, you'd be surprised how physics doesn't want to give you what you want to see. We ended up cheating there a little bit, more in favor of creating the chaos and the violence and the energy." Stopsack and Warner credit compositor Robert Hall with leading the compositing of the oner. Says Warner: "Rob picked and chose where we used the CG and where we used the plate. He found the blend points. He found what was the best solution.
"It wasn't always CG. It wasn't always plate. It really wasn't a traditional comp. Luckily, he had input early on about how it was all going to come together, which was nice."
The action outside the car
For the outside roadway, the array footage became a starting backdrop for Wētā FX's CG highway build. This included all the roadway paraphernalia: side walls, barrels and, of course, other vehicles. Effectively, reveals Warner, Wētā FX had to animate everything twice, first making the scene work from inside the car, and then making the outside scenes support what was happening inside. "There's a whole other story being told on the outside!" says Warner. "Bringing those two together into one continuous thing was fun." At one point, a truck flips in front of the now out-of-control car. Eagle-eyed viewers may have caught that the moment the truck begins edging sideways matches up to a gunshot from inside the car, as Stopsack describes. "Cole is fighting with Alcott, and Cole has a gun in her hand. Alcott smashes her hand and a shot goes off. There's a muzzle flash that pierces a hole in the windshield, which punctures a tire on the truck in front of them, which is why it starts flipping over." "If you look for it," says Warner, "you'll see the tire really goes pop, and that it goes pop maybe two or three frames after the muzzle flash goes off." "Then we found the most amazing amount of reference on YouTube for exploding tires and airbags, all sorts of crazy crashes," says Warner. "We just made it bigger and bigger until everything basically explodes."
  • Wicked VFX supe Pablo Helman breaks down the film's effects in this new video
    beforesandafters.com
    Covering environments, creatures, flying, the train, the Wizard and so much more. Go in-depth into the VFX of Wicked in befores & afters' dedicated print magazine for the film, issue #26: Wicked.
  • Going behind the scenes on the visual effects of Alien: Romulus
    beforesandafters.com
    Alien: Romulus visual effects supervisor Eric Barba talks about the VFX Oscar-nominated film. Today on the befores & afters podcast, we're looking at the VFX of Alien: Romulus with VFX supervisor Eric Barba. This is actually the interview I did with Eric for the print edition of befores & afters magazine. In the chat, we talk about the mix of practical and digital in the film, including the miniatures, the Xenos, Facehuggers, the Offspring and Rook, and then, of course, how the digital effects teams, led by ILM and Wētā FX, helped craft many aspects of the film. We also talk about Metaphysic's machine learning VFX for Rook, which I think is a fascinating aspect of the movie. This episode is sponsored by Suite Studios. Ready to accelerate your creative workflow? Suite's cloud storage is designed for teams to store, share, and edit media in real time from anywhere. The best part? With Suite, you can stream your full-resolution files directly from the cloud without the need to download or sync media locally before working. Learn more about why the best creative teams are switching to Suite at suitestudios.io
  • On The Set Pic: Back in Action
    beforesandafters.com
    Cameron Diaz as Emily on the set of Back in Action. Cr. John Wilson/Netflix 2024.
  • Watch this virtual production b-roll for Flight Risk
    beforesandafters.com
    The b-roll showcases the shooting of plane bodies on motion bases against an LED wall for the film.
  • More of Skeleton Crew's Mama Crab from Tippett Studio
    beforesandafters.com
    Watch the new video.
  • Get all the details on the upcoming Virtual Productions Gathering 2025
    beforesandafters.com
    It's happening in Breda, Netherlands, April 9-10, 2025. Find out how to get discounted tickets with a special befores & afters code. The 5th Annual Virtual Productions Gathering (VPG25) is being held April 9-10, 2025 at Breda University of Applied Sciences in the Netherlands. It is made up of two days of activities:
    April 9, Educational Day: Designed for academics and students, this day will explore VP's role in education through live demos, best practices, panel discussions, and student project showcases.
    April 10, Industry Day: Featuring exclusive talks from leading VP innovators, a cutting-edge live demo, a VP marketplace, and a panel on the future of digital storytelling, this day concludes with a networking mixer.
    Here are some of the key speakers who will be attending:
    Erik "Wolfie" Wolford: Shooting Europa, an Exploration Into the Latest ICVFX Techniques Shaping the Future of Filmmaking
    Kathryn Brillhart: Advancing Cinematic Artistry Through VP Paradigms, Emerging Trends, and Technologies
    Adrian Weber: Beyond the LED Wall: The Expanding Role of Real-Time Tools in Media Production
    Joan Da Silva: Unreal Engine in Classrooms, Way Beyond Games
    As a special offer, befores & afters readers can access an exclusive 20% discount using the promotional code BA_VPG25 when purchasing tickets. Learn more & register here: vpgathering.com
  • Here's ILM's VFX breakdown for Wicked
    beforesandafters.com
  • Our VFX Oscar nominees season of VFX Notes is here, starting with Alien: Romulus
    beforesandafters.com
    Go behind the scenes of the creatures and space environments of Fede Álvarez's film. Big things have changed at VFX Notes! We are now season-based, which means you'll be able to watch collections of episodes around a single theme. Our first new theme is the 2025 VFX Oscar nominees. Each week ahead of the Oscars, you'll be able to watch new episodes on the nominees for the visual effects Oscar. That's right, we've got 5 individual eps on Alien: Romulus, Better Man, Dune: Part Two, Kingdom of the Planet of the Apes and Wicked. PLUS, a very special thing we're doing with seasons is launching a dedicated VFX Notes Patreon! It's a way of supporting what Hugo and I are doing with VFX Notes. Sign up and get a WHOLE BUNCH of bonus content each week. You'll get:
    Early access to upcoming VFX Notes episodes
    Your name in the credits of the podcast
    Bonus episodes from Hugo and Ian
    Extra VFX content
    Access to the Patreon community to chat with Hugo and Ian, and other Patreon members
    The ability to give us direct suggestions for themes and episodes
    Access to special Zoom Patreon drop-ins hosted by Hugo and Ian
    There's just one simple tier to choose from! Check it out at https://www.patreon.com/VFXNotes. In the meantime, here's our first episode of the 2025 VFX Oscar nominees season, on Alien: Romulus. We've gone deep into the practical creatures, the digital visual effects, sims, ML tools, and all the incredible work on this film. Thanks for watching!
  • Old-school and new-school tech in Wallace & Gromit: Vengeance Most Fowl
    beforesandafters.com
    Will Becher, supervising animator and stop motion lead at Aardman, on Wallace & Gromit: Vengeance Most Fowl and the tech used in the film. Today on the befores & afters podcast, we're chatting to Will Becher, supervising animator and stop motion lead at Aardman, about Wallace & Gromit: Vengeance Most Fowl, which is directed by Nick Park and Merlin Crossingham. Now, if you've read the latest animation issue of befores & afters magazine, you'll have seen that Will was featured in that issue discussing the tech used by Aardman on the film. I thought the conversation was a really fun one, so I'm also presenting it here as an audio podcast for you. Here we talk about 3D printing, motion control, CG, using clay, cameras, and all sorts of other stop-motion tech.
  • Adding to the Offspring, and crashing a massive space station
    beforesandafters.com
    Behind Wētā FX's killer visual effects on Alien: Romulus. Two of the main visual effects aspects of Fede Álvarez's Alien: Romulus that Wētā FX worked on related to the Offspring and to the space station crashing into the planet's rings. The Offspring, a human-Xenomorph hybrid, was played on set by Robert Bobroczkyi in a Legacy Effects suit. Wētā FX made a few digital augmentations and enhancements to the suit, such as adding a tail, while also dealing with the Offspring's skin disintegration when it faces the vacuum of space. Meanwhile, the space station crash involved simulating the planet's rings and orchestrating the spectacular destruction of the craft, something Wētā FX did in CG, but by approaching it as if the space station were a physical miniature containing all manner of small miniature pieces inside. Both these aspects were discussed by Wētā FX visual effects supervisor Daniel Macarin and FX supervisor Michael Chrobak at SIGGRAPH Asia 2024 in Tokyo, where befores & afters got to sit down with the pair. We dive here into the Offspring and the space station, as well as looking at how the VFX team crafted a sense of danger into shots, and how a new machine learning lighting tool was developed to re-light plates.
    A sense of danger versus safety
    b&a: In your talk at SIGGRAPH Asia, I really liked something you talked about, which I don't always think about in terms of visual effects, but it was crucial for the story of Alien: Romulus and what Fede was trying to do, which is, create a sense of danger versus safety. Can you talk a little bit about that in the context of how you approached the work?
    Daniel Macarin (VFX supervisor): Well, it's the idea of storytelling, and how a lot of times when we think of visual effects, we think of digital replacements of characters or something very animated or big environments and big effects. But often what we're trying to do is just make the story better.
Here, Fede had very specific ideas and was coming up with new ideas all the time, which were: how can we just add these little things to make it better? For every shot, we looked at, well, what would reinforce those ideas? What could we continue pushing? How would you feel if you were in this environment? And what would make you feel unsafe in this environment? We don't want to just throw it all at you. Jump scares are fun and you always throw them into every kind of horror film. But if you have a jump scare every five seconds, they start wearing thin and the audience loses connection. You lose story for spectacle. It becomes hokey. Instead, the idea was to maintain the audience's anxiety by driving them along the path of the story, by adding atmosphere or lighting or subtle amounts of liquid along walls or in their path, or tiny bits of blood just at the side of the camera that the audience might catch. It's these really, really subtle effects that visual effects can add that enhance a story at a level that people aren't generally used to. They're used to the large spectacle, and we want to actually drive performance and story.
b&a: How is that also in the back of your mind as FX supervisor, Michael?
Michael Chrobak (FX supervisor): Dan sits us all down and he'll go through a lot of the sequence and he'll say what the feeling of the scene needs to be, from the mind of Fede. And the first thing is, Fede is a storyteller. Fede has a very unique vision. It was about feeling a lot of the different moods of the characters, making sure that you're always uncomfortable, or you're comfortable for only a second. Then when we're doing the effects, we just have to make sure that they match what's going on in the scene. I remember we were doing some volumetrics and they had to be almost like dry ice, just making you feel like it was the calm before the storm, and then all of a sudden something happens.
It comes up a lot with the Offspring, this approach, in terms of what's light and dark, and even what's oozing out of his body.
The Offspring
b&a: Tell me about conversations you had with Fede and [production visual effects supervisor] Eric Barba about augmenting and adding to the Offspring character to make it scarier, or to tell certain story points.
Daniel Macarin (VFX supervisor): The prosthetic suit was scary in itself. As a base, they did an absolutely phenomenal job. So, when we first saw it, it was like, what are we even doing to this thing? It looks amazing. Fede said, "Please keep as many pixels as you possibly can. We want to hold onto this." There were very, very simple things that were straightforward, like seam lines in the suit. We obviously needed to get rid of those. Or, there were areas where, two shots earlier, they had dripped some blood on his face and it had dripped onto the suit. Well, it's not like they have 50 of those suits on set, so we would get rid of that blood for continuity. That's also why Eric decided to stop putting things on the suit and just let VFX do it, because we can make it more continuous; we can add more or less at any time. There were also things on his hands and feet that they were concerned about. The feet ended up looking like shoes, so we replaced pretty much everything from the calf down all the time. Then Fede wanted it to grow a tail. He said, "I want that to be a thing, I want it to be able to use it, I want it to have this evolutionary function." We talked about pipes growing out the back of it, to match those holes on his back. The initial conversation was around pipes growing out and being a little bit more Xenomorph-like. We talked about the back of the head extending out and ripping open. We did a lot of tests and even animated a whole bunch of different sequences where he had evolved further. In the end, it just became a thing of, well, he doesn't have a huge amount of screen time.
We don't really get to know this character. If we change him too much too quickly, the audience loses that connection. So, we had to pull a lot of our augmentation back and decide: what can we augment that helps the story but doesn't change the character? The main augmentation that would always be there was the tail, but everything else we drew back to subtle amounts of blood on his face, extra ooze, or hand and feet fixes, and left the rest as the character was. Even when the decompression event happens and we pull him into the outside, there was a lot of talk about all of his skin ripping off and having a different look underneath. But after going round and round for a while, it was decided that the creature looks fantastic, let's not lose the connection with it. Fede said, "You can rip him apart, and you can do something horrific, but I want the audience to know that this is the same character they've been seeing for the last minute or so."
b&a: When you did have to disintegrate him because of the decompression event, and his skin and face are peeling off, what approach did you take to that in FX?
Michael Chrobak (FX supervisor): We had a couple of initial discussions and I looked up a lot of reference from old horror movies. There were ones where there was bubbling coming off the skin of people's faces, and exploding heads. We then pitched a couple of ideas, asking things like, "Do you want his skin to completely rip off so you just see the internal muscular structure underneath?" We did a couple of little tests with a very basic model of the initial Offspring, generating a second layer of skin with stuff falling off. In the end, we settled on a modeled approach, supplemented with FX and then a procedural layer on top of that. We asked the models and creatures departments to give us a second skin.
So, we had the underlying muscular structure; they put a second skin over the top, and they ended up giving us almost pre-fractured geometry in a certain number of places, with a couple of pre-sculpted little rips and tears on the skin that needed all that high fidelity. In FX, we did some supplementary work on top of that where we ripped other parts, and then we procedurally moved those flaps that the creatures and models departments gave us. Then we added a procedural bubbling system under the skin where we had control of the size and how fast the bubbles moved. Every so often you'll see one of those bubbles rip open and some ooze come out, to help sell that disintegration and add to the scene. But we always played it just enough so that it made you feel uncomfortable. It was also about making sure it felt like the Offspring's life was in danger.
Re-lighting the Offspring
b&a: In your talk you also mentioned a machine learning lighting tool Wētā FX devised for making changes to the lighting, for when the Offspring is stalking Rain, with the idea being that you could place it in shadow or affect the mood where you needed to. Can you explain that further?
Daniel Macarin (VFX supervisor): Our comp team was looking at some scenes we were working on and saying, "If we were doing this all CG, what would we do?" Well, we split things out into light groups quite often, and that enables us to animate things if we need to in comp or in post. We thought, well, it'd be really amazing if we could do that with the on-set lighting. The sets were shot so well, and to try to mimic that and get all the detail that DOP Galo Olivares had gotten in there would have been an absolutely tremendous task. We did a quick test. Luckily, for most of the shots the camera moves are relatively still.
The idea was, if we paint a reference frame and we split all those areas out and identify each section, maybe we could teach the computer to do that for the other frames. It didn't work perfectly, because we had to do it very, very fast. There was a lot of artistry after the fact, but from the base level of what it gave us, you could clean up the rest, and it turned out exactly as we hoped. It gave Fede a lot of creative freedom. He could add in flashes of light wherever he needed. He could go into darkness. Suddenly the ideas started really flowing once he realized he had that ability after having shot something. So, he started looking at other sequences where, say, the timing of a flash was a little off, and he could line it up instead to a different moment.
b&a: Is it very bespoke, or is it inside of CopyCat or Nuke, for example?
Daniel Macarin (VFX supervisor): It's a Nuke-based thing. CopyCat can do similar things. The way it trains is a little bit different to how we managed it.
Crashing a space station
b&a: Let's turn to the space station that crashes into the rings of the planet. That whole sequence is incredible. I particularly loved when you talked about the point where the space station actually starts crashing, and what you put inside it to make it feel like it was a miniature with things inside: bits and pieces and detritus and popsicle sticks. And then not only that, you were also deforming them.
Michael Chrobak (FX supervisor): We got a brief from both Dan and Fede that they just wanted the nostalgia to be there. They wanted everything to feel like it was shot as it would have been on set back in the original Alien days. I have a very good team of talented artists, and the one who was a huge Alien fan, Gray Horsfield, said, "I want those ship destruction shots. I will make them exactly how you expect them to be." So, what we did was, we took the ship, which we ingested from another studio.
With the modeling department, we had to modify parts of the ship just to make sure that it was destruction-ready. We didn't have the ability to do the whole ship, so we had to pick and choose our parts. It's just such a big model. We ended up settling on one half of the ship and re-modeled it, although we didn't have the ability to really do any interiors. When it got into the FX department, Gray created a whole bunch of internal procedural Houdini pieces. His whole rig was procedural: no matter what the shot, no matter where the ship was broken or how we needed to crash it into the rings, he was like, "Yep, I'll load it in." All of this procedurally generated internal scaffolding. There were struts between the scaffolding and these internal little corridors that he made. It was nothing that you would see on the render side, but it was essential for all of the force propagation, and for getting a more physically accurate result, like what you would get if you took a real-life ship, or a model, and threw it into an angle grinder back in the day.

We spent a lot of time looking at iterations, probably hundreds and hundreds of iterations. We finally said, this is great, but we need a bit more liveliness to it. So, instead of just doing specific instancing, Gray had the force propagation going through the ship, with constraints interacting with other constraints, and those interacting with other constraints again. For example, an asteroid from the outer field of the ring would hit the ship, rip off a panel, deform that panel, pull off the panel behind it, then hit another part of the ship and rip off those panels. It was all very, very tied together.

Now, a panel on the ship was basically just a square. It didn't have much detail. So, Gray added instancing behind it, as if there were some sort of metallic substructure to it. But that needed to have some life to it as well.
So, he said, okay, let's take those instances that we would have put on there, add some initial deformation to make them feel like they're ripping and bending, and then add those on as deformed instances to give the shot more life.

Daniel Macarin (VFX supervisor): Gray is old school. He did the Balrog on the original Lord of the Rings. The way he approaches things involves a lot of out-of-the-box thinking. On some of the shots in Alien: Romulus, he'd say, "I want to do more. I want to do more." We'd say, "It was amazing five versions ago!" He's like, "No, I have three more elements to put in there." You absolutely love that dedication. This meant so much to him. He likes his shots, in that sense, to be remembered. He wants people to see the beauty of it and be like, "I remember that shot, it was gorgeous." He puts all of these extras into it and you really can't stop him. We threatened to shut his machine off by the end!

b&a: You mentioned in your talk that the look and feel of the rings, and the carpet of rocks and ice, went through a bit of an evolution.

Daniel Macarin (VFX supervisor): It would have been so easy for us to destroy the station with more of an asteroid field. If you want to destroy a space station, a really large, fast-moving object smashing into it is very easy to believe as the cause of the damage. When you go with a flatter, grinding ring, you run into a lot of problems. We now have a space station falling toward the Roche limit of the rings, basically the distance from the planet inside which its tidal forces take over. But at that point, wouldn't the rings and the space station be going at the same speed? And then the space station would really just hit the rocks and continue getting pulled by the planet. It has its own speed from its decaying orbit, but at the same time, it's not the space station destroying or breaking up the ring; it's the speed of the ring trying to destroy the space station.
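That relative-speed argument can be sanity-checked with Keplerian circular-orbit speeds. The sketch below uses Saturn-like stand-in numbers (not figures from the production), and the station's 3% speed offset is a hypothetical value chosen purely to illustrate that even a small orbital mismatch yields a grinding speed of hundreds of meters per second:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.68e26     # planet mass (Saturn-like stand-in), kg
r_ring = 1.2e8  # ring radius, m (~120,000 km)

def circular_orbit_speed(radius_m):
    """Speed of a body on a circular Keplerian orbit: v = sqrt(GM/r)."""
    return math.sqrt(G * M / radius_m)

v_ring = circular_orbit_speed(r_ring)

# A station on a decaying orbit crossing the ring plane moves at a somewhat
# different speed, so the *relative* speed between station and ring particles
# is what does the grinding. A hypothetical 3% mismatch:
v_station = v_ring * 0.97
relative_speed = abs(v_ring - v_station)

print(f"ring particle speed: {v_ring / 1000:.1f} km/s")
print(f"relative grind speed: {relative_speed:.0f} m/s")
```

At this radius the ring particles move at roughly 18 km/s, so even a few percent of orbital mismatch leaves the "carpet" scouring the station at several hundred meters per second, fast enough to do damage without the ring itself exploding apart.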
We had to balance a believable speed, where we didn't turn this into just motion-blurred streaks. We wanted to see that it was ice, see the specular hits on it, see all the detail that we had put in, but make it fast enough that you would believe it could cause the damage that it does to the space station. We thought, can we make a few icebergs within the ring? Can we make a few valleys and peaks? Can we add something extra? Does it have to be perfectly flat? Most of the rings of Saturn are actually only about 10 meters thick, until you get to the edges, which start to become these very large mountainous peaks about two kilometers tall. So, we started with the correct thickness, and then at the scale and the distance that we're at with the space station, it really becomes a flat plane. So, we tried lowering the sun. We lowered the sun all the way down to the horizon, which gave us a little depth and a little breakup on the rocks. The thing is, we were putting hundreds of millions of these rocks in, so we really wanted to get something out of it.

Michael Chrobak (FX supervisor): One of the issues we had was that we originally started with just an asteroid belt, which had bigger and bigger chunks. We got some good destruction out of that because they're big impactors. We had some really nice-looking destruction on the spaceship, but it wasn't the creative direction. We were told they wanted more of that "carpet of death" look: really small stuff. That toned things down and made the spaceship destruction more boring, because it was really only destroying the little bottom edge of it.

That's when we said, okay, let's put some keyframed impactors in there; maybe an iceberg comes by, or one of those bigger pieces comes into the ring and really takes a chunk off the station.
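The two-population setup Chrobak describes, a fine debris carpet plus a handful of hand-placed hero impactors, can be sketched numerically. All counts and radii below are hypothetical, chosen only to show why the keyframed icebergs were needed: damage potential scales roughly with mass (so with radius cubed), and a few large pieces easily outweigh an entire carpet of small chips.

```python
import random

random.seed(7)

# "Carpet of death": a dense field of very small debris chips.
N_CARPET = 100_000
carpet = [random.uniform(0.05, 0.5) for _ in range(N_CARPET)]  # radii, meters

# A few large keyframed impactors (icebergs), placed by hand along the
# station's path; these do the visible structural damage.
hero_impactors = [12.0, 25.0, 40.0]  # radii, meters

# Mass (and so kinetic energy at a given speed) scales with r^3, so compare
# summed r^3 as a damage-potential proxy.
carpet_mass_proxy = sum(r ** 3 for r in carpet)
hero_mass_proxy = sum(r ** 3 for r in hero_impactors)

print(f"carpet proxy: {carpet_mass_proxy:.0f}")
print(f"hero proxy:   {hero_mass_proxy:.0f}")
print(hero_mass_proxy > carpet_mass_proxy)
```

With these stand-in numbers, three icebergs out-mass a hundred thousand small chips combined, which matches the experience described above: the carpet alone only sandpapers the station's underside, while the keyframed pieces are what tear chunks off it.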
We also thought of adding some peaks and valleys because, if it's just flat, the result is just the bottom of the ship being hit, and you only see a little bit of that bottom edge getting destroyed. By adding in those undulations, from certain angles you end up getting certain parts of the ship pulled in and others kicked out.

Daniel Macarin (VFX supervisor): A lot of that was decided in conversations with Fede. There's a oner at the end of the sequence where we pull through the container and Rain is hanging and she's watching the final destruction. There were those kinds of moments where Fede said, it seems a little stale, there's nothing going by her, she's just over a ring, and there's no sense of danger; she feels too safe. Meaning, she has no reason to hurry up and get to the ship. So Fede said, let's add some extra iceberg paths. Let's add some more things.

Michael Chrobak (FX supervisor): One interesting thing on that shot was the speed of the rings. Because of the movement of the camera, if we had the rings going as they were in the other shots, they actually looked static. They didn't look like they were adding any danger. So we had all these flow lines and cluster patterns to add different speeds. I actually had to reverse some of the speeds of the rings so that when the secondary piece of the ship hits, it would actually tear it. With the camera move, it still looked like the rings were going away from us, but I actually had the speed patterns on the rings going against each other, to cause almost like a torque in the destruction of the ship and add more interest.
This was versus it just hitting and toppling; this way, it ripped the ship apart and gave the shot more visual interest as well.

Daniel Macarin (VFX supervisor): That was particularly handy, because the system Mike had developed allowed for really fast turnaround on those kinds of ideas. We'd say, "Can you change the speed?", and they'd be able to say, "I don't need to re-sim anything; absolutely, we can adjust that." Or, "Can you change the speed on just this one piece?" and it'd be, "Yes."

Michael Chrobak (FX supervisor): We also did a lot of variations with the peaks and valleys, and with having raking light come across to get all the shadows across the rings. Because, if it was flat, you got no visual interest on that ring whatsoever. The light just hit it, and all you got was highlights.

Daniel Macarin (VFX supervisor): And that actually goes into something else. Initially, the idea was that Rain had never seen the sun, and that her thing about leaving her planet for another one was that she would get to see a sunrise. This made sense while she's on the volcanic planet, which maybe is dense enough that you never actually get to see the sun; you're just in a perpetual kind of dusk or night. Then, once we got out into space, some of the initial shots had a sun in them, and it was like, well, so are they just trapped on the dark side of the planet at all times?

We went back and forth about how to show whether they're close to or far from the ring without shadows, and this became a very difficult concept to deal with. One of our compositors came up with a look where you just had the sun raking across the rings, with the space station in the shadowed section and this open, bright section away from it.
We talked with Fede about it and how this actually helps the idea of: we're in danger, we're in the dark section, and safety is the light. So, as the sequence goes on, you're tracking closer and closer to that area, and where she's trying to head is into the sunlight; if you go backwards, you fall deeper and deeper into the shadow of the planet. It was this use of lighting to try and help tell that story of, again, safety versus danger. Everything is being destroyed here, and everything is nice and intact out there. It was a nice thing that the rings really helped guide that idea.

b&a: It's a really nice connection to the safety/danger thread in your work.

Daniel Macarin (VFX supervisor): We weren't actually intending it, and then he showed the first version and it was like, that's it. That's it.

The post Adding to the Offspring, and crashing a massive space station appeared first on befores & afters.
  • How One of Us gave a stop-motion feel to their digital visual effects on Beetlejuice Beetlejuice
    beforesandafters.com
Behind the scenes of the Sandworm, the shark attack, and those face-stretching shots.

Beetlejuice Beetlejuice from Tim Burton includes around 300 visual effects shots from One of Us, relating to a few different scenes. For the Sandworm, One of Us augmented stop-motion animated plates by Mackinnon & Saunders. The VFX studio did something similar for the plane crash, while also orchestrating a CG ocean that had the look of stop-motion. Finally, One of Us crafted a series of disturbing face-stretching effects for a group of influencers during the film's finale.

Here, One of Us visual effects supervisor James Brennan-Craddock, who worked with production visual effects supervisor Angus Bickerton, shares with befores & afters the different techniques used to bring the old-school aesthetic audiences delighted in from the 1988 original film into this new adventure.

Enter the Sandworm

Early on, One of Us produced a pitch video of a fully CG Sandworm to show Angus Bickerton. The idea was to replicate a stop-motion or miniature style in the animation, inspired by what had been done for the Sandworm in the first film. "We built a digi-double of a puppet," outlines Brennan-Craddock. "Instead of skin and muscle, it was latex and foam. We then had very talented animators working on the test to give it that sense of stop-motion and that incredibly charismatic, imperfect jitter people are familiar with. We were even trying to think in terms of where fingers would go and how fingers would move the puppet."

Ultimately, a full stop-motion hand-animated approach was employed for the film (orchestrated by Mackinnon & Saunders). One of Us was still heavily involved in Sandworm shots, however, for the wasteland sequences and the scenes at the church. For the wasteland moments, in which the characters enter a portal out to a desert-scape, production filmed with a small square patch of sand and rocks.
A yellowy-orange screen was placed around the set to provide a sandy-like spill, and a wind machine was utilized as well. A door set piece was also there on set.

"The first thing we did was postvis," notes Brennan-Craddock. "Because we knew Mackinnon & Saunders were going to animate a stop-motion version, we provided them with stabilized versions of the postvis shots that they could use as reference. On their side, they use Dragonframe for the animation, so they could overlay our postvis on their monitors. We also gave them a whole bunch of extra data. We figured out distances from camera to subjects in real-world scale and translated that to stop-motion scale. For any moving plate, anything with a pan, we stabilized the plate and gave them a massive overscan version and a frame for where the plate should be. They would then match to that."

"We would then invert that transform to get that back into our shots," continues Brennan-Craddock. "The funny thing was, the stop-motion plates we got back, we then had to body track to be able to create the effects to interact with the Sandworm and get the correct lighting and shadows."

One of Us then took care of the Titan environment and sand interaction. Seeking to pay homage to the original film's look and feel, elements like sand needed to be particularly art-directed to ensure the sand still felt practical. Explains Brennan-Craddock: "Step one was, we just looked at the original film for the look of the sand effects. When the Sandworm breached the desert surface, sand would be thrown up that looked a lot like a bucket's worth of sand being thrown in front of a blue screen. So in Houdini, we scaled down our simulations to that kind of bucket-sized scale. For a while, we were playing with almost a wet sand look, creating cracks and clumps in the ground where the grain was separating."
"There were little trails coming off, too, and we mixed in some 2D elements on top just to give certain hero shots a bit of extra character.

"Our environment again was inspired by what was in the original film, but Tim wanted something that felt grander. So alongside the abstract rock formations (which we called 'noodle rocks') seen in the original, we extended dunes out and built larger mountain ranges on the horizon. Aesthetically, we aimed for something that felt like a miniature or slightly cheesy set, with a little bit of realism in the proportions and scope of the environment."

One challenge One of Us faced related to the (intended) lack of motion blur on the stop-motion animation. "The Sandworm itself is stop-motion and is jittery and has no motion blur," says Brennan-Craddock. "But that doesn't mean everything else had that look. You couldn't do a stop-motion sand explosion. You would just use an element. So we needed to keep that in mind: anything that could be filmed we'd render with motion blur; anything that would be animated would not."

For the church Sandworm scene, that is, when it makes an appearance at the wedding, One of Us was also responsible for integrating Mackinnon & Saunders' stop-motion animation into live-action plates. "The challenge with that was that physically it shouldn't work at all!" observes Brennan-Craddock. "The worm is quite a lot bigger than anything in that church. On set, Tim Burton literally had a meter-long green toy snake with which he was acting out the Sandworm's motions. We had to work out, how do you translate that to a 20-meter-long snake crashing through a 10-meter-wide room? Postvis helped us here. We used our test Sandworm asset and figured out the action. We would re-scale and re-frame the shots as much as we could without breaking the illusion."

A further complication came from the fact that the on-set lighting in the church was deliberately made to oscillate between blue, teal, green and cyan.
So, Mackinnon & Saunders actually produced their stop-motion in two passes: a green-lit one and a blue-lit one. One of Us then incorporated the passes "by blending between the two to match what was going on in the real-world lighting in the plate," states Brennan-Craddock.

Orchestrating a plane crash

One of Us again collaborated with Mackinnon & Saunders to reveal what happens to Charles Deetz's plane, and the subsequent shark attack in the ocean. "Their team animated Charles and the passengers, and they built a little of the aircraft interior," details Brennan-Craddock. "Together, we worked with them to figure out the design of that, and then we extended the interior."

A significant aspect of the plane crash work was the ocean, which was realized as a digital environment. The idea here was for One of Us to produce an ocean that appeared as if it had been completed practically. "The way we dealt with that was to split the ocean into foreground, midground and background," says Brennan-Craddock. "Each of those depths had different requirements. The foreground was where Charles would be and you'd see him interacting with the water. We had to think, how would you get a character into some water in a stop-motion environment? The answer we came up with was plasticine. You could smush it around a person, and you could create, with your thumbs, waves and splashes. So, in Houdini, we developed a plasticine style for water."

This involved making the fluid base very viscous. Then, on top of that base, One of Us effects artists created a system to add physical details. "The system would create fingerprints and tool marks in fast-moving areas, i.e. areas that would have been manipulated by an animator," advises Brennan-Craddock. "In terms of splashes, we treated that as a separate thing where we essentially made little teardrops that would emit from Charles splashing around."
"Bigger details in a small enough quantity that they looked like they could feasibly be added one by one by an animator to denote splashes. On top of that, we had a whitewater generator, which we built to look like a little sheet of cellophane being tracked from position to position. The look of that changed over time to become more resin-y, more transparent and subsurface-y by the end."

The mid-ground water was the bulk of the water in the frame. "Obviously you wouldn't create a 10-meter-by-10-meter chunk of plasticine and try and animate that; that would be madness," notes Brennan-Craddock. "But we found a reference quite early on, a short film called Two Balloons, that created this amazing mechanical ocean, which was basically a cloth on pistons. The pistons moved in a sinusoidal motion, pushing the top up to create waves. Inspired by that, we created a system of virtual cloth on screws: the screws would spin and push up on the cloth to create traveling waves, and we had different screws at different depths with different frequencies to create a forced-perspective look. That's because, again, you wouldn't create a gigantic stop-motion contraption; you'd create a smaller one with a forced-perspective approach in real life. So we did a similar thing there."

When the shark bites Charles, it becomes a comical moment showcasing a large amount of blood that then immediately cuts back to his wife Delia discussing the crash and shark bite. Here, a Mackinnon & Saunders stop-motion animated Charles and shark were augmented by One of Us to add some very stop-motion-looking blood. "We replicated the look of resin, polystyrene, plasticine and cellophane for that," says Brennan-Craddock. "Then we had a 2D element on top that hits the screen and starts to come back down again, horror-film style."

Face-stretching

In the first film, replacement animation was utilized to showcase some of the characters being able to stretch their faces.
A similar moment occurs with the influencers inside the church. For this scene, One of Us built 22 influencer digi-doubles based on cyberscans of the on-set actors, which were then subjected to a range of face-stretching and body-morphing effects. "It was the biggest single thing we worked on for the film in terms of artist hours," describes Brennan-Craddock. "The influencers were all very elaborate. They were in their Sunday best with shiny, reflective clothes, a lot of accessories and crazy hair. It was a fun challenge for the assets team; it's not often that groom artists get to groom mohawks or elaborate haircuts."

With that digital base, One of Us then embarked on the face-stretching. "Initially," says Brennan-Craddock, "it was going to be much slower and much more painful. We were looking at how to squeeze a body through a screen-sized hole. With that same kind of practical aesthetic of the film, we decided it could feel like latex or silicone prosthetics of some kind, and that the characters could feel like dummies, almost like ragdolls. So, when they get pulled through the hole, there's cloth-like stretching and folding as they are pulled in."

Later, the face-stretching effect was revised to be more of a longer face stretch, with the body contortion only happening over a few frames. "Tim was very keen that the effects feel painful, but at the same time, it's also a kind of horror/comedy," advises Brennan-Craddock. "To do that, we added a lot of variety in how the faces distort and stretch. Sometimes we led by the nose, or by the chin, or by the forehead. We realized that a lot of the pain came from the character's expression, whereas the comedy came from the shape of the stretch.

"To then make it feel a bit more unsettling, a bit more practical, we'd add a lot of skin-stretch or cloth-stretch effects in the softer parts of the face, e.g. the cheeks or the eyelids or the lips."
"We'd add some jittering and vibration to make it feel painful, or like the characters were resisting."

With the characters wearing so many accessories, One of Us used this as an opportunity to add a bit of fun variation, like earrings or necklaces being pulled forward, alongside popping eyeballs. The final effects were a mix of CG blend shapes, effects sims, animation, sculpting and compositing. Advises Brennan-Craddock: "Essentially, they were transitions within the shot from a live-action face to a CG face. So we came up with a whole bunch of clever AOVs and displacement maps to pipe into comp. But it also came down to comp eyeballing it and artistically blending frame by frame. There's a little bit of 2D plate warping as well to help with that, and a lot of plate retiming and reconstruction. But it was a real artful process. We had fun dailies where people were presenting their bizarre stretching faces."

The post How One of Us gave a stop-motion feel to their digital visual effects on Beetlejuice Beetlejuice appeared first on befores & afters.
  • See the speaker line-up at IAMAG Master Classes 25
    beforesandafters.com
Includes artists from Arcane, Avatar: The Way of Water and The Wild Robot.

IAMAG has released its program for IAMC25, where artists in illustration, animation, VFX, and games will present on projects such as Arcane, Avatar: The Way of Water, Love, Death & Robots and The Wild Robot.

The event takes place March 7-9, 2025, in Paris.

Here's a list of just some of the speakers:

Dylan Cole, production designer, Avatar: The Way of Water
Daniel Cacouault, concept artist for Netflix's Love, Death & Robots and The Wild Robot
Ian McQue, concept artist
Christian Alzmann, concept artist
Ash Thorp, director
Allan McKay, visual effects supervisor

And many more, including Aaron Limonick, Alex Alice, Alessandra Sorrentino, Armel Gaulme, Azusa Tojo, Ayran Oberto, Christophe Lautrette, Craig Mullins, Daniel Orive, Daria Schmitt, Didier Graffet, Ed Laag, Ehsan Bigloo, Feng Zhu, Frédéric Pillot, Gaelle Seguillon, Gaetan Brizzi, James Paick, Jama Jurabaev, Julie Melan, Julien Delval, Julien Gauthier, Kenny Carvalho, Marc Simonetti, Mike Morris, Nicolas "Sparth" Bouvier, Nicolas Weis, Raphael Lacoste, Shaddy Safadi, Stan Manoukian, Stephan Martiniere and Thomas Scholes.

There will also be a special presentation from Fortiche on the making of Arcane season 2.

Early Bird tickets are available until January 30, 2025.

Find out more about IAMC25 here: https://itsartm.ag/iamc25-journey

The post See the speaker line-up at IAMAG Master Classes 25 appeared first on befores & afters.