Animation World Network – the largest animation and visual effects-related publishing group on the Internet

Recent Updates

Sonic the Hedgehog 3 Featurettes and Character Posters Released
Keanu Reeves stars as Shadow the Hedgehog, a mysterious villain against whom Sonic, Knuckles, and Tails must reunite, forging an all-new, highly unlikely alliance with (dare we say the name) Dr. Robotnik! The animated/live-action feature hits theaters December 20.

Metaphysic De-Ages Robert Zemeckis' Here Via Generative AI
Miramax's recently released feature, Here, directed by Robert Zemeckis and based on the graphic novel by Richard McGuire, portrays multiple families over time, in the special place they inhabit. To help convey the characters at various ages throughout their lives, VFX Supervisor Kevin Baillie enlisted creative studio Metaphysic to digitally augment performances via its proprietary generative AI technology. The process involved a talented team of digital character artists led by Metaphysic's VFX Supervisor, Jo Plaete.

"Working with forward-thinking filmmakers like Bob Zemeckis and Kevin Baillie was a privilege," said Plaete. "Their faith in this novel technology pushed us to new heights and allowed us to deliver on their ambitious vision."

Metaphysic's proprietary process involves training a neural network model on a reference input, with artists refining the results until the model is ready for production. Multiple models were trained for each actor to meet the diverse needs of the film; Tom Hanks is portrayed at five different ages, Robin Wright at four, and Paul Bettany and Kelly Reilly at two each. Plaete and his team were able to run the workflow in real time during filming, providing a visual reference of what the performers would look like. This allowed Zemeckis to view both the raw camera feed and the digitally augmented feed with the actors' younger faces while on set, with only about a six-frame delay, and provide direction accordingly.

Metaphysic also set up a camera and monitor system during the shoot that allowed the actors to rehearse while seeing themselves as their younger selves in real time. The "youth mirror" system had only a two-frame delay and provided the actors with feedback that helped them fine-tune their performances to better match their younger selves.

While the neural network models used for the real-time outputs generated photoreal results, Metaphysic artists then further enhanced them to hold up to cinematic 4K standards.

Source: Metaphysic

Journalist, antique shop owner, and aspiring gemologist, L'Wren brings a diverse perspective to animation, where every frame reflects her varied passions.

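The delayed on-set monitoring described above can be pictured as a small buffering loop around a per-actor, per-age face-swap model. The following is a minimal, hypothetical Python sketch of that idea only; the model interface, the fixed-delay buffering, and all names are assumptions for illustration, not Metaphysic's proprietary pipeline.

from collections import deque

class AgeSwapMonitor:
    """Hypothetical 'youth mirror' feed: augmented frames emitted a few frames late."""
    def __init__(self, face_swap_model, delay_frames=6):
        self.model = face_swap_model   # assumed stand-in for one trained per-actor, per-age network
        self.delay = delay_frames      # the article cites ~6 frames for the director's monitor
        self.pending = deque()

    def feed(self, raw_frame):
        """Queue one camera frame; emit the augmented frame once latency is covered."""
        self.pending.append(self.model.infer(raw_frame))  # neural face swap on this frame
        if len(self.pending) > self.delay:
            return self.pending.popleft()  # runs a fixed few frames behind the raw feed
        return None                        # still filling the latency window

In practice the latency would come from asynchronous inference rather than an artificial queue; the point of the sketch is simply that the raw and augmented feeds run side by side with a small, constant offset.
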
DreamWorks Drops How to Train Your Dragon Featurette
Audiences can finally get their first look into the making of How to Train Your Dragon, the live-action reimagining of the first film in the hugely successful animated franchise launched by DreamWorks Animation back in 2010.

The new film, hitting theaters June 13, 2025, is written, produced, and directed by three-time Oscar nominee and Golden Globe winner Dean DeBlois, who, along with The Wild Robot director Chris Sanders, directed the original. It is also produced by three-time Oscar nominee Marc Platt (Wicked, La La Land) and Emmy winner Adam Siegel (Drive, 2 Guns). Christian Manz is serving as production VFX supervisor; VFX studios on the film include Framestore and Clear Angle Studios.

The film is set on the rugged isle of Berk, where Vikings and dragons have been bitter enemies for generations, and where Hiccup (Mason Thames; The Black Phone, For All Mankind) stands apart. The inventive yet overlooked son of Chief Stoick the Vast (Gerard Butler, reprising his voice role from the animated franchise), Hiccup defies centuries of tradition when he befriends Toothless, a feared Night Fury dragon. Their unlikely bond reveals the true nature of dragons, challenging the very foundations of Viking society.

With the fierce and ambitious Astrid (BAFTA nominee Nico Parker; Dumbo, The Last of Us) and the village's quirky blacksmith Gobber (Nick Frost; Snow White and the Huntsman, Shaun of the Dead) by his side, Hiccup confronts a world torn by fear and misunderstanding. As an ancient threat emerges, endangering both Vikings and dragons, Hiccup's friendship with Toothless becomes the key to forging a new future. Together, they must navigate the delicate path toward peace, soaring beyond the boundaries of their worlds and redefining what it means to be a hero and a leader.

The film also stars Julian Dennison (Deadpool 2), Gabriel Howell (Bodies), Bronwyn James (Wicked), Harry Trevaldwyn (Smothered), Ruth Codd (The Midnight Club), BAFTA nominee Peter Serafinowicz (Guardians of the Galaxy), and Murray McArthur (Game of Thrones).

Inspired by Cressida Cowell's New York Times bestselling book series, DreamWorks Animation's How to Train Your Dragon franchise earned four Academy Award nominations and grossed more than $1.6 billion at the global box office. How To Train Your Dragon is part of the Filmed For IMAX Program, which offers filmmakers IMAX technology to help them deliver high-quality, immersive movie experiences to audiences around the world.

Source: NBC Universal

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.

Avatar: The Last Airbender Rounds Out Season 2 Cast
Chin Han, Hoa Xuande, Justin Chien, Amanda Zhou, Crystal Yu, Kelemete Misipeka, Lourdes Faberes, and Rekha Sharma join the Netflix show, currently in production in partnership with Nickelodeon.

Hiccup and Toothless Return in Live-Action How to Train Your Dragon Teaser Trailer
Audiences can finally get their first look at How to Train Your Dragon, the live-action reimagining of the first film in the hugely successful animated franchise launched by DreamWorks Animation back in 2010. Like so many productions, it was delayed by last year's SAG-AFTRA strike, but is now set to fly into theaters June 13, 2025. The new film is written, produced, and directed by three-time Oscar nominee and Golden Globe winner Dean DeBlois, who, along with The Wild Robot director Chris Sanders, wrote and directed the original; the full synopsis, cast, and credits appear in the featurette item above.

Source: NBC Universal

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.

Watch: VTuber BearTrice Kumano's Nibi-iro Kirameki Music Video
Bandai Namco Music Live releases the video, animated by Akio Takami, which features a mix of cel animation, 3DCG, and live performance.

Ice Age 6 Replaces Star Wars on Disney's 2026 Theatrical Slate
An untitled film in the long-running sci-fi fantasy franchise, set for a December 18, 2026 release, has just been replaced by the 3DCG film that resurrects the Ice Age theatrical franchise.

Paul Lambert Returns to Arrakis for Dune: Part Two VFX
After striking Oscar gold in the spice fields of Arrakis for the visual effects in Dune, Paul Lambert returns to helm the VFX on the sequel, Dune: Part Two, which further expands the storytelling and the epic vision of filmmaker Denis Villeneuve. The story picks up with House Harkonnen taking over the prized planetary possession and decimating House Atreides, which has sought refuge with, and gathers rebellion support from, the Indigenous desert dwellers known as the Fremen. 2,156 shots were produced by DNEG, Wylie Co., and Territory Studio, with concept art provided by Rodeo FX and previs by MPC.

"Denis Villeneuve sees the movie, and of course there are times he'll embrace a slightly different approach, but he knows what he wants," states the production VFX supervisor. "That's what makes it such a pleasure and joy to work with him. Because you know when you do a particular shot that the background isn't going to become something completely different. We have a certain trust with each other as to how we approach the visual effects. I know that if there has been a concept, we're going to stick to that, and that allows me to set up the shots in a way which is good for the overall composite. The idea of shooting sand screens came from me knowing what that background, or a proxy version of it, was going to be."

Unlike Ridley Scott, who is known to use as many as 12 cameras at one time, Villeneuve and cinematographer Greig Fraser favor framing and lighting for a single camera. "Honestly, from a visual effects standpoint it's a godsend!" laughs Lambert. "The moment you start to add additional cameras, especially four, you know that there are going to be certain cameras that will always be compromised because you can't plan for four cameras." Great attention was paid to getting the desired backlight for each shot. According to Lambert, "I introduced Greig to the world of LiDAR scanning apps on an iPhone. When we were going on another recce you would look around and ask, 'Where's Greig?' And he'd be over somewhere scanning the earth or the actual rock structure. That geometry was brought into Unreal Engine."

Shadows were critical for the Harkonnen harvester attack that takes place during daylight, in the middle of the desert, so special effects supervisor Gerd Nefzer and his team had industrial tractors hold large black screens in designated areas. "It's all about sun and shade," notes Lambert. "We needed to think of a visual way to understand what that shadow would do. We had an iPad with custom software from DNEG where we had a huge spice crawler in there. But we were also able to cast a shadow from a spice crawler. We could look at where the camera was going to be, see the structure and see where the shadow was on the ground. We definitely don't want to be running [the characters in the film] into this particular area because we don't have a real shadow for that. One of the main rules I had with Denis was, to try to keep things believable, I never want to change the lighting on a character. If we shoot in the daylight and I try to make Paul look as if he's in shadow, it will look wrong. There's nothing I can do to make that look correct. Another rule in the desert is we never step through previous footprints, which meant whenever we destroyed an area with footprints we would move over. If we couldn't move over to shoot, we would then rake the sand. Trying to simulate that is a big old problem."

Given the time of year in the United Arab Emirates, it was extremely difficult locating a backlit dune with the correct wind direction for the iconic moment where Paul Atreides rides a sandworm. "We found one, and that was the dune we would run along and then replicate to create the cascade," explains Lambert. "But also, when you see some closeups of Paul running on top of the dune, we had to replicate the peak of the dune because what we needed to do is create a physical collapsible dune. What we came up with were these three huge steel tubes attached to industrial tractors. These were embedded into the dune we had created. A stuntie attached to a wire, with a camera on a crane behind him, would run; we would then call out for those tubes to be pulled out, the dune would collapse, then the stuntie would fall down and the camera would follow. Because of the light direction, which we needed to match the photography, we could only do it at one particular time of day. Obviously, there was a massive reset. We could do it once a day in the morning before we went off and did other things. It took us four attempts. With that element of the stuntie falling down and the camera behind kicking up sand, what I had to do was extend out the rest of the dune collapsing up ahead and the worm coming out, so that you felt as if you're way the heck higher. That's one of those things where if you get all the particulars correct it then works as a shot. Obviously, there is a lot of digital. But you have a basis that is always something real."

The decision to deploy infrared cameras to emulate the black sun exterior environment of Giedi Prime brought with it some unpredictable results. "We did a multitude of tests before we started the main shoot," recalls Lambert. "We were going to shoot it outside between the two stadiums and on white sand, so it was a high-contrast area where you had shade and this whiter than white. It was in this grey look. We tested everything. I even tested my gaffer tape for tracking markers. Cut to the day of the shoot when the fighters appeared. The three fighters appeared, and they looked great, muscular, ready to fight. And then we saw one of them through the infrared camera. They had covered his tattoos in makeup. However, in good old infrared, you get to see that. And, he had tattoos all over his body! I asked Denis, 'Does this fit the aesthetic?' He said, 'No.' I had to have Wylie Co. remove that, which was a substantial job. But it's the day of the shoot. What are you going to do? We had to shoot."

Lambert also needed to accommodate where the studio walls and buildings were casting shadows behind Feyd-Rautha and the other fighters. "I made a decision," he notes. "Rather than try to remove that - because if you go from something bright to something dark it's always going to look bad - I decided to keep those shadows and played them as if they were stadium shadows from the big towers. If you were to look at it as a whole, it wouldn't make sense, but in the fight sequence it works well."

The fireworks on Giedi Prime scenes also evolved considerably during the shoot. "When Feyd and Lady Margot Fenring are walking down the hall and she seduces him, we built a set with these huge structures internally," explains Lambert. "The idea was we had fireworks outside that would have a certain pattern that would then play on the interactive light inside, and we would extend those. But the actual fireworks, where you see the burst - that went through a big development process. They looked completely different than what you see in the film. We had this idea of seeing holes in the atmosphere, and each time one opened it would open a blackness from within the white sky. But Denis was not keen on it until my producer found this little video of ink inside water, which Denis loved, and that's basically what it became!"

Hardly veering from the original concept was the Orni Bee. "We built sections of the back of the Orni Bee for when Glossu Rabban is hanging off it and fighting the Fremen. He was holding onto a particular rig, which we built close to the ground but then played as if it was way higher. We also built the Orni Bee for when they're all taking off from Arrakeen to fight Paul out in the desert. It's a great big physical build. We didn't lift this one into the air because it was way heavier than the original ornithopter. When it's flying, we built some silks around it on the actual horizon, which we then augmented. Having a practical asset helps to inspire Denis and the actors, and in the end helps visual effects, because I always have something to actually work from."

Paul Atreides' vision of a cataclysm became a major visual effects scene. "We shot practical actors falling on the ground, who then got body parts replaced by CG to make them way thinner, while all the other characters you see in the background are all CG," states Lambert. "It became a big visual effects shot. At one point the look of that particular sequence was going to be way the heck far out there, because when we were shooting the plates Greig chose to do some close-focus work, and everything was just shapes. It would have been hard to actually get the work inside of those plates. We pulled back a little bit to what the actual visual was going to be."

Along with the visions there were holograms. "Territory Studio did the Harkonnen tabletop in the city where you have Feyd and the Baron overseeing their bombing tactics on the Fremen," shares Lambert. "It was beautifully designed. Basically, there were some initial designs from Patrice but then Territory Studio took those to the next level in designing how to visualize a war in progress, like seeing where the spaceships and trajectories were. Denis had a lot of backwards and forwards and creative discussions directly with Territory Studio trying to get his story point across. It was a good relationship."

Three puppeteers were responsible for the baby sandworm. "When the baby worm was underground, that was a special effects rig that was a ball and chain being pulled," reveals Lambert. "Those movements you see in the sand are practical. Then the actual puppeteers would puppeteer the baby worm when it wraps itself around the actress. She carries it and puts it under the water. All the puppeteers are in the water moving the puppet. What we did in CG was, whenever you see the worm above the water, we would compress the scale. But the main part of that is a puppet. We deemed that was the best approach. There were a multitude of techniques used depending on [what was required for the shot]. It's a philosophy that Denis and I have had throughout our other movies. What is the best way to actually make something believable? To make sure that you don't know that I've done anything to it!"

Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.

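The DNEG iPad shadow-preview tool Lambert describes boils down to projecting a vehicle's geometry along the sun direction onto the ground so the crew can see where a digital shadow will fall. Below is a minimal, hypothetical Python sketch of that projection under a flat-ground assumption; it illustrates the idea only and is not DNEG's software.

import numpy as np

def project_shadow(vertices, sun_dir):
    """Project mesh vertices along the (downward-pointing) sun direction
    onto the y=0 ground plane, giving the shadow footprint."""
    sun_dir = sun_dir / np.linalg.norm(sun_dir)
    t = vertices[:, 1] / -sun_dir[1]         # ray parameter where each vertex ray hits y=0
    return vertices + t[:, None] * sun_dir   # shadow points on the ground

# Example: a point 10m up under a low sun throws a shadow 20m away.
print(project_shadow(np.array([[0.0, 10.0, 0.0]]), np.array([0.6, -0.3, 0.0])))

With the footprint in hand, a planner can walk a virtual camera around the set and check which ground areas fall inside the digital shadow, which is exactly the "don't run the characters through there" check Lambert describes.
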
Paramount Pictures Shares Final Gladiator II Trailer
We're loving the battle rhino in the visually stunning, often brutal ancient Roman world of Sir Ridley Scott's Oscar-winning Gladiator sequel, coming to theaters November 22.

FMX 2025 Shares First Program Updates
FMX Film & Media Exchange returns once again May 6-9, 2025, for the 29th edition of the conference on animation, effects, interactive, and immersive media. Plans are underway for another stellar program, and organizers have shared their first update on what's coming. Here are the highlights:

FMX 2025: RHYTHM OF CHANGE
The theme for 2025 is RHYTHM OF CHANGE, prompting the question: How can we ensure that production technologies enhance and support the goals and intentions of the creators who keep us human and connected through their art? Look for discussions about the growing number of sharing platforms, open standards, and other initiatives that have recently emerged, with a special focus on cross-platform collaboration, pipelines, and distances both physical and subjective, from interoperability to the disruptions shaking up film production.

Paul Debevec joins FMX 2025 as Program Chair
Paul Debevec is the chief research officer at Netflix's Eyeline Studios in Los Angeles, overseeing R&D for visual effects and virtual production with computer vision, graphics, and machine learning. In 2002, Debevec's Light Stage 3 system pioneered the virtual production technique of surrounding actors with color LEDs to display images of virtual environments for lighting-accurate compositing. Debevec's techniques for photogrammetry, HDR imaging, image-based lighting, and photoreal digital actors have been used to create key visual effects sequences in The Matrix, Spider-Man 2, Benjamin Button, Avatar, Gravity, Oblivion, Maleficent, Furious 7, Blade Runner 2049, Gemini Man, Free Guy, and numerous video games.

Debevec's work has been recognized with two Academy Awards for Scientific and Technical Achievement, the Progress Medal from the Society of Motion Picture and Television Engineers, and, in 2022, the Charles F. Jenkins Lifetime Achievement Emmy Award. He is a Governor of the Visual Effects Branch of the Academy of Motion Picture Arts and Sciences, a Fellow of the Visual Effects Society, and an Adjunct Research Professor at the University of Southern California.

"We are honored that the Master of Light himself, Dr. Paul Debevec, has agreed to be our program chair for FMX 2025," said FMX Conference Chair Dr. Jan Pinkava. "He has been instrumental in creating an entire discipline of image capture and image making, with rigorous and wide-ranging applications throughout Visual Effects, Virtual Production and beyond. Paul's network spans academia and industry at the highest level and we are excited to have his invaluable guidance. On top of that, he's a really nice guy!"

Source: FMX 2025

Debbie Diamond Sarto is news editor at Animation World Network.

Animaj Introduces Sketch-to-Motion AI Tool
Animaj has launched Sketch-to-Motion, an AI tool that turns rough sketches into fully rendered 3D animations with one click. Built on Animaj's advanced AI models, the tool operates in a fully editable 3D environment, enabling seamless creative adjustments at any stage of the process. The system is built around two main types of models: sketch-to-pose and 3D motion in-betweening.

Sketch-to-pose models use a ResNet50 architecture and are trained on hundreds of thousands of pairs of sketches and their corresponding 3D poses for each character. These models can accurately predict the rig controller values of a character from any input sketch.

Motion in-betweening models focus on predicting the intermediate poses between the key poses of the characters, which are typically spaced a few frames apart. These models leverage a Long Short-Term Memory (LSTM) architecture to ensure the movement remains consistent over time.

"Sketch-to-Motion changes everything about how we create animations," said Sixte de Vauplane, CEO and Co-founder of Animaj. "It's not just faster; it's smarter. Artists now have the freedom to iterate endlessly and see their ideas come to life instantly. This tool isn't replacing creative artists; it's empowering them."

See it in action - watch Sketch-To-Motion: Redefining Animation.

Tailored for high-end projects like the upcoming Pocoyo Season 6 (a hit children's series with over 40 billion views on YouTube), this innovation, according to Animaj, highlights AI's powerful potential to streamline top-tier productions with demanding quality standards. It's worth noting that Animaj is not licensing this technology but is instead keeping it exclusive to accelerate the growth of IPs it acquires, creates, or co-creates, ensuring that Sketch-to-Motion drives the creation and expansion of world-class franchises.

Animaj is also currently developing an entire set of AI tools designed to transform artists into all-in-one creators who can orchestrate every aspect of their work, from camera positioning and lighting to facial expressions and nuanced character movements.

Source: Animaj

Journalist, antique shop owner, and aspiring gemologist, L'Wren brings a diverse perspective to animation, where every frame reflects her varied passions.

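Reading the two model families literally, one can picture them roughly as in the PyTorch sketch below: a ResNet50 with a regression head for sketch-to-pose, and an LSTM over pose sequences for in-betweening. The controller count, layer sizes, and training setup here are assumptions for illustration, not Animaj's actual architecture; a real in-betweener would also condition on key timing and emit the dense frames between keys.

import torch
import torch.nn as nn
from torchvision.models import resnet50

NUM_RIG_CONTROLLERS = 256  # hypothetical size of one character's rig-controller vector

class SketchToPose(nn.Module):
    """Regress rig controller values from a rasterized sketch (ResNet50 backbone)."""
    def __init__(self, num_controllers=NUM_RIG_CONTROLLERS):
        super().__init__()
        backbone = resnet50(weights=None)  # trained from scratch on sketch/pose pairs
        backbone.fc = nn.Linear(backbone.fc.in_features, num_controllers)
        self.net = backbone

    def forward(self, sketch):             # sketch: (B, 3, 224, 224)
        return self.net(sketch)            # (B, num_controllers)

class MotionInbetweener(nn.Module):
    """Predict poses along a sequence with an LSTM, as the article describes; simplified."""
    def __init__(self, pose_dim=NUM_RIG_CONTROLLERS, hidden=512):
        super().__init__()
        self.lstm = nn.LSTM(pose_dim, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)

    def forward(self, key_poses):          # key_poses: (B, T, pose_dim)
        h, _ = self.lstm(key_poses)
        return self.head(h)                # per-step pose predictions

The appeal of this split is that each stage stays editable: the sketch-to-pose output is an ordinary rig pose an artist can tweak, and the in-betweener only fills the gaps between poses the artist has approved.
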
Unreal Engine 5.5 Now Available
Epic Games has released Unreal Engine 5.5, featuring major enhancements to animation authoring, rendering, virtual production, mobile game development, and developer iteration toolsets. Here are some highlights:

Animation - UE 5.5's new features and enhancements facilitate high-fidelity in-editor animation authoring workflows, and there are additions to the animation gameplay authoring toolset.

Sequencer - Unreal Engine's nonlinear animation editor now boasts a more controllable interface with better filtering and easier access to properties.

Animation deformers - Users can craft more realistic animation effects, such as contact deformation or better cartoon-style squash-and-stretch, with the new ability to author animatable deformers inside Control Rig and easily apply them to characters in Sequencer with a single click.

Modular Control Rig - Modular Control Rig moves to Beta with UI and UX improvements, new quadruped and vehicle modules, and support for common bipedal skeleton types, while the Skeletal Editor is now Production-Ready with improvements that include quicker and simpler workflows for painting and editing weights.

MetaHuman Animator - Part of the MetaHuman Plugin for Unreal Engine, MetaHuman Animator's upgrades include an Experimental feature allowing users to generate high-quality facial animation, including inference of upper-face gestures, from audio performances alone.

Mutable character customization - Game developers whose projects require content that changes dynamically at runtime will benefit from the addition of the Mutable character customization system. It can be used to generate dynamic skeletal meshes, materials, and textures for characters, animals, props, and weapons while optimizing memory usage, keeping shader cost low, and reducing the draw call count.

Choosers - Now Production-Ready, Choosers offers a framework for selecting animations for playback based on game context without having to write complex logic. This game-context asset selector can be used to select nearly any type of asset, encompassing multiple levels of complexity, from simple random selectors to database-driven logic involving thousands of animations (see the generic sketch after this article).

Lumen - Now runs at 60 Hz on platforms with supported hardware, thanks to improvements to the systems that underpin hardware ray tracing (HWRT). These improvements also benefit the performance and capabilities of Path Tracer and light baking.

Path Tracer - The DXR-accelerated, physically accurate progressive rendering mode is now Production-Ready for creating final pixels for nonlinear applications or fully featured ground-truth reference images. This release sees a series of performance and fidelity improvements, Linux support, and support for all other Production-Ready features, including sky atmosphere and volumetric clouds.

Substrate - The material authoring framework introduced as Experimental in Unreal Engine 5.2 moves to Beta. All features of legacy materials are now supported, as are all platforms to which UE deploys.

Movie Render Graph (MRG) - Introduced as Experimental in Unreal Engine 5.4, Movie Render Graph moves to Beta in this release, with further investment in the graph-based configuration workflow.

MegaLights - This release offers a sneak peek at an Experimental new feature called MegaLights. Already being dubbed "the Nanite of lights," MegaLights enables users to add hundreds of dynamic shadow-casting lights to scenes without constraints. Lighting artists, for the first time, can freely use textured area lights with soft shadows, light functions, media texture playback, and volumetric shadows on consoles and PC, focusing on artistic considerations rather than performance impact.

Virtual production - Unreal Engine's dedicated in-camera visual effects (ICVFX) toolset powers a myriad of productions in film, television, and commercials internationally. UE 5.5 sees the accumulated investment across multiple releases bring the ICVFX toolset to full production-readiness, as well as advances in other features for virtual production and visualization.

SMPTE 2110 - Unreal Engine's support for SMPTE 2110 includes numerous stability improvements; automatic detection and repair of framelock loss; the ability to use PTP as a timecode provider; OCIO support for 2110 media; and other improvements to IP video signal flow. It is ready to meet the needs of real-world ICVFX projects as they transition to SMPTE 2110 deployments.

Camera Calibration - Production-Ready with UE 5.5 is the Camera Calibration solver, with improved accuracy for lens and camera parameter estimation. Stemming from this work, Overscan is now built into all cameras, to support use cases like rendering with lens distortion or adding camera shake in post.

Virtual Scouting - The updated Virtual Scouting toolset introduced in UE 5.4 is now Production-Ready, offering an out-of-the-box experience using OpenXR-compatible HMDs (with Oculus and Valve Index supported by default) and new opportunities for customization via an extensive API. The toolset now features a new VR Content Browser and asset placement, a Transform Gizmo that is customizable via Blueprint, and further polish, including a color-correct Viewfinder.

Color Grading Panel - Previously part of the ICVFX Editor, the Color Grading Panel is now available for general use in the Unreal Editor, providing an artist-friendly interface for creative color manipulation in any Unreal Engine scene. The panel now also supports post-process volumes, cine cameras, and color correction regions.

DMX - With applicability not just within virtual production but also in broadcast and live events, Unreal Engine's DMX tech stack joins the list of Production-Ready toolsets, with enhancements to the Control Console, Pixel Mapping, and the Conflict Monitor. This release also adds GDTF compliance to the DMX Plugin for interfacing with GDTF- and MVR-enabled control devices and software, among other enhancements.

Mobile game development - The Mobile Forward Renderer's new features increase visual fidelity on the platform; it now supports D-buffer decals, rectangular area lights, capsule shadows, movable IES textures for point and spotlights, volumetric fog, and Niagara particle lights. Screen-space reflections now work in both the Mobile Forward and Deferred Renderers. Mobile Previewer improvements help with content development for mobile games, including the ability to capture and preview a specific Android device profile and to emulate half-precision 16-bit float shaders, making it easier to detect and deal with artifacts.

Visit the Unreal Engine blog for more detailed information.

Source: Epic Games

Debbie Diamond Sarto is news editor at Animation World Network.

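As a way to picture the Choosers concept above - a table of context conditions mapped to assets, queried instead of hand-written branching - here is a generic, hypothetical Python sketch. It illustrates the selection pattern only; it is not Epic's actual Chooser API, which lives in UE assets and C++/Blueprint.

from dataclasses import dataclass

@dataclass
class ChooserEntry:
    conditions: dict   # e.g. {"stance": "crouch", "terrain": "ice"}
    asset: str         # animation to play when all conditions match

def choose(entries, context):
    """Return the first asset whose conditions are all satisfied by the game context."""
    for entry in entries:
        if all(context.get(key) == value for key, value in entry.conditions.items()):
            return entry.asset
    return None

table = [
    ChooserEntry({"stance": "crouch", "terrain": "ice"}, "Crouch_Slide"),
    ChooserEntry({"stance": "crouch"}, "Crouch_Walk"),
    ChooserEntry({}, "Idle"),  # catch-all fallback, checked last
]
print(choose(table, {"stance": "crouch", "terrain": "ice"}))  # -> "Crouch_Slide"

Ordering entries from most to least specific keeps the lookup declarative: adding a new animation means adding a row, not another branch of if/else logic.
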
Taika Waititi's A Disney Holiday Short: The Boy & The Octopus Debuts
Disney has just released an all-new short film, A Disney Holiday Short: The Boy & The Octopus, in collaboration with Oscar-winning filmmaker Taika Waititi, premiering today on YouTube.

The film follows the journey of a child who discovers a curious octopus has attached itself to his head during a seaside vacation. After returning home, the boy forms a true friendship with the octopus by introducing his new companion to his life on land - harnessing the power of the Force with his Jedi lightsaber, playing with his Buzz Lightyear action figure, and imagining Santa Claus' route around the world with the map on his wall - before taking the lovable octopus out into the world to experience the joy of the holidays, hidden under his Mickey Mouse beanie. While watching the Disney holiday classic The Santa Clause (1994), the boy comes to understand the extent of the octopus' desire to explore everything the world has to offer, and he sets in motion a plan to make it happen. For the boy and the octopus, it is the precious everyday moments of childhood and friendship, as much as the magic of the season, that make their time together so meaningful. Numerous hidden easter eggs include nods to films like Moana (2016), Lilo & Stitch (2002), and Toy Story (1995), among others.

"The story manages to connect the feelings that you get around the holidays, and the joy, the goodwill and everything, with those same emotions and those same sensibilities you get from Disney films," said Waititi. "I think they go hand in hand and it's the perfect match, and only Disney could have made something like this with me."

The short was created in conjunction with global creative agencies adam&eveDDB and Untold Studios, and produced by Hungry Man. A melodic rendition of "Part of Your World" from the Disney classic The Little Mermaid (1989) can be heard throughout the short, highlighting the octopus' desire to explore the world above. This take on the fan-favorite song was recorded live by a 60-piece orchestra and mixed in the legendary recording studio Abbey Road.

Tim Pennoyer, Director of Brand Marketing and the marketing lead who helped spearhead the campaign, shares with AWN, "The biggest challenge in bringing A Disney Holiday Short: The Boy & The Octopus to life was creating an octopus character that felt both realistic and emotionally expressive. Working with our amazing partners in adam&eveDDB and VFX experts Untold Studios, we knew we were asking a lot: an octopus with eight tentacles that could move independently, shapeshift, and even change skin color, all while conveying the warmth and charm needed for our story. It was the most complex character Untold Studios has ever brought to life, requiring an incredible level of detail in VFX to capture those intricate movements and transformations." He adds, "The octopus had to feel both lifelike and relatable, with a full range of motion that would make the character truly magical on screen. It was a huge technical and creative challenge, but we knew it was key to making the story unforgettable. We would spend hours talking through every single scene with the octopus, debating each expression on his face, and even the sounds he would make. All these details bring him to life."

Take a few moments to enjoy the film.

"For generations, Disney has been an ever-present part of the holiday season all over the world, and this short builds on the enduring connection that so many families have with Disney during this special time of year," said Asad Ayaz, Chief Brand Officer, The Walt Disney Company. "We're thrilled to collaborate with Taika Waititi on this timeless story of childhood friendship against the backdrop of this magical season."

A Disney Holiday Short: The Boy & The Octopus is the latest creative collaboration between Waititi and The Walt Disney Company. The acclaimed filmmaker directed Marvel Studios' Thor: Ragnarok (2017) and Thor: Love and Thunder (2022), and Searchlight Pictures' Jojo Rabbit (2019) and Next Goal Wins (2023), and executive produced Hulu's Reservation Dogs, What We Do in the Shadows, and the forthcoming limited series Interior Chinatown.

Source: The Walt Disney Company

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.

Marvel Reveals Thunderbolts* Special Look Trailer
Florence Pugh stars as the down-in-the-dumps assassin leader of the MCU's most unlikely band of misfits in the Jake Schreier film, hitting theaters May 2, 2025.

Cinesite Appoints Adipat Virdi as XR Executive Director
The former Facebook Global Creative Product Lead for Immersive will lead and develop client relationships for new projects as the company expands its footprint in the immersive market.

Aldi, McCann, Psyop Team for Latest Kevin the Carrot Holiday Ad
'Tis the season for a new Kevin the Carrot holiday advert! Since its inception in 2016, Aldi and McCann Manchester have collaborated each year with industry powerhouse Psyop, with fans looking to the annual release as the initiator of the most wonderful time of the year.

Created in association with Riff Raff Films, this year's installment sends the viewer on a caper, braving the broiler, precarious mince pies, and other cuisine-related booby traps as Kevin and Katie try to save the Festive Spirit from the nefarious thieves at Humbug Headquarters. Concluding with yet another feather in Kevin's cap, the Festive Spirit is set free and fills empty tables with holiday feasts, underscoring the ever-present message of celebrating the cheer of community and the magic of bringing families together. Enjoy!

Last year's Kevin and the Christmas Factory was named 2023's most effective ad, according to System1's data, although the latest installment may be the team's most ambitious chapter yet.

"Kevin the Carrot's world just keeps getting juicier, and the team at McCann and Aldi are the zest we need to keep things fresh as we cook up new ways to expand the Kevin-verse," said Psyop directors Todd Mueller and Kylie Matulick. "With a decade of Kevin marinating in all our minds, he's got so much more room to sprout! We're ready to serve up more flavor-filled adventures and show just how far Kevin can peel ahead!"

"Kevin the Carrot is more than a character; he's become a cherished tradition and a holiday icon," added Psyop EP Jim Brattin. "Each year, we're inspired to create something even more memorable, and this season's film is truly our most enchanting yet."

Brattin executive produces, while Amy Fahl produces. VFX are created by Cristina Camacho, Steve Hallquist, and Billy Morris. Thomas Sali crafted the 2D animation, with Matthias Bauerle serving as 2D supervisor.

Source: Psyop

Journalist, antique shop owner, and aspiring gemologist, L'Wren brings a diverse perspective to animation, where every frame reflects her varied passions.

Star Wars Film Trilogy in Development at Lucasfilm
Simon Kinberg will write and produce alongside Kathleen Kennedy; set after the events of Star Wars: The Rise of Skywalker, the trilogy moves past the Skywalker Saga and will follow a new set of characters.

Mass Effect Series in Development at Amazon MGM Studios
Fast & Furious 9 writer Daniel Casey will executive produce and pen the script for the first adaptation of the fan-favorite sci-fi video game.

Disney Drops Final Trailer for Mufasa: The Lion King
Barry Jenkins' all-new film takes audiences back in time with a host of fan-favorite characters like Mufasa, Scar, Sarabi, Rafiki, and Zazu before they called Pride Rock home; the movie blends live-action filmmaking techniques with the latest photoreal CGI, and comes to theaters December 20.

CEO Stefan Danieli Resigns from Goodbye Kansas Group
The exec will remain at the company for nine months to support the search for a new leader, which the Board says has already begun.

HBO Drops First Look at Spooky Drama Series IT: Welcome to Derry
The IT prequel, from Warner Bros. Television and filmmakers Andy Muschietti, Barbara Muschietti, and Jason Fuchs, debuts in 2025 on HBO and will stream on Max.

Autodesk's Wonder Dynamics Debuts New Wonder Studio AI Tool
Powered by Video to 3D Scene technology, the Wonder Animation tool turns any video sequence into a 3D-animated scene via AI reconstruction; the beta is available now to Wonder Studio users.

Toho Greenlights New Godzilla Film
Oscar-winning Godzilla Minus One director and writer Takashi Yamazaki will return to helm the upcoming project, as well as handle the VFX.

Game of Thrones Film in Development at Warner Bros.
After years of resistance from execs, a feature set in the universe created by author George R.R. Martin has been greenlit; no plot, cast, or crew details are currently known.

Disney Establishes AI Oversight Unit for Responsible Tech Research, Advancement
A newly created Office of Technological Enablement will be headed by former Walt Disney Studios CTO Jamie Voris and will explore opportunities and risks associated with AI, mixed reality, and other current and emerging tech.

Chaos Releases V-Ray 7 for 3ds Max
Chaos has released V-Ray 7 for 3ds Max, adding Gaussian Splat support along with more than 20 new features and improvements, from creative scatter options to firefly removal.

"Version 7 continues V-Ray's leadership in both visualization and visual effects, with innovations that accelerate and extend what's possible for creative professionals," said Phillip Miller, VP Product, Solutions for Artists at Chaos. "V-Ray 7 also improves extensibility for custom pipelines while further cementing the Chaos ecosystem with design workflows from Enscape, previsualization with Chaos Vantage and virtual production with Project Arena."

3D Gaussian Splats, an immersive technique increasingly popular in creative industries, now provide users a fast way to create realistic 3D environments from photos or videos. V-Ray 7 offers the first native support for Gaussian Splats with ray-traced rendering, so users can place their scenes within detailed environments with accurate reflections and shadows, reducing the effort to put projects in context or on location.

Immersive, interactive virtual tours can be uploaded from V-Ray 7 and assembled in Chaos Collaboration to create panoramic experiences with automatic hotspot generation. Each tour can be customized by adding floor plans, personalizing hotspots and transitions, or including contextual details and design elements.

The V-Ray Frame Buffer (VFB) expansion helps users do more in one place. New custom-shaped render regions enable artists to render just one or many parts of their frame with custom-drawn shapes. A vignette-layer camera effect is now included that can also be customized by shape. The VFB also includes new color-correction presets that let users explore different looks as they refine their design.

Additional features and improvements include:

Chaos Scatter Aids
Instance Brush - The new Chaos Scatter Brush can easily populate instances within a scene with more precision. Users can add extra details or quickly remove unwanted areas.
Distribution Maps - Users can switch between different density styles using the new Chaos Scatter distribution maps library, helping designers add realistic looks and experiment with instance distribution.
Faster Scatter-Heavy Exports - Chaos Scatter is now managed procedurally by V-Ray 7 Standalone, enabling massive scattering results to export in minimal time when network rendering or submitting to Chaos Cloud.

Better Lighting
Firefly Removal - V-Ray 7's new algorithm automatically detects and finishes unresolved pixels, known as fireflies, during bucket rendering, reducing the time needed to produce final images. (A generic illustration of the idea follows this article.)
Enhanced V-Ray Sun & Sky - V-Ray 7 offers an improved PRG sky model that can create more realistic images and immersive animations. V-Ray Sky can also now render various observer altitudes, up to several kilometers.

Chaos Cosmos Upgrades
V-Ray Luminaires - V-Ray's expanding Chaos Cosmos asset library now brings more speed and accuracy to lighting fixtures, accurately distributing the light and properly illuminating the fixtures themselves. Most light fixtures within Chaos Cosmos are now Luminaires in V-Ray 7, making it easier to populate scenes with realistic lighting while also rendering results more quickly.
Asset Variants Support - Chaos Cosmos has more asset variations to choose from, such as seasonal options for 3D vegetation models (e.g., summer or autumn leaves). Users can select desired variants and drag and drop them into their scene, or alter them freely after import.

Accelerated Workflows
V-Ray Profiler - Users can track the time V-Ray spends calculating shaders and volumes, exporting scenes, compiling geometry and displacement, and loading bitmaps. Once all hotspots are located, users can optimize their pipelines for faster rendering. The Profiler can be paired with the Memory Tracker for more data and insights.
V-Ray Lister Geometry Tab - V-Ray Lister gains a new tab for managing and tweaking V-Ray geometry objects, including V-Ray Proxy, Fur, Decal, and Clipper. With advanced filtering options, artists can navigate and manage settings, controlling multiple geometry objects directly from the Lister's UI.

GPU Boosts
Faster Time to First Pixel - New optimizations for scatter rendering, texture-heavy scenes, data uploads, and geometry compilation offer better production and interactive rendering experiences, as well as a faster time to first pixel.
Caustics Support - V-Ray GPU now supports caustics, enabling realistic surface reflections and refractions in both production and interactive rendering. Based on photon mapping, the new caustics solver is optimized to fully utilize GPU hardware, delivering faster results than a CPU.
Out-of-Core Textures - Texture-heavy scenes can be rendered more efficiently, helping artists add detail to their scenes without sacrificing shading quality.

Other Updates
Selective V-Ray Scene Conversion - Users can convert selected objects, update materials with a new Material Process, or convert textures to .TX files for smoother use and better performance, all with the same toolset.
Extended USD Support - V-Ray 7 now supports the latest version of USD for Autodesk 3ds Max, 0.9.0, so users benefit from the latest features and enhancements.
OpenPBR Support - Consistent shading results are achievable across applications with the OpenPBR Material support recently introduced with 3ds Max 2025.3. This brings a new shading model to V-Ray that promises to boost production efficiency by reducing the need for manual adjustments when switching between compatible renderers and applications.

Additional information is available here.

Source: Chaos

Debbie Diamond Sarto is news editor at Animation World Network.

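Chaos does not document its firefly algorithm, so the following is only a generic sketch of what firefly removal typically means: detecting isolated, improbably bright pixels and replacing them with a neighborhood estimate. It assumes NumPy and SciPy and is an illustration of the idea, not V-Ray's actual implementation.

import numpy as np
from scipy.ndimage import median_filter

def remove_fireflies(rgb, threshold=8.0):
    """Replace isolated, improbably bright pixels with their 3x3 neighborhood median.

    rgb: float image of shape (H, W, 3); threshold: how many times brighter than
    the local median a pixel must be to count as a firefly (assumed value)."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])    # per-pixel luminance
    local_median = median_filter(luma, size=3)
    mask = luma > threshold * (local_median + 1e-4)    # outliers vs. their neighborhood
    filtered = np.stack([median_filter(rgb[..., c], size=3) for c in range(3)], axis=-1)
    out = rgb.copy()
    out[mask] = filtered[mask]                         # patch only the flagged pixels
    return out

A production renderer would do this during bucket rendering with knowledge of sample variance rather than on the final image, but the principle is the same: spend effort only on the few pixels the estimator failed to resolve.
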
IATSE Voices Support for U.S. Federal Film and TV Tax Incentives
The International Alliance of Theatrical Stage Employees (IATSE) has officially voiced support for U.S. Representative Adam Schiff's recent letter to the U.S. Bureau of Labor Statistics (BLS) and U.S. Bureau of Economic Analysis (BEA) requesting statistics regarding the United States' standing as a film and television production industry leader.

Schiff's letter to both bureaus highlights the impact of international tax policy on American jobs and the urgency of introducing a competitive, labor-based federal production tax incentive to keep film and television made in America. In his letter, Schiff states that employment in film and television production has grown abroad over the past several decades, threatening job growth at home as more foreign countries provide meaningful production incentives.

Countries like the United Kingdom, Australia, and others have recently expanded their federal incentive and subsidy structures to lure productions from the United States. This has been one of the forces attributed to a persistent economic downturn in the U.S. film and TV industry, with production in 2023 and 2024 down significantly compared to 2022. While states like California, New York, Georgia, and New Jersey have offered tax credits for production, contributing to local economies and job growth, they have not been enough, notes IATSE, to prevent productions from moving overseas.

"The proposal to implement a federal incentive would level the playing field and address this imbalance," said IATSE International President Matthew D. Loeb. "We support the concept of a federal incentive for the creation of film and TV, provided the plan also has mechanisms to uphold labor standards. We are committed to saving America's entertainment industry, and we look forward to working with our members, local unions, allies, and lawmakers at all levels to get it done."

IATSE has joined the Congressman in urging the BLS and BEA to gather and release data on the impact of foreign production incentives on U.S. jobs and local communities. The union is confident the data will reveal what members already know: that productions choose where to locate based on the incentives, infrastructure, and talent available; that productions directly and indirectly drive spending, which ripples benefits through local economies via the multiplier effect; and that legislation is urgently needed to save the cultural institution that is American film and television production.

Download a copy of Congressman Schiff's letter here.

Source: IATSE

Debbie Diamond Sarto is news editor at Animation World Network.

Universal Pictures, LEGO Team for a Trio of Live-Action Films
Jake Kasdan, Patty Jenkins, and Joe Cornish will each helm one of the untitled movies; a separate live-action LEGO Ninjago feature will be penned by brothers Kevin and Dan Hageman.

Disney+ Drops Look Ahead Teaser for Upcoming Marvel Titles
Get a glimpse at Marvel Studios, Marvel Television, and Marvel Animation titles, including Deadpool & Wolverine, Your Friendly Neighborhood Spider-Man, and What If...?, that are set to stream through 2025.

WWW.AWN.COMFramestore Serves Big Plate of Spaghettification for Deadpool & WolverineOne must wonder if Green Lantern is next on Ryan Reynolds hit list for the Merc with a Mouth, considering he resurrected the title character from the cinematic debacle known as X-Men Origins: Wolverine for his latest actioner, Marvel'sDeadpool & Wolverine. Participating in the blood-soaked comic book irreverence is Framestore which was brought onboard by Production VFX Supervisor Swen Gillberg to provide previs, onset support, techvis, postvis and digitally augment 420 shots that ranged from the brutally funny opening sequence to the psychedelic and psychotic third act.When it came to the visualisation team, Swen was excellent in providing video briefs directly from the pitch room rather than relying on written communication, we met often and discussed ideas, explains Kaya Jabar, Senior Visualisation Supervisor, Framestore. He also used our team to iterate on ideas and designs quickly, lens up concepts and present them back to the other HODs [Heads of Departments] to ensure everyone was on the same page. Clear direction and reference were provided for most of the sequences in advance. Having a direct line to Kaya made it even easier to check in to see what avenues had already been explored with Swen and Shawn Levy [director] before adding ideas of our own, states Matt Twyford, VFX Supervisor, Framestore. We were lucky to have had a lot of quality time with Swen on the shoot and having the previs/postvis on hand all the time focused the decision-making. This allowed us to work up the quality of our assets while FPS [Framestore Preproduction Services] fast iterated ideas, hooking into our assets as they developed so the visualisation and visual effects were constantly converging.We relied heavily on motion capture to block out sequences quickly and ensure everything was grounded in reality, remarks Jabar. From a purely technical perspective, managing our scenes on the Oner fight was really difficult with the number of bespoke characters on screen. The uninterrupted flow of the animation proved challenging for real-time playback in viewport, where we tend to live as previs artists. Two unique challenges on the film had to be resolved. Firstly, the Cassandra/Paradox hand intersection and then the Oner, a five thousand frame continuous stunt fight in the City Street set environment, notes Twyford. Cassandras [Emma Corrin] power was shown across multiple closeup and long shots where we deformed Paradoxs [Matthew Macfadyen] whole head and face by pushing Cassandra's hand through it. This sequence required a huge amount of upfront rework for our creature [also human!] pipeline but the payoff as one of the big squirm in your seats moments was great. The Oner was more traditional in its technicalities, but huge in its project management scope. Over 80, five thousand frame takes of motion control, shot exterior through a British winter, generated over a thousand individual artist tasks.Houdini was introduced into postvis to help with timing of the characters and effects. As we are part of the full Framestore pipeline, this was more a workflow rather than tooling challenge, observes Jabar. For the Oner, we designed a way to count the actors on screen at render time and write out a HUD that helped with techvis and planning the actual shoot. One major adjustment for the visual effects workflow and pipeline was for the digital double of Paradox. 
We knew how to make a fully realistic double with full performance in closeup, but to then shove a hand through that head in a photorealistic and interactive way was not something we normally build for, states Twyford. Although these challenges are usually overcome with time and talent, we knew that the current tools for skin simulation are very much at the back end of the pipeline; this meant we would be showing the skin simulations right at the end of the visual effects process and frighteningly close to the deadlines, especially if any major changes were going to happen. We decided to create a new process where we moved the skin distortions into the animation rigs. This meant that the animators actually animated the skin themselves rather than it being driven by simulations run after the animation. The result was fantastic extra value added by the animators and we were able to show the result right at the front end of the process getting great early feedback from the Filmmakers. The effects artists then focused the post-animation simulations on the fine detail creases and stretches, eyebrow/eyelash and hair interaction. There was no shortage of complex shots to be visualized. For something like the Cold Open we used motion capture of a stunt performer based on initial storyboards, remarks Jabar. We then pitched new ideas on top and fleshed out the cut once stunts had rehearsed the new ideas from the vis. We also helped inform special effects for the carousel section where Deadpool spins, dispatching TVA agents in the centre of a fast circular dolly. The shot with the most iterations is definitely the Oner where we pushed beyond version 200 even in postvis, assembling all the elements and also iterating on the final moment where our heroes jump out of the back of the bus to nail the poses and slow-motion. This shot was also the most intensive in terms of animation and technical visualisation time for previs, as we wanted to really work out every single actor on screen and their action to help guide stunts on the day, while leaving it loose enough to allow them to flesh it out further.The project covered the full gamut of visual effects work from bluescreens, crowd duplication, stunt enhancement, simple and complex environment top ups, photo real simulations, pseudo-science simulations, and creature and digital double work, states Twyford. It was a show that had all the departments busy and challenged.The wide range of visual effects produced for the film also included spaghettification. Spaghettification was introduced originally in Loki Season 2 to show the universe breaking down into strands after major disruptions in the timelines. Only loosely linked to the scientific concept of the same name, it worked well in the Time Ripper sequence as it was established Marvel science and suited the story and visual dynamic. According to Twyford, We had originally developed the look and shots in Loki, so it was great to see it again in another environment as it has such a powerful and dark overtone when you see it develop through a scene. It requires subtle and clever animation by the effects simulation artists and complements the foreground action as it brings darkness and intimacy to the action. The opening title sequence was shot in the UK during summer, which meant that the location had to be practically and digitally dressed with snow. We worked up the dressing to height, added in some falling snow and balanced through the sequence in grade and atmosphere for continuity, reveals Twyford. 
There was no shortage of complex shots to be visualized. "For something like the Cold Open, we used motion capture of a stunt performer based on initial storyboards," remarks Jabar. "We then pitched new ideas on top and fleshed out the cut once stunts had rehearsed the new ideas from the vis. We also helped inform special effects for the carousel section where Deadpool spins, dispatching TVA agents in the centre of a fast circular dolly. The shot with the most iterations is definitely the Oner, where we pushed beyond version 200 even in postvis, assembling all the elements and also iterating on the final moment where our heroes jump out of the back of the bus to nail the poses and slow-motion. This shot was also the most intensive in terms of animation and technical visualisation time for previs, as we wanted to really work out every single actor on screen and their action to help guide stunts on the day, while leaving it loose enough to allow them to flesh it out further."

"The project covered the full gamut of visual effects work, from bluescreens, crowd duplication, stunt enhancement, simple and complex environment top-ups, photoreal simulations, and pseudo-science simulations to creature and digital double work," states Twyford. "It was a show that had all the departments busy and challenged."

The wide range of visual effects produced for the film also included spaghettification, originally introduced in Loki Season 2 to show the universe breaking down into strands after major disruptions in the timelines. Only loosely linked to the scientific concept of the same name, it worked well in the Time Ripper sequence, as it was established Marvel science and suited the story and visual dynamic. According to Twyford, "We had originally developed the look and shots in Loki, so it was great to see it again in another environment, as it has such a powerful and dark overtone when you see it develop through a scene. It requires subtle and clever animation by the effects simulation artists and complements the foreground action as it brings darkness and intimacy to the action."

The opening title sequence was shot in the UK during summer, which meant that the location had to be practically and digitally dressed with snow. "We worked up the dressing to height, added in some falling snow, and balanced through the sequence in grade and atmosphere for continuity," reveals Twyford. "Wolverine's skeleton was a mix of practical and CG. We swapped out various pieces, or the entirety, of the TVA Minutemen to enhance the violence. Then came the blood. There was no practical blood shot, so everything is a CG simulation custom designed for every impact. The brief was to overdo it at first; some of the initial blood fountains were hilarious but also a bit too ridiculous. The heavy use of bullet time and speed-up allowed us to create some beautiful shapes with the blood against snow, influenced by Jackson Pollock's work. Technically, this was a big challenge, as the blood and debris have to interact fully with the environment, the characters and the props. Everything was tightly body tracked, and continuity damage carried through the edits, building up to the final full reveal of the carnage."

The Time Ripper destruction sequence went through numerous iterations as it evolved. "Many of the key elements, like Cassandra's destruction, Deadpool and Wolverine's damage, and the environmental effects both in the upper control room and the chamber below, were all developed dynamically as the edit, performances and visual effects all started coming together," notes Twyford. "The only simulation we knew exactly where we were going with was Wolverine's costume explosion, where the postvis was so awesome we followed their lead. The internal burning for Wolverine was a look developed by the comp team, with the cracks and debris coming from effects simulations. Deadpool was partially replaced with our digital double to allow us to blast hot light from inside his body and flare the fibers of his suit. Cassandra died in a million spectacular ways in the ongoing atomization look development of our effects team. We had a fantastic high-quality digital double of Emma Corrin, including all the skin and subdermal layers through to the skull and the brain. This was blown up, atomized, wind flayed, sandblasted and deconstructed in many unpleasant ways trying to find the right feel for a comedy film where the baddie dies horribly. The final look incorporated an internal plasma overheat with an external sandblasted skin. Thankfully, it is all over quickly in the final edit."

Various Deadpool variants make cameo appearances, including one of a canine persuasion. "Dogpool was a digital asset we built to cover any missing performances in the shoot," explains Twyford. "Little Peggy and her trainers, though, were absolute superstars on set, and we got footage for every shot wearing her costume and booties. The original concept did not have her wearing a mask or goggles, and we were asked to try and put some doggles on her. This turned out to be massively popular and allowed us to rework the optics to give the googly eye effect; this in turn allowed us to animate her eye performance and open up a new level to her character. In the Deadpool Corps sequence, her doggles' eyes and facial hair are CG over her plate performances."

Pinewood Studios in London provided the street environment used in numerous scenes. "It included a whole block with the first two stories of buildings, all surrounded by bluescreen," states Twyford. "Although it was influenced by the look of New York, it also has a more generic city feel, with aspects of Vancouver and Boston. Our role was to extend the set upwards, create a digital city extension for the midground areas and then use a digital matte painting for the distant city horizon. It needed full city life, including pedestrians, traffic and believable infrastructure."
"The set top-ups were custom digital models to match Ray Chan's [production designer] set design, and the city extension used assets previously seen in Marvel productions along with street dressing from our library, all laid out to art department references. Cars were a mixture of existing assets and new builds of vehicles to match set cars and buses. All the pedestrians were bluescreen elements shot specifically for the scenes and then dressed into our CG build with our custom Nuke particle tools. Everything was then lit and redressed for each day, twilight and early morning scene."

In the film, blood and gore were not in short supply. "All the blood and gore were CG simulations," Twyford notes. "With the amount of retimes and big camera moves in the plates, we were wary of trying to force in filmed elements, especially when interacting with everything in the scene. Characters, props and the environment lidar were tracked tightly, and simulations run pre-retiming to give us a starting reference. Then we creatively tweaked to get the most interesting shapes and framing that worked with the cameras and editorial timing. Once we had a good overall simulation, we worked up the secondary simulations of splashes, soaking into fabric, landing in snow, and specific rivulets and drips. The initial brief was to use a comedic amount of blood, so we kept it as clean liquid with no bits or chunks to reduce any unpalatable goriness. A good amount of time was spent fine-tuning the surface tension characteristics to give a slightly cartoon, graphic feel to the flying blood shapes, and with a tight shutter angle, every frame looks like it came directly from the pages of a comic."

"Our biggest challenge was making sure we always kept one eye on who these characters were, in terms of the timing of the animation and the design of all our elements," reflects Jabar. "Everyone truly loved what we were trying to achieve, so we wanted to make sure we never lost sight of that despite the volume of work and the complexity. We were on the project for 18 months, and I wanted to ensure I kept largely the exact same team throughout. As a supervisor, ensuring everyone was motivated and rested enough to keep bringing fresh perspectives and their best work was challenging, but so rewarding, and I think that really showed on screen."

Paradox proved to be the biggest technical and creative challenge because of Cassandra pushing her hand through his face. "The reference was just a couple of the original comic book frames, and our job was to make it photoreal closeup on a performing actor across a dozen long shots," remarks Twyford. "What the comic book frame did give us was the level of distortion and key moments, like the finger coming out of his nostril and eye socket. We then had to design the movement and how her hand reacted to the skin, tendons, bones and cartilage of his face in a realistic way. The result is one of the most memorably uncomfortable seat-squirming movie moments, and I hope everyone in the audience feels Paradox's pain."

Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.
-
WWW.AWN.COM
Getting It Right: The Carefully Calibrated VFX that Makes The Boys The Boys

Perhaps you've heard of Prime Video's hit Emmy Award-winning series The Boys. Based on the comic book of the same name by Garth Ennis and Darick Robertson, the series, developed by Eric Kripke, who also serves as an executive producer, recently completed its fourth season, with a fifth upcoming in 2026. The Boys is many things: a brilliant subversion of the superhero genre, a biting political satire, a tour de force of action filmmaking, a pitch-dark comedy, and a veritable gorefest of bloody violence, in which multiple exploding heads are but so many grace notes.

For Visual Effects Supervisor Stephan Fleet, who has been with the show from the beginning, the multi-faceted entity that is The Boys has proved to be an often challenging, always engaging, and truly educational experience. While his specific duties include the rendering of such notable phenomena as flying sheep and human combustion, it's often the less flamboyant effects that require the most innovative and labor-intensive work.

From the unexpectedly nuanced considerations that determine the appropriate volume of blood, to the visual and legal complications involved in the representation of screens onscreen, Fleet shared some of the central aesthetic and technical issues that inform his VFX stewardship.

Dan Sarto: There's a wide range of visual effects that you produced for this show. I'm guessing they included a lot of things that were visually vivid, but not necessarily the most challenging, and others that may not have stood out, but actually were labor-intensive, or groundbreaking in some way. Can you talk a little about that?

Stephan Fleet: In the biz, there's something called an "amort" or amortization budget and, for visual effects, that can mean effects that repeat over multiple episodes. Lasering is a gag that we do over and over again, as well as blowing people up. And those types of things become easier. I'm not saying they don't pose their unique challenges, but they become more commonplace because you do them again and again. This is a show that does not actually have a lot of that. We tend to introduce a new superpower at least every couple of episodes, if not every episode, and there are just some quirky things like flying sheep, for example, or carnivorous chickens. (Note: Killer Sheep VFX produced by Untold Studios)

So, to answer your question, it's all hard to me. The hardest stuff is the stuff that has to look 100% real. When you do something like a flying sheep, there's always going to be a slight suspension of disbelief with an audience, just because those don't really exist. However, cloning, or having a character be multiple versions of themselves in the same frame, was one of the hardest things we had to do this season. Audiences are really savvy in this day and age. I mean, influencers will make TikTok videos where they clone themselves, so people know how it's done. So now you have to do it in ways that make it harder; you have to do the impossible shots.

For instance, the first time we see the cloning character, Splinter, that shot took about 16 hours to do using motion control. It was about eight hours of rehearsal and setting up the cameras with dancers and a metronome. There's no face replacement. It's all the actor playing every single character, with a moving camera, moving over and over and over again.
And the funny thing is, I've seen a few people watch that shot now during the course of the show, and no one really thinks about it. Everyone just sees six of the same guy, and no one actually goes, "wow, that's a complicated visual effect." It's just so smoothly done that it looks like another piece of footage, but it took 16 hours to make. (Note: Splinter VFX produced by Pixomondo)

Later on in the season, we have Erin Moriarty's character, Starlight, fighting a doppelganger of herself. That was another really complicated scene because, while she's just talking to herself in a room, she does things like grab a water bottle out of the other person's hand, or she grabs her face and shakes it. So now you have contact between the two, and it's a clever blend of face replacement and plates. And then, when she's fighting herself, that is her playing both sides of it, again to a metronome, with heavy stunt choreography and a little bit of visual effects. So, these things that don't have a lot of blood or spectacle, but take a lot of visual effects, are by far the hardest thing to do. (Note: the Doppelganger VFX was produced by Pixomondo)

Also, anyone who really pays attention knows that one of our motifs, and one of our means of exposition in pushing the story forward, is people watching media on monitors. All that stuff has to be heavily designed with motion graphics. Every bit of text on a TikTok has to be written out and vetted to make sure that it has the right timbre. And then on top of that, we're very particular with how the monitors look.

DS: We just take it for granted when we see something like that on a show, because we all know what that looks like. But it's all very, very carefully created.

SF: Yeah, we go to great lengths. I've developed a great appreciation for, and learned a lot about, just the legality of using things like TikTok and X/Twitter. There are ways to use this stuff for real, if you're using it in a certain way. We try and do that, but if we can't, we'll make our own thing that gives people an idea of what it is. We do our research and try to emulate a similar TikTok online: what would the comments be in this day and age? So, in our world, it's mirroring some very real political things going on in the real world, but in a slightly Bizarro way. (Note: the Hard Push VFX produced by DNEG)

DS: You mentioned that there are some effects, like blowing people up, that you've done many times and that have become commonplace. And there's always a little camp in it, a little dark humor. How do you arbitrate that? How do you determine what's too much or too cartoony? And have you arrived at a set way you do that, or is it a continual process of evaluation?

SF: One of the joys in doing episodic work and seasons of things is that they have long lifespans. So, you do a season of something, and you learn from it. And so, some of the stuff that we did in Season 1 is picked up on by Eric and the writers, and then they actually write to it for Season 2. And then, in Season 2, something new comes along and I pick it up. You start building almost like a library. And then you also get feedback from audiences, and you get to lean into what audiences want. And so, you get this wonderful opportunity to do multiple seasons of this and build it up.

We started doing a realistic amount of blood, and very quickly learned that that was not going to work. It wasn't legible.
So, we just started pushing the blood more and more until it became almost this Jackson Pollock canvas of unrealistic blood. But the storytelling and the romance overtook the reality of the situation. And I realized at that moment that, while we are a show that touts itself as being grounded in many ways, blood is a conceit for us and a language that we use to tell the stories. It is one of those things in which we don't necessarily strive for realism. Ultimately, we've learned that people come to expect these heightened moments, and so we were able to lean into it and have a little bit of liberty. (Note: face punch VFX produced by Untold Studios)

DS: What would you say is the most important skill or skills that you've brought to this project? What has served you best as a visual effects supervisor?

SF: That's a great and really deep question, actually. I went to theater school for undergrad and I went to film school for a master's degree. Doing this show specifically has been my PhD for how to make a show, not just visual effects. If I look back at Season 1 to now, and who I was as a person, I've become a very different person.

I'm a passionate artist, I'm very good at what I do, and I think I have a good eye. But ultimately, I'm here to shepherd my team on the show, fantastic artists and a multitude of vendors throughout the world, to create this vision. And it's not something they really teach you in this industry. Unfortunately, when you first start growing and coming up and becoming a department head, there's no management school for visual effects supervision.

When I first started doing this, out of a mixture of drive, a little bit of fear, and just a hunger to get it done, I could be a little angry or aggressive. And it's not that you're mad at other people; you're just trying to push the product forward, which can lead to this pressure cooker of insanity. And what I've learned as I've gone through is that other people have a lot of great ideas, and I love to listen to other people's ideas and bring them in and really build a team. It's not a kumbaya experience. It's a very professional experience, but it can be a professional experience that involves fun.

I think one thing in visual effects that can be a problem is we can come in as the only department below the line that is responsible for stuff in post. You're on set with a lot of people that are not responsible for the outcome of that product in post-production. So that can make you very nervous. But if you go to another department and just give them an ultimatum or something, it just stresses them out more and doesn't solve a problem. But if you understand a little bit about what they're going through, you can come to the table and ask, how do we get through this together? There's so much that people don't understand about what goes into making the simplest of television shows, let alone a complicated show like this. I would recommend that any visual effects supervisor, or aspiring visual effects supervisor, do their best to learn about and empathize with every other department and the people that they work with.

Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.
-
WWW.AWN.COM
Increase in UK VFX Tax Incentive Confirmed

The Chancellor of the Exchequer has confirmed an increased tax incentive for spending on VFX in the UK.

Delivering Labour's first budget since coming to power in the July general election, Rachel Reeves announced that VFX spending in the UK will attract a net rebate of 29.25%, which will be exempt from the 80% cap on spending eligible for film and TV tax relief. This had been proposed by the previous government in March, but the early election meant that the uplift was not implemented.

The Labour government has identified the creative industries as one of eight growth-driving sectors within its Industrial Strategy, and the VFX uplift is projected to attract an additional £175 million per year of spending and create 2,800 new jobs.

Earlier this year, the Treasury had proposed to exclude costs relating to generative AI from the VFX uplift. However, following consultation with the industry, this proposal has now been dropped.

As requested by the UK Screen Alliance, the start of the tax incentive has been moved up from April 1, 2025 to January 1, 2025. The move will avoid production delays and allow VFX companies to get work flowing as they recover from last year's writers' and actors' strikes. Claims for the rebate can be made from April 1.

"The confirmation in the Budget that the VFX rebate will be available from the New Year is terrific news for the UK's visual effects companies," said Neil Hatton, CEO of UK Screen Alliance. "We know that productions are making decisions right now on where to place their VFX work for 2025 and beyond. Today's announcement means that these clients will be incentivized to place many millions of dollars of inward investment work with the UK's award-winning VFX community, creating considerable value for the UK economy."

Source: UK Screen Alliance

Journalist, antique shop owner, and aspiring gemologist, Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
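To make the headline figure concrete, here is a simplified worked example. It assumes a flat 29.25% net rebate on qualifying UK VFX spend sitting outside the 80% cap, which is a loose reading of the scheme; real claims depend on eligibility rules not modeled here.

```python
# Simplified, illustrative arithmetic only: a flat 29.25% net rebate on
# qualifying UK VFX spend, exempt from the 80% cap per the Budget.
VFX_NET_REBATE = 0.2925

def vfx_rebate(uk_vfx_spend: float) -> float:
    return uk_vfx_spend * VFX_NET_REBATE

spend = 10_000_000  # a production placing £10m of VFX work in the UK
print(f"£{vfx_rebate(spend):,.0f}")  # £2,925,000
```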
-
WWW.AWN.COM
'SNL' VFX Workers Unionize, Win Recognition with Unanimous Support

VFX workers for Saturday Night Live (SNL) are unionizing with the International Alliance of Theatrical Stage Employees (IATSE) and have won official recognition of their union. The group includes 16 VFX artists and leads who unanimously supported unionizing with IATSE.

While SNL is known for its live televised segments, it also features several pre-recorded "digital shorts," which require editors and VFX workers to operate under tight time constraints; in 2017, Fast Company reported turnarounds could be as little as 12 hours.

"Over the six seasons I've worked at SNL, I've seen the VFX department evolve from a small group to a tightly integrated, highly organized operation capable of delivering hundreds of demanding shots over a 24-hour period," said VFX artist Richard Lampasone. "It's an intense, collaborative, and extremely fun environment that constantly tests the limits of our skills, our versatility, and, after long days staring at a screen, our ability to form coherent sentences. Our work, like that of everyone else above and below the line, is critical to the show's success. We look forward to celebrating Season 50 by joining in SNL's decades-long tradition of supporting union labor, and to helping negotiate a contract that reflects the substantial value we add and makes ours a more accessible and sustainable career for years to come."

"Working here is tremendously fun, chaotic, and hugely rewarding," added Danny Behar, VFX artist for SNL. "We work 15-hour days every Saturday, delivering renders before cast and crew start rehearsals, and ending after the show has finished broadcasting. Our department is essential to the show's success. For that and a multitude of other reasons, we deserve to have a seat at the table. We are the only department that currently does not have one. If we're going to continue working on the show, it is necessary for us to receive the basic entitlements offered to other units, like pay equity and stable healthcare."

The VFX workers are following in the footsteps of SNL's editors, who began their unionization campaign in October 2022, ultimately resulting in the editors ratifying their first agreement in May 2023. This week, NBCUniversal management agreed to a similar process for recognizing the SNL VFX workers' union after the workers presented signed authorization cards demonstrating 100% support for unionization.

"We deserve what every other department at SNL has; we deserve to be protected, we deserve to be represented, and we deserve to be on equal footing with the people we work directly side by side with," said VFX lead David Torres Eber. "SNL is a very stressful show to work on while also being a very enriching experience full of creative problems to solve. We should be focused on those problems each week and not whether our insurance has lapsed or when we can schedule a doctor's appointment after the summer hiatus ends."

The effort to unionize is part of a broader campaign by IATSE to bring representation to VFX workers across the entertainment industry, as positions in the field have not historically been represented. IATSE has made significant strides in recent months, securing union recognition for VFX workers at companies and productions such as Marvel, Disney, Avatar, DNEG (Canada), and Apple TV.
Those interested in joining the movement should visit the IATSE website for more information and to get in touch with IATSE organizers.

Source: IATSE

Journalist, antique shop owner, and aspiring gemologist, Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
-
WWW.AWN.COM
Roland Emmerich to Develop Live-Action Space Nation Series

A live-action series adaptation of the Web3 video game Space Nation Online is in development from the transmedia IP's co-creators and co-founders Roland Emmerich (Independence Day, Stargate) and Marco Weber (The Thirteenth Floor, Igby Goes Down). The duo will also executive produce the project.

Originally announced in June 2023, Space Nation Inc. was founded by Emmerich, Weber, and veteran game developers Jerome Wu and Tony Tang. With $50 million in funding, the four co-founders set out to build a first-of-its-kind transmedia IP spanning video games, online content, and TV/streaming. Space Nation Online soft launched on September 27, garnering high player numbers and retention.

"As gamers around the world are joining us in the Telikos Cluster for Space Nation Online, I'm motivated to expand the game's story with a dive into the origins of humanity's exodus from Earth following an alien invasion," said Emmerich. "Working in parallel with Marco and the game development team to expand this new sci-fi universe has been a singularly unique creative experience, and I'm excited to continue exploring what's possible through cross-media storytelling."

"We're at an important stage with Space Nation Online now live, and fans are beginning to experience how our IP will expand," added Weber. "It's been a rewarding process to explore how different entertainment mediums can come together to build a larger universe. Roland and I are eager to continue developing this new universe as we create a truly interconnected transmedia experience for a global audience."

To support the game's launch, Emmerich and Weber teamed with filmmaker Martin Weisz (The Hills Have Eyes 2) to create a series of animated shorts featuring the Space Nation character Zoey, chronicling her ill-fated attempts to go viral online. In addition to the three entertainment industry veterans' collaboration, Jess Lebow, lead writer of Space Nation Online, co-wrote the narrative alongside actress Winona Weber, voice of Zoey in both the shorts and the game.

Check out the first episode, featuring the alien teen space pirate Zoey, now:

Source: Space Nation Inc.

Journalist, antique shop owner, and aspiring gemologist, Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
-
WWW.AWN.COM
Cinesite Celebrates 10 Years of Innovation and Growth in Montreal

Cinesite Montreal has marked its 10th anniversary by releasing a retrospective showreel that highlights some of its most memorable work, from The Addams Family, Paws of Fury: The Legend of Hank, and Teenage Mutant Ninja Turtles: Mutant Mayhem to Blitz, Black Panther: Wakanda Forever, Ant-Man & The Wasp, and No Time to Die.

These projects, brought to life by the company's talented supervisors, artists, technicians, and R&D teams, have earned Cinesite Montreal numerous awards and recognition over the last decade.

Since opening its doors in 2014, Cinesite Montreal has steadily expanded, establishing itself as a key player in the global animation and VFX industry. Over the past decade, the studio has delivered 60 VFX and animation projects.

Watch Cinesite Montreal's 10th Anniversary Reel:

Commenting on this milestone, Cinesite Montreal general manager Graham Peddie said, "Our 10th anniversary is a huge milestone that I'm incredibly proud of. We've seen progressive growth, both in the size of our studio and in the scale and complexity of our projects. All the people involved in our success to date have had a hand in establishing the groundwork for our future artists and collaborators to build upon. We're excited for what the next chapter holds."

Hubert Bolduc, President of Investissement Québec, added, "I extend our warmest congratulations to Cinesite Montreal on their 10th anniversary. Their studio has not only been a significant contributor to the growth of our local animation and visual effects industry but has also showcased the exceptional talent and creativity of Montrealers on the global stage. Investissement Québec looks forward to continuing our partnership and supporting their future success."

Beyond its industry success, Cinesite Montreal is deeply involved in the community. Partnerships with organizations like Opération Père Noël and the Black Community Resource Center highlight its dedication to social responsibility. It has partnered with six different colleges and universities across Canada and is actively working to increase diversity and inclusion within the industry.

Gretchen Libby, Director of Specialists for Media & Entertainment, Games, and Sports at Amazon Web Services (AWS), said, "Over the past decade, Cinesite Montreal has consistently delivered exceptional animation and visual effects that have captivated audiences around the world. Cinesite understands the power of using technology to bring stories to life, and we look forward to many more years of collaboration and creative achievements between AWS and Cinesite."

As the studio looks forward to the next decade, Cinesite Montreal plans to continue building on its legacy of creativity and community engagement.

Upcoming VFX projects at Cinesite Montreal:

Blitz (releasing November 1, 2024): Experience the gripping tales of a group of Londoners amid the British capital's wartime bombings.

G20 (releasing in 2025): G20, starring Viola Davis as US President Taylor Sutton, is an action thriller in which terrorists take over the G20 summit. President Sutton must use her skills to defend her family, her country, and the world.

Michael (releasing April 18, 2025): The story of Michael Jackson, the King of Pop, explores the life and legacy of the iconic musician.

The SpongeBob Movie: Search for Squarepants (releasing December 19, 2025): Follow SpongeBob as he travels to the depths of the ocean to face the ghost of the Flying Dutchman.

Upcoming animation projects:

HITPIG!
(releasing November 1, 2024): The upcoming British-Canadian animated adventure comedy is directed by David Feiss (Cow & Chicken) and Cinzia Angelini (Mila). The film comes from an original story by Berkeley Breathed, rooted in his 2008 children's book Pete & Pickles. Breathed penned the screenplay alongside Dave Rosenbaum and Tyler Werrin; it is scored by Isabella Summers of Florence and the Machine.

Animal Farm (2025): The animated adaptation of George Orwell's Animal Farm is directed by Andy Serkis.

Smurfs (releasing July 18, 2025): Cinesite produces the upcoming collaboration between Paramount Animation, Nickelodeon Animation, LAFIG Belgium, and IMPS. Chris Miller directs from a script by Pam Brady. Ryan Harris produces.

Source: Cinesite

Journalist, antique shop owner, and aspiring gemologist, Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
-
WWW.AWN.COM
Paul Franklin Joins BeloFX as Creative Director

The Oscar-winning VFX supervisor and filmmaker, known for his work on Interstellar and Inception, as well as co-founding DNEG in 1998, follows fellow co-founders Matt Holben and Alex Hope to the Canada, UK, and India-based studio.
-
WWW.AWN.COM
Ronald D. Moore to Showrun God of War Game Adaptation at Prime Video

Based on the critically acclaimed PlayStation video game franchise, the live-action series follows god-in-exile Kratos, who stumbles into an epic adventure as he attempts to spread the ashes of his deceased wife with his estranged son.
-
WWW.AWN.COM
AMD Radeon GPUs Turbocharge New AI Tools in CG Software

Backed by AMD Radeon GPUs, AI tools are poised to transform the way that artists work. From video production to visual effects, rendering to retouching, new artificial intelligence (AI) and machine learning (ML) features in graphics applications speed up routine tasks and take the drudgery out of repetitive ones, leaving artists free to focus on their creative goals.

New AI tools speed up day-to-day workflow

One artist tapping into the potential of AI through AMD hardware is Mike Griggs, a Digital Content Creation Consultant whose clients include international businesses like the BBC and JCDecaux.

"A powerful graphics workstation helps my productivity [by] accelerating the AI features of my content creation tools," he says. "I've been using workstations with AMD Radeon PRO GPUs to enable these new experiences."

Griggs says that he sees the largest day-to-day improvements in post-production, thanks to the new AI features in applications like DaVinci Resolve, Blackmagic Design's video editing, color grading and VFX software. Neural Engine, the machine learning system available in the Studio edition of the software, speeds up a range of routine tasks, including audio transcription, video stabilization and object tracking, and can automatically generate masks and depth maps from source footage.

The latest stable release, DaVinci Resolve Studio 18.6, helps Neural Engine harness the power of AMD GPUs. In tests, the AI-based mask generation and tracking tool Magic Mask now runs over 4x faster on a current high-end AMD Radeon RX 7900 XTX GPU than in the previous release. [1]

"[Magic Mask] enables me to create custom masks for my 3D work without needing to go back to my 3D software," says Griggs. "Artists such as myself can get results quicker than ever before."

How AMD GPUs power AI processing

AMD GPUs can accelerate these new tools thanks to their specialist AI hardware. The RDNA 3 GPU architecture, used in AMD Radeon PRO W7000 Series and AMD Radeon RX 7000 Series GPUs, features two dedicated AI accelerators per compute unit.

On top of that, the graphics memory capacity of workstation cards from the Radeon PRO W7000 Series makes it possible to process large data sets. With 48GB of fast GDDR6 memory, the Radeon PRO W7900 GPU can handle even large production assets without the performance hit of going out of core.

And having a GPU in your workstation capable of accelerating AI tools helps to keep workflows local. Not needing to process jobs online avoids the need to upload commercially sensitive data to the cloud, reduces demands on network bandwidth, and provides more control over deadline-critical tasks.

Compatible with key graphics software

Nor is DaVinci Resolve the only CG application that can harness AMD GPUs to accelerate AI processing. Boris FX's Continuum plugins for VFX and motion graphics work include a range of AMD-compatible ML tools, for tasks ranging from blurring out faces to retiming video.

Twixtor, RE:Vision Effects' video retiming plugin for software like After Effects and Premiere Pro, also features a new machine learning algorithm, again fully accelerated by AMD GPUs.
Topaz Labs' Gigapixel AI and Video AI upscale still images and video, while the 3D software Blender uses AI to remove noise from renders generated by its Cycles renderer.

AMD: accelerating creativity

With their dedicated AI accelerators, the GPUs of the AMD Radeon PRO W7000 Series and AMD Radeon RX 7000 Series can speed up these everyday workflows, while the additional GPU memory available in workstation cards makes it possible to process even complex production scenes on a local workstation, without the need to upload sensitive data to the cloud.

For artists like Mike Griggs, the speed boosts reduce interruptions in creative workflows and make it possible to turn around jobs quicker.

"I'm glad that my AMD graphics workstation is already well-placed to make the most of these new opportunities, while providing real-world benefits to my business and clients today," he says.

See creative AI tools in action:

Find out more about how AMD Radeon graphics can enhance your creative output: https://www.amd.com/en/products/graphics/radeon-for-creators.html

Footnotes

[1] Testing conducted by AMD as of September 19, 2023, on a test system configured with a Ryzen 9 5900X CPU, 32GB DDR4, Radeon RX 7900 XTX GPU, and Windows 11 Pro, with AMD Software: Adrenalin Edition 23.9.1, using the application Blackmagic DaVinci Resolve 18.6 vs. Blackmagic DaVinci Resolve 18.5. Data was gathered on the HD to 8K UHD 4x (Playback FPS @ 1080p timeline), Speedwarp 10% (Playback FPS @ 1080p timeline), and Magic Mask Tracking Better (FPS). Performance may vary. System manufacturers may vary configurations, yielding different results. RX-997

All performance and/or cost savings claims are provided by Mike Griggs of creativebloke and have not been independently verified by AMD. Performance and cost benefits are impacted by a variety of variables. Results herein are specific to Mike Griggs and may not be typical. GD-181.

David Diederichs is Product Marketing Manager for AMD. His postings are his own opinions and may not represent AMD's positions, strategies or opinions. Links to third-party sites are provided for convenience and, unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied. GD-5

RE:Vision Effects images courtesy of Clark Dunbar, Mammoth HD; Continuum images courtesy of Boris FX.
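As a concrete example of the kind of AI feature described in this piece, Blender exposes Cycles' AI denoiser through its Python API as ordinary scene settings. The minimal sketch below assumes it is run inside Blender, where the bpy module is available:

```python
# Minimal sketch: enable AI denoising for Cycles final renders.
# Must run inside Blender; these properties exist in recent releases.

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

scene.cycles.use_denoising = True            # denoise the final render
scene.cycles.denoiser = 'OPENIMAGEDENOISE'   # the AI (OIDN) denoiser

# With denoising on, far fewer samples are often enough for a clean image.
scene.cycles.samples = 128
```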
-
WWW.AWN.COM
Building the World of Percy Jackson and the Olympians

Since its premiere in December 2023, the Disney+ young adult fantasy series Percy Jackson and the Olympians has not only attracted a wide and enthusiastic viewership, but has been acclaimed by critics for, among other things, its faithfulness to the source material, its performances, and, notably for our immediate purposes, its worldbuilding. Based on Disney Hyperion's best-selling book series by award-winning author Rick Riordan, and starring Walker Scobell, Leah Sava Jeffries, and Aryan Simhadri, the series tells the epic story of 12-year-old modern demigod Percy Jackson (Scobell), who is accused by the sky god Zeus (Lance Reddick) of stealing his master lightning bolt. With help from his friends Grover (Simhadri) and Annabeth (Jeffries), Percy must embark on an epic quest to find the lightning bolt and restore order to Olympus.

To help with the visual component of the aforementioned worldbuilding, the creators were fortunate to have the services of a number of leading VFX studios, including London-based MPC and, critically, Industrial Light & Magic, whose groundbreaking StageCraft LED volume technology, which integrates real-time animated environments into live-action shoots, played a central role in the production.

As we get ready for a new season, which is based on the second book of the series, The Sea of Monsters, and is slated to be released in 2025, we spoke to Senior Visual Effects Supervisor Erik Henry about the highlights and challenges of Season 1.

Dan Sarto: Why don't you walk me through the different types of visual effects that we see across the season, and highlight the things that were most challenging, or that stand out for other reasons?

Erik Henry: Well, I'd have to start with the Minotaur, because for us that was a very challenging scene. It's hard to make something believable that runs on all fours, then stands up on its hind legs, and can mix it up. That's the sort of thing where, if you don't get it right, people are going to be turning it off. So, we knew right out of the gate (it's in Episode 1) that it was going to be super important. The challenge there was to believably work with the camera and the actors. We had a motion base to help us with that. Walker [Scobell], who plays Percy, was able to get on the back of the Minotaur to be thrown around and to have to hold on and slide around, because there's rain on set. We did that on a volume stage.

I always like to talk about the chimera in Episode 3, because that's just one of my favorite monsters: a mashup of a lioness and a goat and a snake. The success of that really comes down to the fact that we were able to take an entire set and just paint it all black and use flamethrowers. The special effects guys came in and gave us proper elements, because it's a fire-breathing creature.

The work done on environments by one of our trusted vendors up in Canada also stands out. All kinds of wonderful environments, from Medusa's Cavern, which they did as content for the volume, to all the great work for the Underworld. We had this great concept that the Underworld was going to be a massive cave on an almost planetary scale. And if you look at that, you're going to see that it's not sky. There are actual mountains up there, and they have an alluvial flow. You get these lovely, almost spider-like outcroppings from each of the mountains that's buried in the mists.

DS: The chimera was really well done.
I especially liked how the catlike mannerisms were brought out.

EH: That was done by MPC in London, and it was their crown jewel. They also did some work with Alecto, but this is where they really shined. It's in the gait; you can tell that's definitely how a lioness walks. They just hit the nail on the head. But it's also the eyes and how it opens its mouth. We added a little tongue, a snake tongue that comes out, but even that felt feline. It's really cool. I like that.

DS: Tell me a little bit about the LED stage work. You and ILM made fantastic use of the LED volumes to create a lot of big environments, but there's also a lot of intimate work within that.

EH: Jeff White, the supervisor at ILM, was the one who was heading up all the StageCraft work. They were the right choice to build the volume stage for us and to do the content. They did the content for all but one of the sequences, and a lot of the show's success comes from those volumes. The things that were shot on the volume are juxtaposed with scenes that are outside, that take advantage of real light in ways that the volume can't. That came from ILM telling us, "Don't do everything on the volume. Let's pick and choose the things that make it shine." And that's what we did.

One scene that was really remarkably well done is when Luke [Castellan, a son of Hermes] and Percy square off. It's kind of broken into two parts: first they're training, and then they fight for real. That's on a volume where we had a lot of trees that were live action on the volume stage and then, in the content, you had the same kind of trees. And there was a fireworks show going off above their heads that lit the whole place. It wasn't really a big thing, but the two scenes that are in that particular volume are so successful.

You always try to do a little bit of something that's live. We had puddles that would get the reflection from the volume; we also shot plates of water, actually in Southern California, and used those on the stage for the ocean that's out in the distance. It was a nice collaboration, to shoot the scenes that made the most sense, and bring those to life with the volume.

DS: Did you do any previs and, if so, what did you use it for?

EH: Previs played an important role, as you could imagine. In the very first scene with Percy and Alecto in the fountain in front of the Metropolitan Museum of Art, we used it because we were really interested in knowing what shots we could do. It's not a complete circle, so it enabled us to understand when we were going to come off the volume; then we would maybe tighten up or push in or something. It helped us tremendously in figuring out what we could actually do to stay on the stage.

DS: What about postvis?

EH: We did do a little bit, but, by and large, the planning goes into the volume and such. It's about things like whether the trees in the background are blowing in the wind. And, if they are, do we have some trees on the set that are going to be blowing in the wind as well? Does that all work? What does the rain look like? It's about testing the color balance between the foreground and the content. It takes a lot of time to build something, but essentially we're doing shots that would normally take 20-some-odd weeks in post. We're just doing all that up front so that it looks right when the camera rolls.

DS: When you look back on the season, was there one big challenge that stands out from everything else?

EH: I guess I would go back to the Minotaur, because it's a perfect example of the general challenge we faced.
The show had to be accessible to young kids and to older viewers. We wanted adults to be able to watch it. So, it couldn't be too scary for an eight-year-old, but it also couldn't be so silly that adults say, "Well, you watch it and I'm going to go do the dishes." We wanted it to be a family event, where parents were engaged, as well as younger kids.

And I think we succeeded, but it wasn't easy. In the case of the Minotaur, we went back and forth on whether close-ups of it roaring over Percy were too scary. We didn't want it to look like a zombie. [Executive Producer/Creator] John Steinberg said to me, "It has to have a little bit of teddy bear in it. Just always keep that in mind."

From a technical standpoint, the biggest challenge might have been a sequence that took place in the area near the pit of Tartarus. We had to shoot it on a stage, and it was supposed to be out in a vast, hilly desert. And so, you got really close to the sky that was created by the DP. I was really scared that that was not going to look like we were outside.

That it worked so well is a testament to Jules [O'Loughlin], our DP. He said to me, "I'm going to go really soft. I'm going to help you out as much as possible." And I remember telling him later, "I did not think we were going to pull it off, but it definitely looks like you're outside." I think to some degree it was helped by color, but it really did work. Sometimes you think you're going to have a real hard time with something, and then you're pleasantly surprised by the outcome.

Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.
-
WWW.AWN.COM
Adobe Celebrates 5 Years of Fresco, Now Available for Free

Adobe is celebrating five years of its digital painting and drawing app, Adobe Fresco, with the announcement of powerful updates ahead of Lightbox Expo, which kicks off today in Pasadena, California. The event brings together artists in animation, gaming, illustration, and live-action and runs October 25-27 at the Pasadena Convention Center. Get more LBX information here.

Here are some Fresco highlights. Over the past five years, Fresco has transformed digital art through innovations that streamline artist workflows and enhance creativity, with analog techniques reimagined for touch and stylus devices, offering creatives the following:

- Live oil and watercolor brushes, vector brushes, and thousands of pixel brushes
- The latest advancements in touch and stylus technology, with haptic feedback, tilt, barrel-roll, and squeeze support for Apple Pencil
- Cutting-edge motion features that make it easy to add eye-catching movement to artwork in seconds
- The user-friendly symmetry tool, which ensures precision and faster creation of complex compositions
- A vector trimmer to quickly remove intersecting vector strokes and clean up line art
- The paint inside tool option to fill an enclosed area without going outside the lines

Fresco's latest innovations deliver more power and precision for creators:

- Motion Presets, including bob, breathe, bounce and more, help creators quickly add lightweight, eye-catching movement and animations to artwork in seconds
- Reflective and Rotational Symmetry helps creators achieve faster and more seamless creation of complex compositions
- The ability to take full advantage of ecosystem tools like the Apple Pencil Pro capabilities across workflows, including shortcuts with squeeze, new haptic feedback for key user actions, and barrel roll for more realistic brush strokes when the pencil is rotated

Adobe has also announced that, to further the company's mission of empowering creativity for all, it is making Fresco completely free to all users, so now anyone can discover drawing and painting with the support of Adobe Fresco and all its premium capabilities.

Visit Adobe's blog post, "Celebrating five years of Fresco with powerful new updates that unlock digital drawing and painting for everyone," for more information, and see how artists are harnessing the power of Fresco in their creations.

Source: Adobe

Debbie Diamond Sarto is news editor at Animation World Network.