Animation World Network
Animation World Network – the largest animation and visual effects-related publishing group on the Internet
Recent Updates
  • WWW.AWN.COM
    Unreal Engine 5.5 Now Available
    Epic Games has released Unreal Engine 5.5, featuring major enhancements to animation authoring, rendering, virtual production, mobile game development, and developer iteration toolsets. Here are some highlights:

    Animation - UE 5.5's new features and enhancements facilitate high-fidelity in-editor animation authoring workflows, and there are additions to the animation gameplay authoring toolset.

    Sequencer - Unreal Engine's nonlinear animation editor now boasts a more controllable interface with better filtering and easier access to properties.

    Animation Deformers - Users can craft more realistic animation effects, such as contact deformation or better cartoon-style squash-and-stretch, with the new ability to author animatable deformers inside Control Rig and easily apply them to characters in Sequencer with a single click.

    Modular Control Rig - Modular Control Rig moves to Beta with UI and UX improvements, new quadruped and vehicle modules, and support for common bipedal skeleton types, while the Skeletal Editor is now Production-Ready, with improvements that include quicker and simpler workflows for painting and editing weights.

    MetaHuman Animator - Part of the MetaHuman Plugin for Unreal Engine, MetaHuman Animator gains an Experimental feature that lets users generate high-quality facial animation - including inference of upper-face gestures - from audio performances alone.

    Mutable character customization - Game developers whose projects require content that changes dynamically at runtime will benefit from the addition of the Mutable character customization system. The system can be used to generate dynamic skeletal meshes, materials, and textures for characters, animals, props, and weapons while optimizing memory usage, keeping shader cost low, and reducing the draw call count.

    Choosers - Now Production-Ready, Choosers offers a framework for selecting animations for playback based on game context without having to write complex logic. The game-context asset selector can be used to select nearly any type of asset, spanning multiple levels of complexity, from simple random selectors to database-driven logic involving thousands of animations.

    Lumen - Lumen now runs at 60 Hz on platforms with supported hardware, thanks to improvements to the systems that underpin hardware ray tracing (HWRT). These improvements also benefit the performance and capabilities of Path Tracer and light baking.

    Path Tracer - The DXR-accelerated, physically accurate progressive rendering mode is now Production-Ready for creating final pixels for nonlinear applications or fully featured ground-truth reference images. This release brings a series of performance and fidelity improvements, Linux support, and support for all other Production-Ready features, including sky atmosphere and volumetric clouds.

    Substrate - The material authoring framework, introduced as Experimental in Unreal Engine 5.2, moves to Beta. All features of legacy materials are now supported, as are all platforms to which UE deploys.

    Movie Render Graph (MRG) - Introduced as Experimental in Unreal Engine 5.4, Movie Render Graph moves to Beta in this release, with further investment in the graph-based configuration workflow.

    MegaLights - This release offers a sneak peek at an Experimental new feature called MegaLights. Already being dubbed "the Nanite of lights," MegaLights enables users to add hundreds of dynamic shadow-casting lights to scenes without constraints. For the first time, lighting artists can freely use textured area lights with soft shadows, Light Functions, media texture playback, and volumetric shadows on consoles and PC, focusing on artistic considerations rather than performance impact.

    Virtual production - Unreal Engine's dedicated in-camera visual effects (ICVFX) toolset powers a myriad of productions in film, television, and commercials internationally. UE 5.5 sees the accumulated investment across multiple releases bring the ICVFX toolset to full production-readiness, as well as advances in other features for virtual production and visualization.

    SMPTE 2110 - Unreal Engine's support for SMPTE 2110 gains numerous stability improvements: automatic detection and repair of framelock loss, the ability to use PTP as a timecode provider, OCIO support for 2110 media, and other improvements to IP video signal flow. It is ready to meet the needs of real-world ICVFX projects as they make the transition to SMPTE 2110 deployments.

    Camera Calibration - The Camera Calibration solver is Production-Ready with UE 5.5, with improved accuracy for lens and camera parameter estimation. Stemming from this work, Overscan is now built into all cameras to support use cases like rendering with lens distortion or adding camera shake in post.

    Virtual Scouting - Now Production-Ready, the updated Virtual Scouting toolset introduced in UE 5.4 offers an out-of-the-box experience using OpenXR-compatible HMDs (with Oculus and Valve Index supported by default) and new opportunities for customization via an extensive API. The toolset now features a new VR Content Browser and asset placement, a Transform Gizmo that is customizable via Blueprint, and further polish, including a color-correct Viewfinder.

    Color Grading Panel - Previously part of the ICVFX Editor, the Color Grading Panel is now available for general use in the Unreal Editor, providing an artist-friendly interface for creative color manipulation in any Unreal Engine scene. The panel now also supports post-process volumes, cine cameras, and color correction regions.

    DMX - With applicability not just within virtual production, but also in broadcast and live events, Unreal Engine's DMX tech stack joins the list of Production-Ready toolsets, with enhancements to the Control Console, Pixel Mapping, and Conflict Monitor. This release also adds GDTF compliance to the DMX Plugin for interfacing with GDTF- and MVR-enabled control devices and software, among other enhancements.

    Mobile game development - The Mobile Forward Renderer's new features increase visual fidelity on the platform; it now supports D-buffer decals, rectangular area lights, capsule shadows, movable IES textures for point and spot lights, volumetric fog, and Niagara particle lights. Screen-space reflections now work in both the Mobile Forward and Deferred Renderers. Mobile Previewer improvements help with content development for mobile games, including the ability to capture and preview a specific Android device profile and to emulate half-precision 16-bit float shaders, making it easier to detect and deal with artifacts.

    Visit the Unreal Engine blog for more detailed information.

    Source: Epic Games

    Debbie Diamond Sarto is news editor at Animation World Network.
  • WWW.AWN.COM
    Taika Waititi's A Disney Holiday Short: The Boy & The Octopus Debuts
    Disney has just released an all-new short film, A Disney Holiday Short: The Boy & The Octopus, in collaboration with Oscar-winning filmmaker Taika Waititi, premiering today on YouTube.

    The film follows the journey of a child who discovers a curious octopus has attached itself to his head during a seaside vacation. After returning home, the boy forms a true friendship with the octopus by introducing his new companion to his life on land - harnessing the power of the Force with his Jedi lightsaber, playing with his Buzz Lightyear action figure, and imagining Santa Claus' route around the world with the map on his wall - before taking the lovable octopus out into the world to experience the joy of the holidays, hidden under his Mickey Mouse beanie. While watching the Disney holiday classic The Santa Clause (1994), the boy comes to understand the extent of the octopus' desire to explore everything the world has to offer, and he sets in motion a plan to make it happen. For the boy and the octopus, it is the precious everyday moments of childhood and friendship, as much as the magic of the season, that make their time together so meaningful. Numerous hidden easter eggs include nods to films like Moana (2016), Lilo and Stitch (2002), and Toy Story (1995), among others.

    "The story manages to connect the feelings that you get around the holidays, and the joy, the goodwill and everything, with those same emotions and those same sensibilities you get from Disney films," said Waititi. "I think they go hand in hand and it's the perfect match, and only Disney could have made something like this with me."

    The short was created in conjunction with global creative agencies adam&eveDDB and Untold Studios, and produced by Hungry Man. A melodic rendition of "Part of Your World," from the Disney classic The Little Mermaid (1989), can be heard throughout the short, highlighting the octopus' desire to explore the world above. This take on the fan-favorite song was recorded live by a 60-piece orchestra and mixed at the legendary Abbey Road recording studio.

    Tim Pennoyer, Director of Brand Marketing and the marketing lead who helped spearhead the campaign, shares with AWN, "The biggest challenge in bringing A Disney Holiday Short: The Boy & The Octopus to life was creating an octopus character that felt both realistic and emotionally expressive. Working with our amazing partners at adam&eveDDB and VFX experts Untold Studios, we knew we were asking a lot - an octopus with eight tentacles that could move independently, shapeshift, and even change skin color, all while conveying the warmth and charm needed for our story. It was the most complex character Untold Studios has ever brought to life, requiring an incredible level of detail in VFX to capture those intricate movements and transformations."

    He adds, "The octopus had to feel both lifelike and relatable, with a full range of motion that would make the character truly magical on screen. It was a huge technical and creative challenge, but we knew it was key to making the story unforgettable. We would spend hours talking through every single scene with the octopus, debating each expression on his face, and even the sounds he would make. All these details bring him to life."

    Take a few moments to enjoy the film:

    "For generations, Disney has been an ever-present part of the holiday season all over the world, and this short builds on the enduring connection that so many families have with Disney during this special time of year," said Asad Ayaz, Chief Brand Officer, The Walt Disney Company. "We're thrilled to collaborate with Taika Waititi on this timeless story of childhood friendship against the backdrop of this magical season."

    A Disney Holiday Short: The Boy & The Octopus is the latest creative collaboration between Waititi and The Walt Disney Company. The acclaimed filmmaker directed Marvel Studios' Thor: Ragnarok (2017) and Thor: Love and Thunder (2022), Searchlight Pictures' Jojo Rabbit (2019) and Next Goal Wins (2023), and executive produced Hulu's Reservation Dogs, What We Do in the Shadows, and the forthcoming limited series Interior Chinatown.

    Source: The Walt Disney Company

    Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.
  • WWW.AWN.COM
    Marvel Reveals Thunderbolts* Special Look Trailer
    Florence Pugh stars as the down-in-the-dumps assassin leader of the MCU's most unlikely band of misfits in the Jake Schreier film, hitting theaters May 2, 2025.
  • WWW.AWN.COM
    Cinesite Appoints Adipat Virdi as XR Executive Director
    The former Facebook Global Creative Product Lead for Immersive will lead and develop client relationships for new projects as the company expands its footprint in the immersive market.
  • WWW.AWN.COM
    Aldi, McCann, Psyop Team for Latest Kevin the Carrot Holiday Ad
    'Tis the season for a new Kevin the Carrot holiday advert! Since its inception in 2016, Aldi and McCann Manchester have collaborated each year with industry powerhouse Psyop, with fans looking to the annual release as the initiator of the most wonderful time of the year.

    Created in association with Riff Raff Films, this year's installment sends the viewer on a caper - braving the broiler, precarious mince pies, and other cuisine-related booby traps - as Kevin and Katie try to save the Festive Spirit from the nefarious thieves at Humbug Headquarters. Concluding with yet another feather in Kevin's cap, the Festive Spirit is set free and fills empty tables with holiday feasts, underscoring the ever-present message of celebrating the cheer of community and the magic of bringing families together. Enjoy!

    Last year's Kevin and the Christmas Factory was named 2023's most effective ad, according to System1's data, although the latest installment may be the team's most ambitious chapter yet.

    "Kevin the Carrot's world just keeps getting juicier, and the team at McCann and Aldi are the zest we need to keep things fresh as we cook up new ways to expand the Kevin-verse," said Psyop directors Todd Mueller and Kylie Matulick. "With a decade of Kevin marinating in all our minds, he's got so much more room to sprout! We're ready to serve up more flavor-filled adventures and show just how far Kevin can peel ahead!"

    "Kevin the Carrot is more than a character; he's become a cherished tradition and a holiday icon," added Psyop EP Jim Brattin. "Each year, we're inspired to create something even more memorable, and this season's film is truly our most enchanting yet."

    Brattin executive produces, while Amy Fahl produces. VFX are created by Cristina Camacho, Steve Hallquist, and Billy Morris. Thomas Sali crafted the 2D animation, with Matthias Bauerle serving as 2D supervisor.

    Source: Psyop

    Journalist, antique shop owner, and aspiring gemologist L'Wren brings a diverse perspective to animation, where every frame reflects her varied passions.
  • WWW.AWN.COM
    Star Wars Film Trilogy in Development at Lucasfilm
    Simon Kinberg will write and produce alongside Kathleen Kennedy; set after the events of Star Wars: The Rise of Skywalker, the trilogy moves past the Skywalker Saga and will follow a new set of characters.
  • WWW.AWN.COM
    Mass Effect Series in Development at Amazon MGM Studios
    Fast & Furious 9 writer Daniel Casey will executive produce and pen the script for the first adaptation of the fan-favorite sci-fi video game.
  • WWW.AWN.COM
    Disney Drops Final Trailer for Mufasa: The Lion King
    Barry Jenkins' all-new film takes audiences back in time with a host of fan-favorite characters like Mufasa, Scar, Sarabi, Rafiki, and Zazu before they called Pride Rock home; the movie blends live-action filmmaking techniques with the latest photoreal CGI; coming to theaters December 20.
  • WWW.AWN.COM
    CEO Stefan Danieli Resigns from Goodbye Kansas Group
    The exec will remain at the company for nine months to support the search for a new leader, which the Board says has already begun.
  • WWW.AWN.COM
    HBO Drops First Look at Spooky Drama Series IT: Welcome to Derry
    The IT prequel, from Warner Bros. Television and filmmakers Andy Muschietti, Barbara Muschietti, and Jason Fuchs, debuts in 2025 on HBO, and will stream on Max.
  • WWW.AWN.COM
    Autodesk's Wonder Dynamics Debuts New Wonder Studio AI Tool
    Powered by Video to 3D Scene technology, the Wonder Animation tool turns any video sequence into a 3D-animated scene via AI reconstruction; the beta is available now to Wonder Studio users.
  • WWW.AWN.COM
    Toho Greenlights New Godzilla Film
    Oscar-winning Godzilla Minus One director and writer Takashi Yamazaki will return to helm the upcoming project, as well as handle the VFX.
  • WWW.AWN.COM
    'Game of Thrones' Film in Development at Warner Bros.
    After years of resistance from execs, a feature set in the universe of author George R.R. Martin has been greenlit; no plot, cast, or crew details are currently known.
  • WWW.AWN.COM
    Disney Establishes AI Oversight Unit for Responsible Tech Research, Advancement
    A newly created Office of Technological Enablement will be headed by former Walt Disney Studios CTO Jamie Voris and will explore opportunities and risks associated with AI, mixed reality, and other current and emerging tech.
  • WWW.AWN.COM
    Chaos Releases V-Ray 7 for 3ds Max
    Chaos has released V-Ray 7 for 3ds Max, adding Gaussian Splat support along with more than 20 new features and improvements, from creative scatter options to firefly removal.

    "Version 7 continues V-Ray's leadership in both visualization and visual effects, with innovations that accelerate and extend what's possible for creative professionals," said Phillip Miller, VP Product, Solutions for Artists at Chaos. "V-Ray 7 also improves extensibility for custom pipelines while further cementing the Chaos ecosystem with design workflows from Enscape, previsualization with Chaos Vantage, and virtual production with Project Arena."

    3D Gaussian Splats, an immersive technique increasingly popular in creative industries, now give users a fast way to create realistic 3D environments from photos or videos. V-Ray 7 offers the first native support for Gaussian Splats with ray-traced rendering, so users can place their scenes within detailed environments with accurate reflections and shadows, reducing the effort to put projects in context or on location.

    Immersive, interactive virtual tours can be uploaded from V-Ray 7 and assembled in Chaos Collaboration to create panoramic experiences with automatic hotspot generation. Each tour can be customized by adding floor plans, personalizing hotspots and transitions, or including contextual details and design elements.

    The V-Ray Frame Buffer (VFB) expansion helps users do more in one place. New custom-shaped render regions enable artists to render just one or many parts of their frame with custom-drawn shapes. A vignette layer camera effect is now included that can also be customized by shape. The VFB also includes new color-correction presets that let users explore different looks as they refine their design.

    Additional features and improvements include:

    Chaos Scatter Aids
    Instance Brush - The new Chaos Scatter Brush can easily populate instances within a scene with more precision. Users can add extra details or quickly remove unwanted areas.
    Distribution Maps - Users can switch between different density styles using the new Chaos Scatter distribution maps library, helping designers add realistic looks and experiment with instance distribution.
    Faster Scatter-Heavy Exports - Chaos Scatter is now managed procedurally by V-Ray 7 Standalone, enabling massive scattering results to export in minimal time when network rendering or submitting to Chaos Cloud.

    Better Lighting
    Firefly Removal - V-Ray 7's new algorithm automatically detects and finishes unresolved pixels, known as fireflies, during bucket rendering, reducing the time needed to produce final images.
    Enhanced V-Ray Sun & Sky - V-Ray 7 offers an improved PRG sky model that can create more realistic images and immersive animations. V-Ray Sky can also now render various observer altitudes up to several kilometers.

    Chaos Cosmos Upgrades
    V-Ray Luminaires - V-Ray's expanding Chaos Cosmos asset library now brings more speed and accuracy to lighting fixtures, accurately distributing the light and properly illuminating the fixtures themselves. Most light fixtures within Chaos Cosmos are now Luminaires in V-Ray 7, making it easier to populate scenes with realistic lighting while also rendering results more quickly.
    Asset Variants Support - Chaos Cosmos has more asset variations to choose from, such as seasonal options for 3D vegetation models (e.g., summer or autumn leaves). Users can select desired variants and drag and drop them into their scene, or alter them freely after import.

    Accelerated Workflows
    V-Ray Profiler - Users can track the time V-Ray spends calculating shaders and volumes, exporting scenes, compiling geometry and displacement, and loading bitmaps. Once hotspots are located, users can optimize their pipelines for faster rendering. The Profiler can be paired with the Memory Tracker for more data and insights.
    V-Ray Lister Geometry Tab - V-Ray Lister gains a new tab for managing and tweaking V-Ray geometry objects, including V-Ray Proxy, Fur, Decal, and Clipper. With advanced filtering options, artists can navigate and manage settings, controlling multiple geometry objects directly from the Lister's UI.

    GPU Boosts
    Faster Time to First Pixel - New optimizations for scatter rendering, texture-heavy scenes, data uploads, and geometry compilation offer better production and interactive rendering experiences, as well as a faster time to first pixel.
    Caustics Support - V-Ray GPU now supports caustics, enabling realistic surface reflections and refractions in both production and interactive rendering. Based on photon mapping, the new Caustics solver is optimized to fully utilize GPU hardware, delivering faster results than a CPU.
    Out-of-Core Textures - Texture-heavy scenes can be rendered more efficiently, helping artists add detail to their scenes without sacrificing shading quality.

    Other Updates
    Selective V-Ray Scene Conversion - Users can convert selected objects, update materials with a new Material Process, or convert textures to .TX files for smoother use and better performance, all with the same toolset.
    Extended USD Support - V-Ray 7 now supports the latest version of USD for Autodesk 3ds Max (0.9.0), so users benefit from the latest features and enhancements.
    OpenPBR Support - Consistent shading results are achievable across applications with the OpenPBR Material support recently introduced with 3ds Max 2025.3. This brings a new shading model to V-Ray that promises to boost production efficiency by reducing the need for manual adjustments when switching between compatible renderers and applications.

    Additional information is available here.

    Source: Chaos

    Debbie Diamond Sarto is news editor at Animation World Network.
  • WWW.AWN.COM
    IATSE Voices Support for U.S. Federal Film and TV Tax Incentives
    The International Alliance of Theatrical Stage Employees (IATSE) has officially voiced support for U.S. Representative Adam Schiff's recent letter to the U.S. Bureau of Labor Statistics (BLS) and U.S. Bureau of Economic Analysis (BEA) requesting statistics on the United States' standing as a film and television production industry leader.

    Schiff's letter to both bureaus highlights the impact of international tax policy on American jobs and the urgency of introducing a competitive, labor-based federal production tax incentive to keep film and television made in America. In his letter, Schiff states that employment in film and television production has grown abroad over the past several decades, threatening job growth at home as more foreign countries provide meaningful production incentives.

    Countries like the United Kingdom, Australia, and others have recently expanded their federal incentive and subsidy structures to lure productions from the United States. This has been one of the forces attributed to a persistent economic downturn in the U.S. film and TV industry, with production in 2023 and 2024 down significantly compared to 2022. While states like California, New York, Georgia, and New Jersey have offered tax credits for production, contributing to local economies and job growth, they have not been enough, notes IATSE, to prevent productions from moving overseas.

    "The proposal to implement a federal incentive would level the playing field and address this imbalance," said IATSE International President Matthew D. Loeb. "We support the concept of a federal incentive for the creation of film and TV, provided the plan also has mechanisms to uphold labor standards. We are committed to saving America's entertainment industry, and we look forward to working with our members, local unions, allies, and lawmakers at all levels to get it done."

    IATSE has joined the Congressman in urging the BLS and BEA to gather and release data on the impact of foreign production incentives on U.S. jobs and local communities. The union is confident the data will reveal what members already know: that productions choose where to locate based on the incentives, infrastructure, and talent available; that productions directly and indirectly drive spending whose benefits ripple through local economies via the multiplier effect; and that legislation is urgently needed to save the cultural institution that is American film and television production.

    Download a copy of Congressman Schiff's letter here.

    Source: IATSE

    Debbie Diamond Sarto is news editor at Animation World Network.
  • WWW.AWN.COM
    Universal Pictures, LEGO Team for a Trio of Live-Action Films
    Jake Kasdan, Patty Jenkins, and Joe Cornish will each helm one of the untitled movies; a separate, live-action LEGO Ninjago feature will be penned by brothers Kevin and Dan Hageman.
  • WWW.AWN.COM
    Disney+ Drops Look Ahead Teaser for Upcoming Marvel Titles
    Get a glimpse at Marvel Studios, Marvel Television, and Marvel Animation titles, including Deadpool & Wolverine, Your Friendly Neighborhood Spider-Man, and What If? that are set to stream through 2025.
  • WWW.AWN.COM
    Framestore Serves Big Plate of Spaghettification for Deadpool & Wolverine
    One must wonder if Green Lantern is next on Ryan Reynolds' hit list for the Merc with a Mouth, considering he resurrected the title character from the cinematic debacle known as X-Men Origins: Wolverine for his latest actioner, Marvel's Deadpool & Wolverine. Participating in the blood-soaked comic book irreverence is Framestore, which was brought onboard by Production VFX Supervisor Swen Gillberg to provide previs, onset support, techvis, and postvis, and to digitally augment 420 shots that ranged from the brutally funny opening sequence to the psychedelic and psychotic third act.

    "When it came to the visualisation team, Swen was excellent in providing video briefs directly from the pitch room rather than relying on written communication; we met often and discussed ideas," explains Kaya Jabar, Senior Visualisation Supervisor, Framestore. "He also used our team to iterate on ideas and designs quickly, lens up concepts, and present them back to the other HODs [Heads of Departments] to ensure everyone was on the same page. Clear direction and reference were provided for most of the sequences in advance."

    "Having a direct line to Kaya made it even easier to check in to see what avenues had already been explored with Swen and Shawn Levy [director] before adding ideas of our own," states Matt Twyford, VFX Supervisor, Framestore. "We were lucky to have had a lot of quality time with Swen on the shoot, and having the previs/postvis on hand all the time focused the decision-making. This allowed us to work up the quality of our assets while FPS [Framestore Preproduction Services] fast iterated ideas, hooking into our assets as they developed so the visualisation and visual effects were constantly converging."

    "We relied heavily on motion capture to block out sequences quickly and ensure everything was grounded in reality," remarks Jabar. "From a purely technical perspective, managing our scenes on the Oner fight was really difficult with the number of bespoke characters on screen. The uninterrupted flow of the animation proved challenging for real-time playback in viewport, where we tend to live as previs artists."

    Two unique challenges on the film had to be resolved. "Firstly, the Cassandra/Paradox hand intersection, and then the Oner, a five-thousand-frame continuous stunt fight in the City Street set environment," notes Twyford. "Cassandra's [Emma Corrin] power was shown across multiple closeup and long shots where we deformed Paradox's [Matthew Macfadyen] whole head and face by pushing Cassandra's hand through it. This sequence required a huge amount of upfront rework for our creature [also human!] pipeline, but the payoff as one of the big squirm-in-your-seat moments was great. The Oner was more traditional in its technicalities, but huge in its project management scope. Over 80 five-thousand-frame takes of motion control, shot exterior through a British winter, generated over a thousand individual artist tasks."

    Houdini was introduced into postvis to help with the timing of the characters and effects. "As we are part of the full Framestore pipeline, this was more a workflow than a tooling challenge," observes Jabar. "For the Oner, we designed a way to count the actors on screen at render time and write out a HUD that helped with techvis and planning the actual shoot."

    One major adjustment for the visual effects workflow and pipeline was for the digital double of Paradox. "We knew how to make a fully realistic double with full performance in closeup, but to then shove a hand through that head in a photorealistic and interactive way was not something we normally build for," states Twyford. "Although these challenges are usually overcome with time and talent, we knew that the current tools for skin simulation are very much at the back end of the pipeline; this meant we would be showing the skin simulations right at the end of the visual effects process and frighteningly close to the deadlines, especially if any major changes were going to happen. We decided to create a new process where we moved the skin distortions into the animation rigs. This meant that the animators actually animated the skin themselves rather than it being driven by simulations run after the animation. The result was fantastic extra value added by the animators, and we were able to show the result right at the front end of the process, getting great early feedback from the filmmakers. The effects artists then focused the post-animation simulations on the fine detail creases and stretches, eyebrow/eyelash, and hair interaction."

    There was no shortage of complex shots to be visualized. "For something like the Cold Open we used motion capture of a stunt performer based on initial storyboards," remarks Jabar. "We then pitched new ideas on top and fleshed out the cut once stunts had rehearsed the new ideas from the vis. We also helped inform special effects for the carousel section where Deadpool spins, dispatching TVA agents in the centre of a fast circular dolly. The shot with the most iterations is definitely the Oner, where we pushed beyond version 200 even in postvis, assembling all the elements and also iterating on the final moment where our heroes jump out of the back of the bus to nail the poses and slow-motion. This shot was also the most intensive in terms of animation and technical visualisation time for previs, as we wanted to really work out every single actor on screen and their action to help guide stunts on the day, while leaving it loose enough to allow them to flesh it out further."

    "The project covered the full gamut of visual effects work, from bluescreens, crowd duplication, stunt enhancement, and simple and complex environment top-ups to photoreal simulations, pseudo-science simulations, and creature and digital double work," states Twyford. "It was a show that had all the departments busy and challenged."

    The wide range of visual effects produced for the film also included spaghettification.
Spaghettification was introduced originally in Loki Season 2 to show the universe breaking down into strands after major disruptions in the timelines. Only loosely linked to the scientific concept of the same name, it worked well in the Time Ripper sequence as it was established Marvel science and suited the story and visual dynamic. According to Twyford, We had originally developed the look and shots in Loki, so it was great to see it again in another environment as it has such a powerful and dark overtone when you see it develop through a scene. It requires subtle and clever animation by the effects simulation artists and complements the foreground action as it brings darkness and intimacy to the action. The opening title sequence was shot in the UK during summer, which meant that the location had to be practically and digitally dressed with snow. We worked up the dressing to height, added in some falling snow and balanced through the sequence in grade and atmosphere for continuity, reveals Twyford. Wolverines skeleton was a mix of practical and CG. We swapped out various pieces or the entirety of the TVA Minutemen to enhance the violence. Then came the blood. There was no practical blood shot, so everything is a CG simulation custom designed for every impact. The brief was to overdo it at first; some of the initial blood fountains were hilarious but also a bit too ridiculous. The heavy use of bullet time and speed up allowed us to create some beautiful shapes with the blood against snow influenced by Jackson Pollocks work. Technically, this a big challenge as the blood and debris have to interact fully with the environment, the characters and the props. Everything was tightly body tracked, and continuity damage carried through the edits building up to the final full reveal of the carnage. The Time Ripper destruction sequence went through numerous iterations as it evolved. 
"Many of the key elements, like Cassandra's destruction, Deadpool and Wolverine's damage, and the environmental effects both in the upper control room and the chamber below, were developed dynamically as the edit, performances and visual effects all started coming together," notes Twyford. "The only simulation we knew exactly where we were going with was Wolverine's costume explosion, where the postvis was so awesome we followed their lead. The internal burning for Wolverine was a look developed by the comp team, with the cracks and debris coming from effects simulations. Deadpool was partially replaced with our digital double to allow us to blast hot light from inside his body and flare the fibers of his suit. Cassandra died in a million spectacular ways in the ongoing atomization look development of our effects team. We had a fantastic, high-quality digital double of Emma Corrin, including all the skin and subdermal layers through to the skull and the brain. This was blown up, atomized, wind flayed, sandblasted and deconstructed in many unpleasant ways trying to find the right feel for a comedy film where the baddie dies horribly. The final look incorporated an internal plasma overheat with an external sandblasted skin. Thankfully, it was all over quickly in the final edit." Various Deadpool variants make cameo appearances, including one of a canine persuasion. "Dogpool was a digital asset we built to cover any missing performances in the shoot," explains Twyford. "Little Peggy and her trainers, though, were absolute superstars on set, and we got footage for every shot wearing her costume and booties. The original concept did not have her wearing a mask or goggles, and we were asked to try and put some doggles on her. This turned out to be massively popular and allowed us to rework the optics to give the googly-eye effect; this in turn allowed us to animate her eye performance and open up a new level to her character."
In the Deadpool Corps sequence, her doggles, eyes and facial hair are CG over her plate performances. Pinewood Studios in London provided the street environment used in numerous scenes. "It included a whole block with the first two stories of buildings, all surrounded by bluescreen," states Twyford. "Although it was influenced by the look of New York, it also has a more generic city feel, with aspects of Vancouver and Boston. Our role was to extend the set upwards, create a digital city extension for the midground areas, and then use a digital matte painting for the distant city horizon. It needed full city life, including pedestrians, traffic and believable infrastructure. The set top-ups were custom digital models to match Ray Chan's [production designer] set design, and the city extension used assets previously seen in Marvel productions along with street dressing from our library, all laid out to art department references. Cars were a mixture of existing assets and new builds of vehicles to match set cars and buses. All the pedestrians were bluescreen elements shot specifically for the scenes and then dressed into our CG build with our custom Nuke particle tools. Everything was then lit and redressed for each day, twilight and early morning scenes." In the film, blood and gore were not in short supply. "All the blood and gore were CG simulations," Twyford notes. "With the amount of retimes and big camera moves in the plates, we were wary of trying to force in filmed elements, especially when interacting with everything in the scene. Characters, props and the environment lidar were tracked tightly, and simulations run pre-retiming to give us a starting reference. Then we creatively tweaked to get the most interesting shapes and framing that worked with the cameras and editorial timing. Once we had a good overall simulation, we worked up the secondary simulations of splashes, soaking into fabric, landing in snow, and specific rivulets and drips."
"The initial brief was to use a comedic amount of blood, so we kept it as clean liquid with no bits or chunks to reduce any unpalatable goriness," Twyford continues. "A good amount of time was spent fine-tuning the surface tension characteristics to allow us a slightly cartoonish, graphic feel to the flying blood shapes, and with a tight shutter angle, every frame looks like it came directly from the pages of a comic." "Our biggest challenge was making sure we always kept one eye on who these characters were, in terms of the timing of the animation and the design of all our elements," reflects Jabar. "Everyone truly loved what we were trying to achieve, so we wanted to make sure we never lost sight of that despite the volume of work and the complexity. We were on the project for 18 months, and I wanted to ensure I kept largely the exact same team throughout. As a supervisor, ensuring everyone was motivated and rested enough to keep bringing fresh perspectives and their best work was challenging, but so rewarding, and I think that really showed on screen." Paradox proved to be the biggest technical and creative challenge because of Cassandra pushing her hand through his face. "The reference was just a couple of the original comic book frames, and our job was to make it photoreal, close up, on a performing actor across a dozen long shots," remarks Twyford. "What the comic book frame did give us was the level of distortion and key moments, like the finger coming out of his nostril and eye socket. We then had to design the movement and how her hand reacted to the skin, tendons, bones and cartilage of his face in a realistic way. The result is one of the most memorably uncomfortable, seat-squirming movie moments, and I hope everyone in the audience feels Paradox's pain." Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.
  • WWW.AWN.COM
    Getting It Right: The Carefully Calibrated VFX that Makes The Boys The Boys
Perhaps you've heard of Prime Video's hit Emmy Award-winning series The Boys. Based on the comic book of the same name by Garth Ennis and Darick Robertson, the series, developed by Eric Kripke, who also serves as an executive producer, recently completed its fourth season, with a fifth upcoming in 2026. The Boys is many things: a brilliant subversion of the superhero genre, a biting political satire, a tour de force of action filmmaking, a pitch-dark comedy, and a veritable gorefest of bloody violence, in which multiple exploding heads are but so many grace notes. For Visual Effects Supervisor Stephan Fleet, who has been with the show from the beginning, the multi-faceted entity that is The Boys has proved to be an often challenging, always engaging, and truly educational experience. While his specific duties include the rendering of such notable phenomena as flying sheep and human combustion, it's often the less flamboyant effects that require the most innovative and labor-intensive work. From the unexpectedly nuanced considerations that determine the appropriate volume of blood, to the visual and legal complications involved in the representation of screens onscreen, Fleet shared some of the central aesthetic and technical issues that inform his VFX stewardship. Dan Sarto: There's a wide range of visual effects that you produced for this show. I'm guessing they included a lot of things that were visually vivid, but not necessarily the most challenging, and others that may not have stood out, but actually were labor intensive, or groundbreaking in some way. Can you talk a little about that? Stephan Fleet: In the biz, there's something called an "amort" or amortization budget and, for visual effects, that can mean effects that repeat over multiple episodes. Like lasering is a gag that we do over and over again, as well as blowing people up. And those types of things become easier.
I'm not saying they don't pose their unique challenges, but they become more commonplace because you do them again and again. This is a show that does not actually have a lot of that. We tend to introduce a new superpower at least every couple of episodes, if not every episode, and there are just some quirky things like flying sheep, for example, or carnivorous chickens. (Note: Killer Sheep VFX produced by Untold Studios) So, to answer your question, it's all hard to me. The hardest stuff is the stuff that has to look 100% real. When you do something like a flying sheep, there's always going to be a slight suspension of disbelief with an audience, just because those don't really exist. However, cloning, or having a character be multiple versions of themselves in the same frame, was one of the hardest things we had to do this season. Audiences are really savvy in this day and age. I mean, influencers will make TikTok videos where they clone themselves, so people know how it's done. So now you have to do it in ways that make it harder; you have to do the impossible shots. For instance, the first time we see the cloning character, Splinter, that shot took about 16 hours to do using motion control. It was about eight hours of rehearsal and setting up the cameras with dancers and a metronome. There's no face replacement. It's all the actor playing every single character, with a moving camera, over and over and over again. And the funny thing is, I've seen a few people watch that shot now during the course of the show, and no one really thinks about it. Everyone just sees six of the same guy, and no one actually goes, "wow, that's a complicated visual effect." It's just so smoothly done that it looks like another piece of footage, but it took 16 hours to make. (Note: Splinter VFX produced by Pixomondo) Later on in the season, we have Erin Moriarty's character, Starlight, as a doppelganger of herself that she's fighting.
That was another really complicated scene because, while she's just talking to herself in a room, she does things like grab a water bottle out of the other person's hand, or she grabs her face and shakes it. So now you have contact between the two, and it's a clever blend of face replacement and plates. And then, when she's fighting herself, that is her playing both sides of it, again to a metronome, with heavy stunt choreography and a little bit of visual effects. So, these things that don't have a lot of blood or spectacle, but take a lot of visual effects, are by far the hardest thing to do. (Note: the Doppelganger VFX was produced by Pixomondo) Also, anyone who really pays attention knows that one of our motifs, and one of our means of exposition in pushing the story forward, is through people watching the media on monitors. All that stuff has to be heavily designed with motion graphics. Every bit of text on a TikTok has to be written out and vetted to make sure that it has the right timbre. And then on top of that, we're very particular with how the monitors look. DS: We just take it for granted when we see something like that on a show, because we all know what that looks like. But it's all very, very carefully created. SF: Yeah, we go to great lengths. I've developed a great appreciation for, and learned a lot about, just the legality of using things like TikTok and X/Twitter. There are ways to use this stuff for real, if you're using it in a certain way. We try and do that, but if we can't, we'll make our own thing that gives people an idea of what it is. We do our research and try to emulate a similar TikTok online: what would the comments be in this day and age? So, in our world, it's mirroring some very real political things going on in the real world, but in a slightly Bizarro way. (Note: the Hard Push VFX produced by DNEG) DS: You mentioned that there are some effects, like blowing people up, that you've done many times and that have become commonplace.
And there's always a little camp in it, a little dark humor. How do you arbitrate that? How do you determine what's too much or too cartoony? And have you arrived at a set way you do that, or is it a continual process of evaluation? SF: One of the joys in doing episodic work and seasons of things is that they have long lifespans. So, you do a season of something, and you learn from it. And so, some of the stuff that we did in Season 1 is picked up on by Eric and the writers, and then they actually write to it for Season 2. And then, in Season 2, something new comes along and I pick it up. You start building almost like a library. And then you also get feedback from audiences, and you get to lean into what audiences want. And so, you get this wonderful opportunity to do multiple seasons of this and build it up. We started doing a realistic amount of blood, and very quickly learned that that was not going to work. It wasn't legible. So, we just started pushing the blood more and more until it became almost this Jackson Pollock canvas of unrealistic blood. But the storytelling and the romance overtook the reality of the situation. And I realized at that moment that, while we are a show that touts itself as being grounded in many ways, blood is a conceit for us and a language that we use to tell the stories. It is one of those things in which we don't necessarily strive for realism. Ultimately, we've learned that people come to expect these heightened moments, and so we were able to lean into it and have a little bit of liberty. (Note: face punch VFX produced by Untold Studios) DS: What would you say is the most important skill or skills that you've brought to this project? What has served you best as a visual effects supervisor? SF: That's a great and really deep question, actually. I went to theater school for undergrad and I went to film school for a master's degree. Doing this show specifically has been my PhD for how to make a show, not just visual effects.
If I look back at Season 1 to now, and who I was as a person, I've become a very different person. I'm a passionate artist, I'm very good at what I do, and I think I have a good eye. But ultimately, I'm here to shepherd my team on the show, fantastic artists and a multitude of vendors throughout the world, to create this vision. And it's not something they really teach you in this industry. Unfortunately, when you first start growing and coming up and becoming a department head, there's no management school for visual effects supervision. When I first started doing this, out of a mixture of drive, a little bit of fear, and just a hunger to get it done, I could be a little angry or aggressive. And it's not that you're mad at other people, you're just trying to push the product forward, which can lead to this pressure cooker of insanity. And what I've learned as I've gone through is that other people have a lot of great ideas, and I love to listen to other people's ideas and bring them in and really build a team. It's not a kumbaya experience. It's a very professional experience, but it can be a professional experience that involves fun. I think one thing in visual effects that can be a problem is we can come in as the only department below the line that is responsible for stuff in post. You're on set with a lot of people that are not responsible for the outcome of that product in post-production. So that can make you very nervous. But if you go to another department and just give them an ultimatum or something, it just stresses them out more and doesn't solve a problem. But if you understand a little bit about what they're going through, you can come to the table and ask, how do we go through this together? There's so much that people don't understand about what goes into making the simplest of television shows, let alone a complicated show like this.
I would recommend that any visual effects supervisor, or aspiring visual effects supervisor, do their best to learn about and empathize with every other department and the people that they work with. Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.
  • WWW.AWN.COM
    Increase in UK VFX Tax Incentive Confirmed
The Chancellor of the Exchequer has confirmed an increased tax incentive for spending on VFX in the UK. Delivering Labour's first budget since coming to power in the July general election, Rachel Reeves announced that VFX spending in the UK will attract a net rebate of 29.25%, which will be exempt from the 80% cap on spending eligible for film and TV tax relief. This had been proposed by the previous government in March, but the early election meant that the uplift was not implemented. The Labour government has identified the creative industries as one of eight growth-driving sectors within its Industrial Strategy, and the VFX uplift is projected to attract an additional £175 million per year of spending and the creation of 2,800 new jobs. Earlier this year, the Treasury had proposed to exclude costs relating to generative AI from the VFX uplift. However, following consultation with the industry, this proposal has now been dropped. As requested by the UK Screen Alliance, the tax incentive has been moved up from April 1, 2025 to January 1, 2025. The move will avoid production delays and allow VFX companies to get work flowing as they recover from last year's writers' and actors' strikes. Claims for the rebate can be made from April 1. "The confirmation in the Budget that the VFX rebate will be available from the New Year is terrific news for the UK's visual effects companies," said Neil Hatton, CEO of UK Screen Alliance. "We know that productions are making decisions right now on where to place their VFX work for 2025 and beyond. Today's announcement means that these clients will be incentivized to place many millions of dollars of inward investment work with the UK's award-winning VFX community, creating considerable value for the UK economy." Source: UK Screen Alliance
Journalist, antique shop owner, aspiring gemologist, Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
  • WWW.AWN.COM
    'SNL' VFX Workers Unionize, Win Recognition with Unanimous Support
VFX workers for Saturday Night Live (SNL) are unionizing with the International Alliance of Theatrical Stage Employees (IATSE) and have won official recognition of their union. The group includes 16 VFX artists and leads who unanimously supported unionizing with IATSE. While SNL is known for its live televised segments, it also features several pre-recorded "digital shorts," which require editors and VFX workers to operate under tight time constraints; in 2017, Fast Company reported turnarounds could be as little as 12 hours. "Over the six seasons I've worked at SNL, I've seen the VFX department evolve from a small group to a tightly integrated, highly organized operation capable of delivering hundreds of demanding shots over a 24-hour period," said VFX artist Richard Lampasone. "It's an intense, collaborative, and extremely fun environment that constantly tests the limits of our skills, our versatility, and, after long days staring at a screen, our ability to form coherent sentences. Our work, like that of everyone else above and below the line, is critical to the show's success. We look forward to celebrating Season 50 by joining in SNL's decades-long tradition of supporting union labor, and to helping negotiate a contract that reflects the substantial value we add and makes ours a more accessible and sustainable career for years to come." "Working here is tremendously fun, chaotic, and hugely rewarding," added Danny Behar, VFX artist for SNL. "We work 15-hour days every Saturday, delivering renders before cast & crew start rehearsals, and ending after the show has finished broadcasting. Our department is essential to the show's success. For that and a multitude of other reasons, we deserve to have a seat at the table. We are the only department that currently does not have one."
"If we're going to continue working on the show, it is necessary for us to receive the basic entitlements offered to other units, like pay equity and stable healthcare," Behar continued. The VFX workers are following in the footsteps of SNL's editors, who began their unionization campaign in October 2022, ultimately resulting in the editors ratifying their first agreement in May 2023. This week, NBCUniversal management agreed to a similar process for recognizing the SNL VFX workers' union after the workers presented signed authorization cards demonstrating 100% support for unionization. "We deserve what every other department at SNL has; we deserve to be protected, we deserve to be represented, and we deserve to be on equal footing with the people we work directly side by side with," said VFX lead David Torres Eber. "SNL is a very stressful show to work on, while also being a very enriching experience full of creative problems to solve. We should be focused on those problems each week, and not on whether our insurance has lapsed or when we can schedule a doctor's appointment after the summer hiatus ends." The effort to unionize is part of a broader campaign by IATSE to bring representation to VFX workers across the entertainment industry, as positions in the field have not historically been represented. IATSE has made significant strides in recent months, securing union recognition for VFX workers at companies such as Marvel, Disney, Avatar, DNEG (Canada), and AppleTV. Those interested in joining the movement can visit the IATSE website for more information and to get in touch with organizers. Source: IATSE
  • WWW.AWN.COM
    Roland Emmerich to Develop Live-Action Space Nation Series
A live-action series adaptation of the Web3 video game Space Nation Online is in development from the transmedia IP's co-creators and co-founders Roland Emmerich (Independence Day, Stargate) and Marco Weber (The Thirteenth Floor, Igby Goes Down). The duo will also executive produce the project. Originally announced in June 2023, Space Nation Inc. was founded by Emmerich, Weber, and veteran game developers Jerome Wu and Tony Tang. With $50 million in funding, the four co-founders set out to build a first-of-its-kind transmedia IP spanning video games, online content, and TV/streaming. Space Nation Online soft launched on September 27, garnering high player numbers and retention. "As gamers around the world are joining us in the Telikos Cluster for Space Nation Online, I'm motivated to expand the game's story with a dive into the origins of humanity's exodus from Earth following an alien invasion," said Emmerich. "Working in parallel with Marco and the game development team to expand this new sci-fi universe has been a singularly unique creative experience, and I'm excited to continue exploring what's possible through cross-media storytelling." "We're at an important stage with Space Nation Online now live, and fans are beginning to experience how our IP will expand," added Weber. "It's been a rewarding process to explore how different entertainment mediums can come together to build a larger universe. Roland and I are eager to continue developing this new universe as we create a truly interconnected transmedia experience for a global audience." To support the game's launch, Emmerich and Weber teamed with filmmaker Martin Weisz (The Hills Have Eyes 2) to create a series of animated shorts featuring the Space Nation character Zoey, chronicling her ill-fated attempts to go viral online.
In addition to the three entertainment industry veterans' collaboration, Jess Lebow, lead writer of Space Nation Online, co-wrote the narrative alongside actress Winona Weber, voice of Zoey in both the shorts and the game. Check out the first episode, featuring the alien teen space pirate, now: Source: Space Nation Inc.
  • WWW.AWN.COM
    Cinesite Celebrates 10 Years of Innovation and Growth in Montreal
Cinesite Montreal has marked its 10th anniversary by releasing a retrospective showreel that highlights some of its most memorable work, from The Addams Family, Paws of Fury: The Legend of Hank, and Teenage Mutant Ninja Turtles: Mutant Mayhem, to Blitz, Black Panther: Wakanda Forever, Ant-Man & The Wasp, and No Time to Die. These projects, brought to life by the company's talented supervisors, artists, technicians, and R&D teams, have earned Cinesite Montreal numerous awards and recognition over the last decade. Since opening its doors in 2014, Cinesite Montreal has steadily expanded, establishing itself as a key player in the global animation and VFX industry. Over the past decade, the studio has delivered 60 VFX and animation projects. Watch Cinesite Montreal's 10th Anniversary Reel: Commenting on this milestone, Cinesite Montreal general manager Graham Peddie said, "Our 10th anniversary is a huge milestone that I'm incredibly proud of. We've seen progressive growth, both in the size of our studio and in the scale and complexity of our projects. All the people involved in our success to date have had a hand in establishing the groundwork for our future artists and collaborators to build upon. We're excited for what the next chapter holds." Hubert Bolduc, President of Investissement Québec, added, "I extend our warmest congratulations to Cinesite Montreal on their 10th anniversary. Their studio has not only been a significant contributor to the growth of our local animation and visual effects industry but has also showcased the exceptional talent and creativity of Montrealers on the global stage. Investissement Québec looks forward to continuing our partnership and supporting their future success." Beyond its industry success, Cinesite Montreal is deeply involved in the community. Partnerships with organizations like Opération Père Noël and the Black Community Resource Center highlight its dedication to social responsibility.
It has partnered with six different colleges and universities across Canada and is actively working to increase diversity and inclusion within the industry. Gretchen Libby, Director of Specialists for Media & Entertainment, Games, and Sports at Amazon Web Services (AWS), said, "Over the past decade, Cinesite Montreal has consistently delivered exceptional animation and visual effects that have captivated audiences around the world. Cinesite understands the power of using technology to bring stories to life, and we look forward to many more years of collaboration and creative achievements between AWS and Cinesite." As the studio looks forward to the next decade, Cinesite Montreal plans to continue building on its legacy of creativity and community engagement.
Upcoming VFX projects at Cinesite Montreal:
Blitz (releasing November 1, 2024): Experience the gripping tales of a group of Londoners amid the British capital's wartime bombings.
G20 (releasing in 2025): G20, starring Viola Davis as US President Taylor Sutton, is an action thriller in which terrorists take over the G20 summit. President Sutton must use her skills to defend her family, her country, and the world.
Michael (releasing April 18, 2025): The story of Michael Jackson, the King of Pop, explores the life and legacy of the iconic musician.
The SpongeBob Movie: Search for SquarePants (releasing December 19, 2025): Follow SpongeBob as he travels to the depths of the ocean to face the ghost of the Flying Dutchman.
Upcoming animation projects:
HITPIG! (releasing November 1, 2024): The upcoming British-Canadian animated adventure comedy is directed by David Feiss (Cow & Chicken) and Cinzia Angelini (Mila). The film comes from an original story by Berkeley Breathed, rooted in his 2008 children's book Pete & Pickles.
Breathed penned the screenplay alongside Dave Rosenbaum and Tyler Werrin; it is scored by Isabella Summers of Florence and the Machine.
Animal Farm (2025): The animated adaptation of George Orwell's Animal Farm is directed by Andy Serkis.
Smurfs (releasing July 18, 2025): Cinesite produces the upcoming collaboration between Paramount Animation, Nickelodeon Animation, LAFIG Belgium, and IMPS. Chris Miller directs from a script by Pam Brady. Ryan Harris produces.
Source: Cinesite
  • WWW.AWN.COM
    Paul Franklin Joins BeloFX as Creative Director
The Oscar-winning VFX supervisor and filmmaker, known for his work on Interstellar and Inception, as well as for co-founding DNEG in 1998, follows fellow co-founders Matt Holben and Alex Hope to the Canada, UK and India-based studio.
  • WWW.AWN.COM
    Ronald D. Moore to Showrun God of War Game Adaptation at Prime Video
Based on the critically acclaimed PlayStation video game franchise, the live-action series follows god-in-exile Kratos, who stumbles into an epic adventure as he attempts to spread the ashes of his deceased wife with his estranged son.
  • WWW.AWN.COM
    AMD Radeon GPUs Turbocharge New AI Tools in CG Software
Backed by AMD Radeon GPUs, AI tools are poised to transform the way that artists work. From video production to visual effects, rendering to retouching, new artificial intelligence (AI) and machine learning (ML) features in graphics applications speed up routine tasks and take the drudgery out of repetitive ones, leaving artists free to focus on their creative goals.
New AI tools speed up day-to-day workflow
One artist tapping into the potential of AI through AMD hardware is Mike Griggs, a Digital Content Creation Consultant whose clients include international businesses like the BBC and JCDecaux. "A powerful graphics workstation helps my productivity [by] accelerating the AI features of my content creation tools," he says. "I've been using workstations with AMD Radeon PRO GPUs to enable these new experiences." Griggs says that he sees the largest day-to-day improvements in post-production, thanks to the new AI features in applications like DaVinci Resolve, Blackmagic Design's video editing, color grading and VFX software. Neural Engine, the machine learning system available in the Studio edition of the software, speeds up a range of routine tasks, including audio transcription, video stabilization and object tracking, and can automatically generate masks and depth maps from source footage. The latest stable release, DaVinci Resolve Studio 18.6, helps Neural Engine harness the power of AMD GPUs. In tests, the AI-based mask generation and tracking tool Magic Mask now runs over 4x faster on a current high-end AMD Radeon RX 7900 XTX GPU than in the previous release.1 "[Magic Mask] enables me to create custom masks for my 3D work without needing to go back to my 3D software," says Griggs. "Artists such as myself can get results quicker than ever before."
How AMD GPUs power AI processing
AMD GPUs can accelerate these new tools thanks to their specialist AI hardware.
The RDNA 3 GPU architecture, used in AMD Radeon PRO W7000 Series and AMD Radeon RX 7000 Series GPUs, features two dedicated AI accelerators per compute unit. On top of that, the graphics memory capacity of workstation cards from the Radeon PRO W7000 Series makes it possible to process large data sets. With 48GB of fast GDDR6 memory, the Radeon PRO W7900 GPU can handle even large production assets without the performance hit of going out of core. And having a GPU in your workstation capable of accelerating AI tools helps to keep workflows local. Not needing to process jobs online avoids the need to upload commercially sensitive data to the cloud, reduces demands on network bandwidth, and provides more control over deadline-critical tasks.
Compatible with key graphics software
Nor is DaVinci Resolve the only CG application that can harness AMD GPUs to accelerate AI processing. Boris FX's Continuum plugins for VFX and motion graphics work include a range of AMD-compatible ML tools, for tasks ranging from blurring out faces to retiming video. Twixtor, RE:Vision Effects' video retiming plugin for software like After Effects and Premiere Pro, also features a new machine learning algorithm, again fully accelerated by AMD GPUs.
Topaz Labs' Gigapixel AI and Video AI upscale still images and video, while 3D software Blender uses AI to remove noise from renders generated by its Cycles renderer.AMD: accelerating creativityWith their dedicated AI accelerators, the GPUs of the AMD Radeon PRO W7000 Series and AMD Radeon RX 7000 Series can speed up these everyday workflows, while the additional GPU memory available in workstation cards makes it possible to process even complex production scenes on a local workstation, without the need to upload sensitive data to the cloud.For artists like Mike Griggs, the speed boosts reduce interruptions in creative workflows and make it possible to turn around jobs quicker.I'm glad that my AMD graphics workstation is already well-placed to make the most of these new opportunities, while providing real-world benefits to my business and clients today, he says.See creative AI tools in action:Find out more about how AMD Radeon graphics can enhance your creative output - https://www.amd.com/en/products/graphics/radeon-for-creators.htmlFootnotes1 Testing conducted by AMD as of September 19, 2023, on a test system configured with a Ryzen 9 5900X CPU, 32GB DDR4, Radeon RX 7900 XTX GPU, and Windows 11 Pro, with AMD Software: Adrenalin Edition 23.9.1, using the application Black Magic DaVinci Resolve 18.6 vs. Black Magic DaVinci Resolve 18.5. Data was gathered on the HD to 8K UHD 4x (Playback FPS @ 1080p Timeline), Speedwarp 10% (Playback FPS @ 1080p timeline), and Magic Mask Tracking Better (FPS). Performance may vary. System manufacturers may vary configurations, yielding different results. RX-997All performance and/or cost savings claims are provided by Mike Griggs of creativebloke and have not been independently verified by AMD. Performance and cost benefits are impacted by a variety of variables. Results herein are specific to Mike Griggs and may not be typical. GD-181.David Diederichs is Product Marketing Manager for AMD. 
His postings are his own opinions and may not represent AMDs positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied. GD-5RE: Vision Effects images courtesy of Clark Dunbar, Mammoth HD, Continuum images courtesy of Boris FX David Diederichs is Product Marketing Manager - AMD.
    Building the World of Percy Jackson and the Olympians
Since its premiere in December 2023, the Disney+ young adult fantasy series Percy Jackson and the Olympians has not only attracted a wide and enthusiastic viewership, but has been acclaimed by critics for, among other things, its faithfulness to the source material, its performances, and, notably for our immediate purposes, its worldbuilding. Based on Disney Hyperion's best-selling book series by award-winning author Rick Riordan, and starring Walker Scobell, Leah Sava Jeffries, and Aryan Simhadri, the series tells the epic story of 12-year-old modern demigod Percy Jackson (Scobell), who is accused by the sky god Zeus (Lance Reddick) of stealing his master lightning bolt. With help from his friends Grover (Simhadri) and Annabeth (Jeffries), Percy must embark on an epic quest to find the lightning bolt and restore order to Olympus.

To help with the visual component of the aforementioned worldbuilding, the creators were fortunate to have the services of a number of leading VFX studios, including London-based MPC and, critically, Industrial Light & Magic, whose groundbreaking StageCraft LED volume technology, which integrates real-time animated environments into live-action shoots, played a central role in the production.

As we get ready for a new season, which is based on the second book of the series, The Sea of Monsters, and is slated to be released in 2025, we spoke to Senior Visual Effects Supervisor Erik Henry about the highlights and challenges of Season 1.

Dan Sarto: Why don't you walk me through the different types of visual effects that we see across the season, and highlight the things that were most challenging, or that stand out for other reasons?

Erik Henry: Well, I'd have to start with the Minotaur, because for us that was a very challenging scene. It's hard to make something believable that runs on all fours, then stands up on its hind legs, and can mix it up. That's the sort of thing where, if you don't get it right, people are going to be turning it off. So, we knew right out of the gate (it's in Episode 1) that it was going to be super important. The challenge there was to believably work with the camera and the actors. We had a motion base to help us with that. Walker [Scobell], who plays Percy, was able to get on the back of the Minotaur to be thrown around and to have to hold on and slide around, because there's rain on set. We did that on a volume stage.

I always like to talk about the chimera in Episode 3, because that's just one of my favorite monsters: a mashup of a lioness and a goat and a snake. The success of that really comes down to the fact that we were able to take an entire set and just paint it all black and use flamethrowers. The special effects guys came in and gave us proper elements, because it's a fire-breathing creature.

The work done on environments by one of our trusted vendors up in Canada also stands out. All kinds of wonderful environments, from Medusa's Cavern, which they did as content for the volume, to all the great work for the Underworld. We had this great concept that the Underworld was going to be a massive cave on an almost planetary scale. And if you look at that, you're going to see that it's not sky. There are actual mountains up there and they have an alluvial flow. You get these lovely, almost spider-like outcroppings from each of the mountains that are buried in the mists.

DS: The chimera was really well done. I especially liked how the catlike mannerisms were brought out.

EH: That was done by MPC in London, and it was their crown jewel. They also did some work with Alecto, but this is where they really shined. It's in the gait: you can tell that's definitely how a lioness walks. They just hit the nail on the head. But it's also the eyes and how it opens its mouth. We added a little tongue, a snake tongue that comes out, but even that felt feline. It's really cool; I like that.

DS: Tell me a little bit about the LED stage work. You and ILM made fantastic use of the LED volumes to create a lot of big environments, but there's also a lot of intimate work within that.

EH: Jeff White, the supervisor at ILM, was the one who was heading up all the StageCraft work. They were the right choice to build the volume stage for us and to do the content. They did the content for all but one of the sequences, and a lot of the show's success comes from those volumes. The things that were shot on the volume are juxtaposed with scenes that are outside, that take advantage of real light in ways that the volume can't. That came from ILM telling us, "Don't do everything on the volume. Let's pick and choose the things that make it shine." And that's what we did.

One scene that was really remarkably well done is when Luke [Castellan, a son of Hermes] and Percy square off. It's kind of broken into two parts: first they're training, and then they fight for real. That's on a volume where we had a lot of trees that were live action on the volume stage and then, in the content, you had the same kind of trees. And there was a fireworks show going off above their heads that lit the whole place. It wasn't really a big thing, but the two scenes that are in that particular volume are so successful.

You always try to do a little bit of something that's live. We had puddles that would get the reflection from the volume; we also shot plates of water, actually in Southern California, and used those on the stage for the ocean that's out in the distance. It was a nice collaboration, to shoot the scenes that made the most sense, and bring those to life with the volume.

DS: Did you do any previs and, if so, what did you use it for?

EH: Previs played an important role, as you could imagine. In the very first scene with Percy and Alecto in the fountain in front of the Metropolitan Museum of Art, we used it because we were really interested in knowing what shots we could do. It's not a complete circle, so it enabled us to understand when we were going to come off the volume; then we would maybe tighten up or push in or something. It helped us tremendously in figuring out what we could actually do to stay on the stage.

DS: What about postvis?

EH: We did do a little bit, but, by and large, the planning goes into the volume and such. It's about things like whether the trees in the background are blowing in the wind. And, if they are, do we have some trees on the set that are going to be blowing in the wind as well? Does that all work? What does the rain look like? It's about testing the color balance between the foreground and the content. It takes a lot of time to build something, but essentially we're doing shots that would normally take 20-some odd weeks in post. We're just doing all that up front so that it looks right when the camera rolls.

DS: When you look back on the season, was there one big challenge that stands out from everything else?

EH: I guess I would go back to the Minotaur, because it's a perfect example of the general challenge we faced. The show had to be accessible to young kids and to older viewers. We wanted adults to be able to watch it. So, it couldn't be too scary for an eight-year-old, but it also couldn't be so silly that adults say, "Well, you watch it and I'm going to go do the dishes." We wanted it to be a family event, where parents were engaged, as well as younger kids.

And I think we succeeded, but it wasn't easy. In the case of the Minotaur, we went back and forth on whether close-ups of it roaring over Percy were too scary. We didn't want it to look like a zombie. [Executive Producer/Creator] John Steinberg said to me, "It has to have a little bit of teddy bear in it. Just always keep that in mind."

From a technical standpoint, the biggest challenge might have been a sequence that took place in the area near the pit of Tartarus. We had to shoot it on a stage, and it was supposed to be out in a vast, hilly desert. And so, you got really close to the sky that was created by the DP. I was really scared that that was not going to look like we were outside.

That it worked so well is a testament to Jules [O'Loughlin], our DP. He said to me, "I'm going to go really soft. I'm going to help you out as much as possible." And I remember telling him later, "I did not think we were going to pull it off, but it definitely looks like you're outside." I think to some degree it was helped by color, but it really did work. Sometimes you think you're going to have a real hard time with something, and then you're pleasantly surprised by the outcome.

Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.
    Adobe Celebrates 5 Years of Fresco, Now Available for Free
Adobe is celebrating five years of its digital painting and drawing app, Adobe Fresco, with the announcement of powerful updates ahead of Lightbox Expo, which kicks off today in Pasadena, California. The event brings together artists in animation, gaming, illustration, and live-action and runs October 25-27 at the Pasadena Convention Center. Get more LBX information here.

Here are some Fresco highlights:

Over the past five years, Fresco has transformed digital art through innovations that streamline artist workflows and enhance creativity with analog techniques reimagined for touch and stylus devices, offering creatives the following:

- Live oil and watercolor brushes, vector brushes, and thousands of pixel brushes
- The latest advancements in touch and stylus technology, with haptic feedback, tilt, barrel-roll, and squeeze support for Apple Pencil
- Cutting-edge motion features that make it easy to add eye-catching movement to artwork in seconds
- The user-friendly symmetry tool, which ensures precision and faster creation of complex compositions
- A vector trimmer to quickly remove intersecting vector strokes and clean up line art
- The paint inside tool option to fill an enclosed area without going outside the lines

Fresco's latest innovations deliver more power and precision for creators:

- Motion Presets, including bob, breathe, bounce, and more, help creators quickly add lightweight, eye-catching movement and animations to artwork in seconds
- Reflective and Rotational Symmetry helps creators achieve faster and more seamless creation of complex compositions
- The ability to take full advantage of ecosystem tools like the Apple Pencil Pro capabilities across workflows, including shortcuts with squeeze, new haptic feedback for key user actions, and barrel roll for more realistic brush strokes when the pencil is rotated

Adobe has also announced that, to further the company's mission of empowering creativity for all, it is making Fresco completely free to all users, so now anyone can discover drawing and painting with the support of Adobe Fresco and all its premium capabilities.

Visit Adobe's blog, "Celebrating five years of Fresco with powerful new updates that unlock digital drawing and painting for everyone," for more information and to see how artists are harnessing the power of Fresco in their creations.

Source: Adobe

Debbie Diamond Sarto is news editor at Animation World Network.
    The Many Challenges of Bringing Goosebumps to Life
Goosebumps, based on the popular series of young adult horror novels by R.L. Stine, premiered simultaneously on Disney+ and Hulu in October 2023. Published by Scholastic, Goosebumps is one of the best-selling book series of all time, with more than 400 million books in print in 32 languages.

In the series, which was created by Rob Letterman and Nicholas Stoller, a group of five high school students unleash supernatural forces upon their town (never a good thing), with manifestations ranging from zombies to a haunted mask to unpleasantly invasive worms. Unsurprisingly, given the shortage of reliable zombie actors these days and the well-documented difficulties of getting worms to take direction, the series depended on visual effects to realize its narrative goals.

With a new season set to premiere in January, we spoke with Senior VFX Supervisor Lawren Bancroft-Wilson about the many challenges he faced, in particular having to produce a wide variety of creatures and special effects for a single series, and the invaluable help he received from leading effects studios, including MPC, MARZ, and Pixomondo.

Dan Sarto: Why don't you start by giving me an overview of the visual effects for which you were responsible across the first season?

Lawren Bancroft-Wilson: One of the tricky things with Goosebumps was that we were taking four or five different stories and letting them play out in individual episodes, so that it almost feels like an anthology when it starts. So the first five all have our main characters experiencing their own story, and then it dovetails in the last couple of episodes. When we started to break it down, the work that was involved was really specific to each book. I mean, we were looking at a bipedal zombie person. Then it's worms, a very effects-heavy worm simulation. It required totally different strengths to get the shots done, and we had to figure out which of our vendors was best for each one.

Really, each episode was a different handoff. Episode 3 was duplicating, or twinning, so a character becomes like nine or ten of themselves. That's a slow process of figuring out how to do repeated takes, how to really work with the director to get the performances and interactions, so the actor's able to play each character well. There, visual effects are standing back and just saying this is what we need, and helping the director get what they need.

Episode 4, the worm episode, was a big one: trying to figure out how to create this massive simulation, including worms chasing someone on a motorcycle. There had to be ground interaction, and other effects on the environment. That one was definitely the hardest, because it was really testing what you can do in terms of heavy effects simulations on a shorter TV or episodic schedule.

Episodes 5 and 6 get into a bit more surrealist stuff. Those are episodes about this notebook, and about characters being trapped in the notebook. If a page rips in the real world, how does that affect our virtual, storybook world? We had to find a way to map effects. Say the page gets wet, and the ink starts to run: how do we translate that to the world we're in? If the page is crumpled, how do we see the walls crumble as the character gets away?

Rob Letterman, the showrunner and one of the creators, is very good at understanding that a lot depends on the supervisor you get, and so we really went and kicked the tires of each VFX company, saying, "You might be good at this, but who is it we're going to be working with at your company to help realize this?"

DS: Can you talk a little about some specific episodes and which companies handled them?

LB-W: The first episode was handled by MPC because of their animation background. The zombies and everything had to feel real. We needed to know that we could get realistic physics, even though they're zombie creatures. Realistic run cycles. You shouldn't be able to tell whether there's a stunt actor in there or a CG one.

Episode 2, the troll episode, went to MARZ, which is a really good creature company in terms of creating models. The visual effects supervisor was Cristian Camaroschi, who had actually worked on one of the Goosebumps movies. They were really great at building this troll model with great detail, which was able to show a lot of emotion through the facial animation, as well as transforming from a mask into a face.

I already talked about Episode 4, with the worm simulations. We used Pixomondo for that. We needed a company that had a good depth of FX artists, and we needed a supervisor who was able to crack all the different levels of the worms: when they're small individuals going into the body, and then a large worm creature that we haven't seen before. It had to perform with our characters, and then on top of that, we had to figure out how to do the FX simulations to make sure it all feels like it's constantly generating, that it's basically eroding and leaving worms behind. The supervisor was Carlo Monaghan, who joined our team for Season 2 because he was so great and Rob was just blown away by him.

Episodes 5 and 6 were a mix of companies working on the environment and the simulation stuff. We worked with Distillery for some of that, and they also came on to do our big environment snowstorm. They really knew how to do good environmental extensions, and they built a beautiful cliff, as well as all the effects simulations involved with it.

There are a lot of changes in the episodic world, where, after we've shot, we have to reevaluate what we're going to be doing. There's not always the ability to have everything 100% prevised and then just adopt the previs on the day. We get a lot of curveballs thrown at us through production. Environments change, locations change, and schedules change. We have to basically compress what would be a larger amount of storytelling into a smaller amount, and we have to refigure how we execute that in VFX.

DS: As far as practical effects versus visual effects, how much could you actually decide beforehand what was going to be done in-camera and what you'd have to augment, or do completely in CG?

LB-W: Rob and I are very similar in that we always want to try to do as much in-camera as possible, and we also want to always have that touchstone of seeing what something in-camera looks like. Even if we know we're going to change something, we want to attempt it; we want to see what it is. When you're working on a show where you're dealing with things that everyone sees in real life a lot, or that people have a familiarity with, it's good to get a touchstone so we can all start from the same place, and then we can creatively say whether something should be one way or the other.

For the football sequence, for example, we actually had players out there tackling and jumping. Then the real players were removed, and we put in our zombie players to do things that our stunt actors couldn't necessarily do. There were things that we wanted, like charges on the field, but we weren't able to use any actual flames on that field, so we had to add that. We always start off trying to do as much practical as we can, but we always try to make sure that what we're trying to do is going to be safe and we're not putting anyone at risk.

There's a scene where Harold Biddle is eating worms out of a bowl. We put a mix of produce, gummy worms, and other stuff in there, and he went to town eating these. Then we added some CG worms after to give a little bit of extra fight to them. There's stuff like that, where you could just say, "We're just going to do it all in CG and do it later," but the problem is you get into that uncanny valley. It always helps to have something filmed.

You want to make a really engaging and amazing thing, and sometimes that means we stick exactly to what we shot. Then there are other times where the story evolves in a way where you say, "Well, we actually need this extra thing, and we can't go back and shoot it, so let's do the visual effects of it." It's always a conversation. My entire career has been us sitting in a room, getting our marching orders or getting the script and thinking, "Okay, how can we do this? What can you do? I'll do this." We just figure out what works best and what's going to let the director get the best performance out of the actors. That's always a really big thing: we want to make sure that we're not in the way of performance, that the storytelling comes first.

I always think of visual effects as being a multiplier. How can we use it to multiply what we get elsewhere, in terms of time and money, and especially for special effects? It's like special effects can do one amazing thing, and then we'll do the next ones, which maybe have a quicker reset time. We'll steal a bit of that first amazing one and put it into the others, and just help everyone multiply what they do.

DS: Every visual effects supervisor I talk to is like, "Lighting, lighting, lighting." They always want to shoot something practical, if only to catch the lighting. Lighting on a spooky show is especially important. In something that's got this creepy lighting going on, does it add a layer of difficulty, or is it just always difficult?

LB-W: Lighting is always incredibly important. DPs have to be able to set up and get stuff going, and they are always trying to light how they envision a scene. We're always working with them to try to know exactly what their intentions are, so that we can carry those intentions in post. The DPs for this season, Tom Yatsko and Steve McNutt, like to play with very, very dark stuff. They want everything darker, darker, darker. I think it's great. A consistent and strong vision by a director of photography makes our job in visual effects much easier, because we know what look we're going for, and we can match that look. I think, for us, it's not whether something is brighter or darker. In the end, things tend to look quite a bit better when the lighting is very intentional.

I would say the harder thing about our series is that it includes a lot of fire. One of the key story points is about a character who died in a fire, and we see those fire elements come up quite a bit. Working in the dark and then having fire added in, where we had to do a lot of CG effects simulations, was a real balancing act, making sure that Tom and Steve could add some interactive fire to the plates. With that interactive, real fire as a baseline, when we add our fire, it feels like it lives in there.

DS: Last question: did you do any previs or postvis on this? Were you using storyboards? How did that work?

LB-W: We started off with some previs for the football sequence, to basically just try to get our layout of what the field is, what lenses we need to use to make it feel the proper size, and then figuring out how we have to do the animation. It's a level of techvis at that point. We did a lot of that throughout, and we did a lot of postvis.

Once it gets to postvis, it's a very short schedule, because our animation tends to run at the same time, before we're locked. We can call it postvis, but really we're adjusting the edit around the animation, on which Rob is very, very focused. You cannot get any animation by him that doesn't sync up, where continuity is not there, or physics isn't there. We were cutting as we were doing the animation. We did a lot of that just to make sure that performances were there, that the editorial made sense.

There's a constant back-and-forth between editorial timing, pacing, and our animators designing what's happening, with Rob essentially directing the performance of our creatures and everything with the editorial. It's almost like directing on set, saying, "Okay, let's do this take again." He gets that opportunity with visual effects, to continually redesign the postvis as we edit.

Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.
    Dino Dex Now Streaming on Amazon Kids+ and Prime Video
From the Emmy Award-winning Dino franchise, the live-action/CGI series follows 9-year-old Dex, who, with his neighbor Kayla, goes on hilariously fun explorations to study dinosaurs.
Pitch Black Names Manuel Ramírez President of El Ranchito
Pitch Black has named Manuel Ramírez president of Spanish VFX house El Ranchito, passing the leadership reins from co-founder Félix Bergés, who will transition to an advisory role while continuing to support the company's development and strategic direction. Pitch Black is the parent company of leading visual effects studios FuseFX, FOLKS, Rising Sun Pictures, and El Ranchito.

In his new role, Ramírez will manage the studio's daily operations and promote Spain globally as a central hub for visual effects. The 20+ year industry veteran previously served as studio director, contributing to the studio's operational and creative success. Ramírez's tenure at El Ranchito began in 2015, when he established and led the effects department. He has subsequently taken on leadership roles including head of VFX, managing operations, research and development, and pipeline development. With the VFX industry entering a new era of technological advancement, the studio looks to Ramírez to lead El Ranchito into the future.

"It's an honor to take on this new challenge," said Ramírez. "I'm excited to lead El Ranchito in this new phase, with the goal of strengthening our global presence and continuing to create an environment where creative talent can thrive."

"It has been an honor over the last 20 years to help lead the El Ranchito team to where it is today," said Bergés. "I feel very comfortable leaving the company in Manuel's capable hands, supported by an outstanding team that shares the same vision."

Pitch Black CEO Sébastien Bergeron added, "Félix has done an exceptional job in positioning El Ranchito as a prominent visual effects studio, not only in Spain but globally, delivering outstanding work for its clients. Manuel is the perfect choice to take the reins and guide El Ranchito into its next chapter."

Source: Pitch Black

Debbie Diamond Sarto is news editor at Animation World Network.
It's All About the Viscosity in FOLKS' The Umbrella Academy Season 4 VFX
While we are sad to see The Umbrella Academy end its highly enjoyable run after four seasons, we certainly weren't disappointed in how the show closed out its Netflix run. The hit series follows a family of former child heroes who have grown apart and must reunite to continue protecting the world. Stars include Elliot Page, Tom Hopper, and David Castañeda.

For the VFX team at FOLKS, who've worked on the series since its inaugural season, Season 4 presented their most significant creative mandate yet. Led by VFX supervisor Laurent Spillemaecher and creature supervisor Gabriel Beauvais, FOLKS was responsible for conceptual pre-production work to establish the different creatures' identities; postvis; final elaboration of complex shots; and creation of character and environment assets, animation, simulation, and compositing.

However, their most monstrous achievement was enhancing an environment for a Christmas-themed sequence, adding crowd characters, and creating digital doubles of the characters Ben and Jen in their sickly, pustulous versions. "Pustulous" is probably too weak a word to describe the couple's descent into a huge monstrosity. These characters had to be seen in close-ups, expressing emotions and gradually transforming into a complex, bulbous creature that ultimately morphs into a viscous blob to destroy the world. Don't you hate when that happens?

Season 4 focuses on the leadup to The Cleanse, which will rid humanity of errant timelines that have somehow overlapped and left remnants that a pair of rather insane, murderous academics have been collecting as proof. This all culminates with The Blob, an intelligent half-liquid, half-solid form that absorbs energy.

"The Blob presented significant technical challenges in FX simulation that required precise art direction to achieve the desired movement in each individual shot," Spillemaecher shares. "Beneath the thick, organic simulated envelope lies a cluster of solid masses, like organs or meat pieces, that move with collisions based on a tuned rigid-body particle simulation, with guides and controllers to adjust the speed and direction to the desired location. Then, we would simulate the surrounding slimy envelope, which had a complex subsurface shading system to be able to see through."

"We were involved really early in the process to develop the various creatures, long before principal photography," says Spillemaecher, describing when their Season 4 work began. "We developed a close collaborative relationship since the first season with the Netflix team and the overall VFX supervisor, Everett Burrell. We started with early scripts and, from there, started some sketches with our art department and creature supervisor, Gabriel Beauvais."

Work began in February 2023, with concept art for the Ben and Jennifer merge moment, alongside their distinct looks. According to Spillemaecher, the FOLKS team also spent time focused on R&D for the FX simulations to support the various creature stages. "By the end of March 2024, we delivered our final shots, completing a year of intensive VFX work. Our production and supervision team comprised 35 members, while we had 120 artists on board."

Early visual development started almost from scratch. Spillemaecher explains, "We actually had very little to start with, almost nothing! We kicked off our exploration using early scripts as our foundation. We also analyzed various films for inspiration, drawing from classics like Akira and John Carpenter's The Blob, while also identifying certain styles we wanted to avoid. This process of determining what not to do is just as crucial in developing our concept."

For FOLKS, the concept phase was critical, and it was handled in a number of ways, including classic paintings and photos, as well as 3D sculpting tests to discuss proportions, key poses, and how certain creatures would interact with their environment, move, or behave. "We also did a lot of FX simulation testing for skin surfaces, deformations, and how the creatures would eventually melt and become a blob (a half-liquid, half-solid conscious form)," Spillemaecher notes.

Summing up their Season 4 work, Spillemaecher says production of The Blob was the biggest challenge his team faced. "One of the major challenges was the evolution of the large creature, which transformed from a roughly 6-foot-tall, human-shaped figure with tentacles to a towering 100-foot monster. This required multiple asset builds and a highly adaptable internal rig system. We needed all surfaces to be movable, stretchable, and scalable simultaneously while still maintaining a coherent structure to support realistic animation behavior."

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.
  • WWW.AWN.COM
    Freefolk Shares KAOS VFX Breakdown Reel
    Freefolk recently served as a lead vendor on the long-awaited Netflix series KAOS, starring Jeff Goldblum as Zeus, a dark but humorous contemporary spin on Greek mythology.

VFX supervisor Steve Murgatroyd, CG supervisor Harin Hirani, VFX executive producer Meg Guidon, VFX producer Hannah Dakin, and VFX coordinator Regan Perry led the Freefolk team that created a stunning fully CG Mount Olympus, the series' opening shot and home of Zeus. In addition to the mountain and Zeus' palace, the team completed the surrounding environment, building the cloudscape, atmospheric fog elements, and distant clouds.

"Being tasked with not only the first shot of the series but also establishing Zeus' decadent dwelling was incredibly exciting and challenging," said Murgatroyd. "It needed to be both breathtaking and believable as well as feel immense, lavish, rich, and beautiful."

Murgatroyd explained that the mountain's surface was hand sculpted, based on references, with additional rocks and stones layered on top. He continued, "Each and every tree, plant, and bush scattered on the mountain was carefully considered, mixing up rich and vibrant greens of cypress trees with pockets of warm pink from the wisteria that decorate the walls of the gardens and mansion." In addition, the team built set dressing assets, such as fountains, statues, and garden furniture; all of these small details were key in making it feel like a real living place.

To execute the dramatic thunderstorms orchestrated by the king of gods, the VFX team controlled the composition of the clouds as well as the speed at which they developed. "We generated several cloud simulations in Houdini, starting from birth to over 3,000 frames; these were then randomly rotated, scaled, offset, and pieced together to create the larger storm structure," shared Hirani. "We were then able to retime these caches on a per-shot basis to get the speed we wanted.
Other cloud layers, like the distant ones and the surrounding base clouds the storm sat on, were done procedurally."

He continued, "One of the particular challenges for this sequence was that storms are usually seen from below, at ground level, where they appear dark because they are dense and block the light, but in this sequence we are looking at the storm from above. To tell the story we had to creatively darken the shadows of the clouds in compositing to make them appear dense and stormy and show their progression."

Check out Freefolk's VFX magic:

Other key work included creating the Frame on the river Lethe and the increasing number of people who passed through on their way to renewal, Prometheus' cliff, a huge concert stadium with crowd replication, Hera's collection of Tacitas' tongues, a CG gold statue of Zeus, and lavish flora and fauna, as well as numerous other crucial VFX.

KAOS is now streaming on Netflix.

Source: Freefolk

Debbie Diamond Sarto is news editor at Animation World Network.
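Hirani's per-shot retime of long cloud caches amounts to remapping output frames onto source frames of an existing simulation. The sketch below is a hypothetical helper, not Freefolk's pipeline; production tools such as Houdini's retime nodes also interpolate between fractional frames rather than snapping to the nearest one.

```python
def retime_cache(num_frames, speed, offset=0.0):
    """Map output frames onto source frames of a simulated cache.

    Given a cache of `num_frames` frames (e.g. a 3000-frame storm sim),
    return the source frame each output frame should read so the cached
    simulation plays back `speed` times faster (speed > 1) or slower
    (speed < 1) per shot, without re-simulating.
    """
    frames = []
    f = float(offset)
    while f < num_frames:
        frames.append(min(num_frames - 1, int(f)))  # floor to a whole source frame
        f += speed
    return frames
```

With `speed=2.0`, a 3000-frame cache plays back in 1500 output frames; with `speed=0.5`, the same cache stretches across 6000 frames, which is how one long simulation can serve many shots at different speeds.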
  • WWW.AWN.COM
    Ben Owen Appointed Studio Manager at The Yard London Office
    The French VFX studio The Yard has appointed Ben Owen as the new studio manager for its recently launched London office. With over 15 years of experience in the VFX industry, Owen has worked at Framestore, MPC, DNEG, One of Us, and more.

"We are delighted to welcome Ben to The Yard," said Laurens Ehrmann, founder and senior VFX supervisor of The Yard. "His extensive experience in both talent acquisition and studio operations, alongside his knowledge of the international VFX industry, makes him the ideal manager for our new London team. His deep understanding of the UK market aligns with our mission to elevate local talent and establish a stronger presence as a key player in the VFX sector."

"The exceptional skills of The Yard's artists, combined with the knowledge of our production and supervision teams, have consistently enabled the studio to contribute to outstanding projects that have garnered industry acclaim," Owen added. "I am thrilled to be part of The Yard's studio and contribute to its presence in London, a city recognized for its artistic innovation and high production standards. I am eager to apply my experience in fostering a vibrant, collaborative, and inclusive atmosphere where ambitious projects and local talent can thrive."

The Yard has produced VFX for both films and episodics, including high-profile projects such as The Rings of Power Season 2, Halo Season 2, All the Light We Cannot See, John Wick: Chapter 4, and Indiana Jones and the Dial of Destiny. By establishing a presence in the UK, The Yard aims to solidify its position as a leading VFX studio that combines international expertise with a commitment to fostering local talent.

Source: The Yard

Journalist, antique shop owner, aspiring gemologist: Laurn brings a diverse perspective to animation, where every frame reflects her varied passions.
  • WWW.AWN.COM
    Reallusions 2024 3D Character Contest Meet the Winners!
    Reallusion has just released its list of winners for its 2024 3D Character Contest, which ran May 1 to September 30, 2024. 267 submissions were received from 55 countries, showcasing the best 3D character design and animation created using Reallusion's powerful character production ecosystem: Character Creator for 3D character creation, iClone for facial and body animation, and AccuRIG for rigging.

"At Reallusion we are proud to see an influx of high-level character artists who are adopting Character Creator and iClone into their professional pipelines," said Enoc Burgos, Reallusion Director of Partnership Marketing. "The contest proved the viability of using Character Creator to create unique 3D characters without thematic limitations. We congratulate all who participated, and look forward to bringing more innovations to the industry."

Prizes Sponsored by Renowned Industry Leaders

The 2024 3D Character Contest was hosted by Reallusion and sponsored by industry leaders such as NVIDIA, Maxon, Rokoko, Noitom, KitBash3D, 3Dconnexion, KeenTools, TexturingXYZ, and Fox Renderfarm. Through this contest, artists produced a wide range of stylized work, pushing Character Creator's extensive pipeline, which allows artists to seamlessly integrate a new workflow designed for rapid creativity with zero sacrifice in character quality.

Watch the Overview Video

WINNERS

The 2024 Reallusion 3D Character Contest offered cash and prizes valued at over USD $47,500, thanks to the A-list sponsors who partnered with Reallusion. Here are the winners:

REALISTIC CHARACTER DESIGN

Under this category, entrants create a realistic human character in Character Creator by combining the character base with tools like ZBrush, Substance Painter, Marvelous Designer, Blender, Wrap, Texturing.xyz, and Maya.

1st Place: The Last Entry - by Konrad Hetko, 3D Character Artist

"Reallusion's tools were incredibly helpful, especially the ease of switching between Character Creator and ZBrush using GoZ.
But what I appreciate the most is FaceTools, which takes care of the most tedious aspects of working with morphs and dynamic normals by automating everything with just one click!" - Konrad Hetko

2nd Place: Shannaz - by Tom Babka, 3D Artist

"I really like Reallusion's tools, especially Character Creator. They help me speed up the character creation process tremendously. The textures, topologies, meshes, and morphs are of a high standard and very well made, as are the tools and add-ons for Blender, ZBrush, Unreal, or Unity. Any pipeline involving characters can be significantly sped up." - Tom Babka

3rd Place: Hard Battle - by Jeet Shah, 3D Character Artist

"Character Creator introduced me to a refreshing new way of creating 3D characters with a very non-destructive approach, with its hundreds of sliders for full-body modifications. I could easily try out new proportions and new ideas without fearing that I wouldn't be able to revert to the old one." - Jeet Shah

STYLIZED CHARACTER DESIGN

Under this category, entrants create a stylized character in Character Creator by combining the character base with tools like ZBrush, Substance Painter, Marvelous Designer, Blender, Wrap, Texturing.xyz, and Maya.

1st Place: ASTIN - Chemical Oasis - by Jorge Leonardo Ayala Arias, 3D Character Artist

"It's such an ease to have a base faithful to the concept, reducing the time to see the created assets and the character in action. Character Creator provides the possibility and versatility of adding bones for extra accessories, giving life to the character." - Jorge Leonardo Ayala Arias

2nd Place: Avice - A Knight of the Realm - by Kurt Boutilier, Digital Sculptor / 3D Print Artist

"Reallusion tools helped me create a base mesh for my character, as well as the ability to rig and pose her much faster than modeling or rigging a character from scratch."
- Kurt Boutilier

3rd Place: Seeker - by Duai Sebastian Florez, 3D Character Artist

"Character Creator 4 is an incredible software; it helped me save 50% of the time in my character creation pipeline. Additionally, it allows me to pose characters quickly without needing complex rigging methods. Its integration with ZBrush is so seamless that they almost seem like a single program." - Duai Sebastian Florez

REALISTIC CHARACTER ANIMATION

Under this category, entrants use iClone to create a realistic animation for a character made with the CC3+ Base. The character model can be customized with Reallusion tools and content or third-party software like ZBrush, Blender, or Unreal Engine.

1st Place: Lisa & Fia - by Robert Lundqvist, 3D Hobby Artist

"Easy and fast character creation with Character Creator and Headshot." - Robert Lundqvist (Watch Robert's full entry)

2nd Place: The Playful Deity - by Kay John Yim, Architect by Day / CGI Artist by Night

"Character Creator 4's Ultimate Morphs allowed me to create my character's foundation non-destructively, which seamlessly integrated into iClone for posing and animation. I utilized ActorCore motions as the base for the character animation and relied heavily on iClone's 'Edit Motion Layer' and 'Motion Correction' for user-friendly animation adjustments." - Kay John Yim (Watch John's full entry)

3rd Place: NeuroNexus - by Hamidreza Hamzehpour, CG Generalist - Film/Animation Maker

"Initially, I used the Character Creator base 3D model and sent it to ZBrush using the ZBrush Face Tool plugin to sculpt my character on the base mesh, then exported the rigged character to Cinema 4D. I've also used iClone to mix some animations, like walking to a stop, starting to walk, etc." - Hamidreza Hamzehpour (Watch Hamidreza's full entry)

STYLIZED CHARACTER ANIMATION

Under this category, entrants use iClone to create a stylized animation for a character made with the CC3+ Base.
The character model can be customized with Reallusion tools and content or third-party software like ZBrush, Blender, or Unreal Engine.

1st Place: HellGal - by Loïc Bramoullé, Cinematics for Indies and AA Studios

"Character Creator is always amazing to kickstart any humanoid character production, already supporting the really accessible facial motion capture in iClone, here with my iPad via LiveFace." - Loïc Bramoullé

2nd Place: Dance of the Flying Spirit - by Melis Caner, 3D Generalist

"As a 3D generalist, creating a character from scratch has always been challenging. Reallusion tools helped me build my character and animation significantly more easily, faster, and more efficiently." - Melis Caner (Watch Melis' full entry)

3rd Place: Nesting Realm Keeper - by Varuna Darensbourg, Artist & Game Dev

"For this challenge, I was able to complete everything using CC4, iClone, PS, and GoZ+. I needed to work as streamlined as possible. CC4 & GoZ+ really helped me finish my character efficiently and helped to quickly create props, like the dragon egg, pillar, and more. Another supermassive time saver was using the facial capture from the voice video, using AccuFace." - Varuna Darensbourg (Watch Varuna's full entry)

BEST ACCURIG CHARACTERS

Under this category, entrants use AccuRIG to turn their static characters, in any topology, into fully animatable projects. The winning characters showcase high sculpting detail with attractive poses and professional renders.

SPECIAL AWARDS

Special Awards and Prizes are reserved for outstanding achievement in certain areas of the competition. These awards are juried by the judges and add more to the winnings.

HONORABLE MENTIONS

The judging process for the 2024 Reallusion 3D Character Contest proved to be quite difficult and took many hours of deliberation. Besides the top three placements in Best Character Animation and Best Character Design, Reallusion also selected 48 winners for AccuRIG characters, Special Awards, and Honorable Mentions.
See the Winners Page for winner details and showcases. Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.