Befores & Afters
A brand new visual effects and animation publication from Ian Failes.
Recent updates
  • BEFORESANDAFTERS.COM
Rebelway's Black Friday Sale: Unlock 25% Off VFX Training Courses!
This Black Friday, unlock your full creative potential with 25% off all VFX training courses at Rebelway! Whether you're a budding VFX artist or a seasoned pro, Rebelway's industry-leading courses will help you master the art of visual effects.

Why choose Rebelway?
• Master industry-standard software: Houdini, Nuke, Unreal and more.
• Learn from industry professionals: gain insights from top VFX artists.
• Create stunning visuals: develop your skills in character animation, environment creation, and dynamic simulations.
• Lifetime access: learn at your own pace and revisit lessons whenever you need.

Get 25% off any individual course or 10% off any bundle. Here's how it works: use one of these coupons at checkout to get your discount:
• COURSE25 to get 25% off any course
• BUNDLE10 to get 10% off any course bundle (with bundles you save more than $1,000 on courses)
This offer is valid starting from today.

Featured VFX courses to take advantage of this Black Friday: Creature CFX in Houdini, Houdini for 3D Artists, Action Movie FX in Houdini, and Cinematic Lighting in Houdini. Featured coding courses: Python for Production, and Machine Learning. Other popular Rebelway courses to explore: Houdini Fundamentals, Compositing in Nuke, Intro to Unreal Engine, City Creation in Houdini, Realtime FX in Houdini, and Advanced Water FX.

Take a look at some of the standout projects created by Rebelway students, showcasing the impressive VFX skills they've developed. If you're ready to create incredible VFX projects like these, NOW is the best time. Visit Rebelway's website www.rebelway.net and enjoy 25% off any course with the code COURSE25, or 10% off any bundle with the code BUNDLE10. The offer lasts all week, so act fast!

Not sure which course to choose? Feel free to reach out to them at info@rebelway.net, and they'll be happy to assist you!

Brought to you by Rebelway: this article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
  • BEFORESANDAFTERS.COM
    First look: see previs and postvis reels for Dial of Destiny
Watch the reels for the first time right here. James Mangold's Indiana Jones and the Dial of Destiny has been out for some time, but we're now able to bring you a first look at previs and postvis reels from Proof. As you can see, the studio delivered previs for the film in an animated, comic-book style to help inform the technical planning and storytelling. The postvis reel is also fascinating: it shows how plates and bluescreen photography were filled in with temporary visual effects to help the editorial process. Check out the reels below.
  • BEFORESANDAFTERS.COM
You really don't want to miss this latest OTOY short as part of the Roddenberry Archive
It includes the return of William Shatner as James T. Kirk. You may have already seen some of the intriguing Star Trek-related shorts produced by OTOY as part of The Archive from the Roddenberry Archive. The latest is 765874: Unification, which celebrates the 30th anniversary of Star Trek: Generations. It launched on the web and via the Apple Vision Pro app. In it, we see live-action footage and CG images, with actors portraying characters like James T. Kirk and Spock during the shoot. According to OTOY's blog, performances came from Sam Witwer as James T. Kirk, with Lawrence Selleck as Spock. Witwer and Selleck were filmed in costume, performing as Kirk and Spock on set, aided by both physical and digital prosthetics, resulting in period-accurate portrayals matching the appearance of the characters as they originally appeared on TV and film at the time. Watching the short, and seeing a few behind-the-scenes images and videos here and there, it really boggles the mind how they handled the face replacement work (which, as noted above, they call digital prosthetics). The visual effects supervisor was Mark Spatny. Here's a fun video from production designer Dave Blass:

"For folks using terms like 'AI' and 'Deep Fake', #Unification was all done in camera with @SamWitwer performance captured along with his Kirk version LIVE. This next level of Digital Prosthetic technology used by actors and craftsmen will be huge. It's technology in the hands of" pic.twitter.com/OnDXQux3cD (Dave Blass, @DaveBlass, November 20, 2024)

Head to OTOY's blog post for more info.
  • BEFORESANDAFTERS.COM
Pixomondo breaks down its work on s2 of House of the Dragon in depth
Including previs, techvis, virtual production and final dragon animation work.
  • BEFORESANDAFTERS.COM
Watch Outpost VFX's breakdown for s2 of The Rings of Power
Environments and more.
  • BEFORESANDAFTERS.COM
    How ActionVFX assets were used in Godzilla Minus One
Smoke, dust and fire elements helped integrate the CG creature into final shots. Today, a special sponsored episode of the podcast thanks to ActionVFX. We're talking to Tom Cowles, Social Media & Community Manager at ActionVFX, about a few different things. Firstly, we run through one of the coolest uses of ActionVFX assets in recent times, and that is on the film Godzilla Minus One. Here, the VFX studio Shirogumi in Japan used ActionVFX assets for things like atmospheric smoke, dust and fire. And, of course, the film won the Oscar for visual effects this year. We also jump into ActionVFX's new subscription pricing, and how to see what kinds of new VFX elements are coming down the pipe. Keep an eye on ActionVFX and on befores & afters as their Black Friday sale begins on November 26th. Brought to you by ActionVFX: this article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
  • BEFORESANDAFTERS.COM
    The visual effects of Dune: Part Two
An exclusive interview with visual effects supervisor Paul Lambert. Today on the podcast we're talking about Denis Villeneuve's Dune: Part Two with visual effects supervisor Paul Lambert. Now, the first thing you should know is that Paul is part of the full issue of befores & afters magazine in print covering the film (the wrap-around cover of the latest print magazine showcases Dune: Part Two). He's featured along with a number of other effects crew members, including special effects supervisor Gerd Nefzer, second unit visual effects supervisor Patrick Heinen from Wylie Co., and Rhys Salcombe and Stephen James, VFX supervisors from DNEG. In this chat with Paul, we go through the big visual effects scenes in the film, including the opening eclipse moments, the sandworm riding, the attack on the spice harvester, how Giedi Prime was made, the battle of Arrakis, and how the Fremen's blue eyes were achieved. This is a really fun and informative chat, and I hope you might also grab the magazine as well to get the even fuller picture. This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world. Find the podcast above, and check out some behind-the-scenes stills from the film in the gallery.
  • BEFORESANDAFTERS.COM
    On The Set Pic: Red One
This is kind of cool: Dwayne Johnson previously shared a bunch of bluescreen shoot pics from the set of Red One.

"Shooting our new Christmas franchise #RedOne. Think JUMANJI meets MIRACLE ON 34TH ST meets HOBBS & SHAW with a dash of HARRY POTTER & IT'S A WONDERFUL LIFE. Let it sink in... RED ONE. @AmazonStudios @SevenBucksProd Jake Kasdan (director of the JUMANJI franchise)" pic.twitter.com/9nIPzqZUnI (Dwayne Johnson, @TheRock, November 29, 2022)
  • BEFORESANDAFTERS.COM
    A new official 30 min doco on the making of The Penguin is now out there
And it includes some great behind-the-scenes material on the VFX of the series.
  • BEFORESANDAFTERS.COM
    Behind the scenes of Wicked and its flying stunts
A new featurette is out!
  • BEFORESANDAFTERS.COM
    Issue #23 of the print mag is a FULL issue on the VFX of Dune: Part Two
Yep, a full issue all about the making of the film! Issue #23 of befores & afters magazine is now available (US store link, or find your store below), and covers the special and visual effects of Denis Villeneuve's Dune: Part Two, including the biggest sequences such as the opening eclipse encounter, spice harvester attack, sandworm riding and the battle on Arrakis. Featured are visual effects supervisor Paul Lambert, special effects supervisor Gerd Nefzer, second unit visual effects supervisor Patrick Heinen from Wylie Co., and DNEG visual effects supervisors Stephen James and Rhys Salcombe. The mag is full of behind-the-scenes and final imagery, including many before and after images. A fun aspect of this issue is the wrap-around cover showcasing a moment from the eclipse sequence at the start of the film. Remember, you can find the issue at your local Amazon store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands.
  • BEFORESANDAFTERS.COM
Watch DNEG's VFX breakdown for s2 of The Rings of Power
Including the major battle, and the Troll.
  • BEFORESANDAFTERS.COM
    World VFX Day 2024 is happening live from Tokyo
It's not long now until World VFX Day kicks off for 2024. This year it's happening over two days, 6th December and 8th December, live from Tokyo, right at the tail end of SIGGRAPH Asia. befores & afters will be there, and we're very proud to be a supporter of this incredible event championed by Hayley Miller. To find out more, or to get involved in sessions or as a sponsor, just head to https://worldvfxday.com/.
  • BEFORESANDAFTERS.COM
Delve into Digital Domain's visual effects for Agatha All Along
Magic, the broom chase and the ghostly effects. Disney+ and Marvel Television's Agatha All Along just wrapped up its first season. Below, I've reproduced Digital Domain's press release showcasing their VFX work on the show. It was led by Digital Domain visual effects supervisor Michael Melchiorre, working with production VFX supe (and Digital Domain alum) Kelly Port. It also includes some before and after imagery.

Magic Effects

Beginning with lookdev in November 2022, Digital Domain was tasked with establishing distinct looks and colors for the magic of each witch: Agatha Harkness (purple), Teen (blue), Rio Vidal (green), Alice Wu-Gulliver (orange), Lilia Calderu (yellow), and Jennifer Kale (pink). Each effect had to be unique yet maintain a consistent feel, with each beam of magic requiring a specific origin and destination. Endeavoring to keep the effect grounded in the 2D world, the compositing team came up with an artistic and creative solution, and they were able to literally create magic in Nuke. The team augmented the beams generated in Nuke with tesla coil footage and other practical elements. This technique allowed for different variations and quick iterations. Occasionally, artists hand-animated bits of energy, drawing inspiration from classic films like Ghostbusters and Poltergeist to match the series' authentic 2D, old-school vibe.

In addition to the magic effects, Digital Domain also handled the witches' desiccation effects, a task we were also responsible for on WandaVision. Traditionally, artists would have created CG digidoubles, but, in this instance, the time and cost would have been prohibitive for the high number of characters that needed to be desiccated. So, while the hero characters, like Agatha and Teen, utilized the same 3D approach seen in WandaVision, building full CG digidoubles for approximately 12 other witches requiring desiccation was not feasible. For the non-hero characters, the team developed a hybrid approach. The CG team started by building one master desiccated head asset based on our internal genman/genwoman rigs. The compers then took these generic renders and UV-mapped them to fit the unique facial features of each actor or actress. This allowed for a quick turnaround time, seamlessly blending compositing techniques with 3D enhancements to achieve the desired result.

Broom Chase

Drawing inspiration from Return of the Jedi and its speeder-bike chase, on-set production built an elaborate forest set. It used drones to capture 180-degree array footage that, when stitched together, could be used as backgrounds to insert our broomstick-riding coven into. Each coven member was shot individually on a blue screen, suspended by harnesses. Once the compositing team extracted each character from the blue screen, the animation team stepped in to help fly the witches through our digitally created forest environment that matched the practical set. This helped ensure they were all traveling correctly in 3D space, keeping their scale and position consistent and, most of all, keeping a natural and realistic appearance. The comp team took this flight information from anim and reprojected each witch onto animated geometry, rephotographing them with the camera lens that matched the selected array background. The lighting team rendered full 3D versions of our forest assets that were used to extend and enhance specific areas of the practical photography. This resulted in seamless composites that were often constructed from as many as eight to ten individual plates.
As they try to escape the forest road, the coven flies high above the treetops and across a blood moon. This scene initially began as a traditional painted backdrop on glass, a technique reminiscent of matte paintings used 40 years ago. Using this painting as our base, the team fleshed it out, adding extra details like distant stars and galaxies. Individual puppets dressed as each character were photographed on miniature brooms against a blue screen on set. To enhance the scene, we augmented each character to ensure their hair blew naturally and clothes moved properly. We hand-animated each character to fly smoothly in an arc across the moon, paying homage to E.T.

Digital Domain's previsualization team, led by Head of Visualization Matt McClurg, was particularly instrumental in planning and mapping out the complex broom chase sequence. On set, the production relied heavily on the previs created by McClurg and his team, using it as a key reference for camera angles and coverage within the technical confines of the shoot. This groundwork was invaluable for the Digital Domain artists during post-production, offering a blueprint for our VFX team on how the plates were intended to be assembled. Although the creative process evolved and adapted, the visualization provided vital touchstones that guided the VFX team's efforts. As the primary previs vendor, McClurg and his team delivered approximately 740 unique shots across 16 sequences, including the Morgue, the Castle, The Witches' Road, and several others, significantly contributing to the series' visual storytelling.

(Instagram embed: a post shared by Ali Ahn, @aliahn)

Agatha's Death & Ghost

One of the most challenging sequences was Agatha's death. Digital Domain's VFX work in these scenes required a delicate balance of beauty and darkness as Agatha sacrifices herself in a peaceful, somber moment. She is lifted by intertwining magic and gently placed on the ground, where she merges with the earth. The intent was to create a scene reminiscent of a nurse log, simultaneously beautiful and decaying. Although there was consideration of completing this sequence in 2D, most of the work was done in CG. As Agatha is gently laid to rest on the ground, her body withers and decays in a visually stunning manner, with grass, flowers, and mushrooms growing to envelop her. As the mushrooms mature and die, they are replaced by blooming roses and purple flowers, symbolizing the beauty that arises from her sacrifice. The environment team meticulously controlled every mushroom, flower, and blade of grass, while the effects team added subtle details such as falling pebbles and dirt being pushed aside as new flowers sprouted from the ground. A keen observer will notice that Agatha's toes begin to curl as she withers away, a subtle nod to witches that have come before.

In the following episode, Agatha is reborn as a ghost. The team drew inspiration from Ghostbusters, Poltergeist and other '80s ghost films to achieve the ethereal effect. Production was very specific about how ghostly she needed to appear and, most importantly, that we retain every subtlety of Kathryn Hahn's wicked performance. We began by extracting Agatha from each background and reconstructing the room behind her. Through a series of keys and mattes of targeted densities, compositors slowly layered Agatha back into the plate, paying close attention to the density and transparency of her form. On-set production had fans blowing on Agatha to simulate a gentle, ethereal breeze.
However, this was often blowing too fast for the gentle look we were after. To solve this piece of the puzzle, compositors strategically retimed and slowed down areas of her gown or hair that were moving too fast, again being careful not to alter the performance or facial features. These areas were then reintegrated into Agatha's ghostly form. The exceptional work of our roto/paint and lighting departments further elevated Digital Domain's VFX excellence. Their meticulous attention to detail and expert craftsmanship provided the perfect finishing touches, ensuring that every frame was seamlessly integrated and visually stunning. This underscored the team's commitment to creating a captivating and immersive experience for viewers. The Digital Domain team also created the skeletal face for the villain reveal, Agatha's mother's ghost, and more. Marvel Television's Agatha All Along is available to stream today on Disney+.
  • BEFORESANDAFTERS.COM
Watch Outpost's VFX breakdown for s2 of Pachinko
Go behind the scenes.
  • BEFORESANDAFTERS.COM
Watch Raynault VFX's breakdown for Deadpool & Wolverine
Environments, bluescreen comps and more.
  • BEFORESANDAFTERS.COM
    The making of Gladiator: a look back with VFX supervisor John Nelson
The Colosseum, tigers and more. Coming up this week is the release of Ridley Scott's Gladiator II. So, we thought we'd go back to the first Gladiator with the VFX supervisor of that film, John Nelson. John of course won the visual effects Oscar for Gladiator, alongside Tim Burke and Rob Harvey of Mill Film, and SFX supervisor Neil Corbould. In this chat we dive deep into a number of the big sequences, starting with that very famous Steadicam shot of the gladiators entering the Colosseum. We also talk about the Rome builds, the amazing tiger fight, and the forest battle in Germania. John shares a few fun memories from Oscar night as well. This is a really informative chat looking back at the VFX process from around the year 2000. I have to say also that Gladiator was one of those films that had an amazing DVD release, with very, very thorough VFX featurettes looking over the shoulder of artists at The Mill working on SGI machines, and working with tools like Softimage and Flame, so try and find those featurettes if you can. This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world.
  • BEFORESANDAFTERS.COM
The Polar Express is 20. Here's a fantastic behind-the-scenes document anyone can access
The mocap'd Robert Zemeckis film featured pioneering work by Sony Pictures Imageworks. Sure, a lot of people remember The Polar Express because of the Uncanny Valley. But the film (celebrating its 20th anniversary right now) was arguably one of the big game changers in the way it approached motion capture and virtual cinematography, thanks to the efforts of director Robert Zemeckis, visual effects supervisor Ken Ralston and the team at Sony Pictures Imageworks. The technical artistry of the film is packaged up in an extremely insightful behind-the-scenes document publicly available from Imageworks' website. It was presented as a course at SIGGRAPH 2005 and titled From Mocap to Movie: The Polar Express, presented by Rob Bredow, Albert Hastings, David Schaub, Daniel Kramer and Rob Engle. Inside you'll find a wealth of information about the motion capture process, animation, virtual cinematography, effects, lighting, stereo and more; even the original optical flow test for the film is covered. I think it's a fascinating read, and an important one in the history of motion capture and virtual production. Remember, the film came out in 2004; Avatar, which of course took performance capture much further, came out in 2009. Over the years, SIGGRAPH courses have been an invaluable resource for discovering the history of tools and techniques at visual effects studios. I love that this resource for The Polar Express exists.
  • BEFORESANDAFTERS.COM
See how JAMM orchestrated Josh Brolin's interaction with an orangutan on Brothers
Watch the JAMM VFX breakdown below.
  • BEFORESANDAFTERS.COM
Watch Untold Studios' VFX breakdown for that walrus Virgin Media spot
Digital walrus, digital boat, and digital water. (Instagram embed: a post shared by Untold Studios, @untold_studios)
  • BEFORESANDAFTERS.COM
    Watch breakdowns from Vine FX for Paris Has Fallen
Go behind the scenes.
  • BEFORESANDAFTERS.COM
    VIDEO: Introducing Reallusion AI Smart Search
In this new video, discover Reallusion's AI Smart Search, now seamlessly integrated into iClone, Character Creator and Cartoon Animator. It provides instant access to countless models and animations from the Content Store, Marketplace and ActorCore. Choose between AI-powered Deep Search for precise results or traditional text-based search for simplicity, all within the application. Brought to you by Reallusion: this article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
  • BEFORESANDAFTERS.COM
    On The Set Pic: Nautilus
The series was filmed in Queensland, Australia.
  • BEFORESANDAFTERS.COM
Watch Framestore's vis reel for Deadpool & Wolverine
Framestore's Pre-Production Services (FPS) delivered over 900 previs, techvis and postvis shots.
  • BEFORESANDAFTERS.COM
    More invisible VFX from Ripley
This time from ReDefine.
  • BEFORESANDAFTERS.COM
    The VFX and stop-motion animation of Beetlejuice Beetlejuice
Including some very fun Easter eggs about the stop-motion scenes. Today on the befores & afters podcast, a look behind the scenes of Tim Burton's Beetlejuice Beetlejuice. We start with visual effects supervisor Angus Bickerton, who shares some of the overall VFX challenges, including the putting back together of Monica Bellucci's Delores character, and the puppet for the Beetlejuice baby. Work by Framestore and One of Us is discussed. Angus makes particular mention of creature effects creative supervisor Neal Scanlan, too. Then we dive into the stop-motion animation work by Mackinnon & Saunders, including with Ian Mackinnon, stop motion supervising producer, and Chris Tichborne, animation supervisor. There's a lot of fun detail here about the making of the sandworm and its animation, and the plane crash. I love Chris' mention of the live-action reference video he shot of himself in a swimming pool, where he couldn't actually tell anyone he was with what it was for. Also, there's a fun Easter egg moment featuring Tim Burton on the plane. This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world. Listen to the podcast above. And, below, a video breakdown of the stop-motion scenes.
  • BEFORESANDAFTERS.COM
    The making of Slimer in Ghostbusters: Frozen Empire
A new VFX breakdown from Imageworks is here.
  • BEFORESANDAFTERS.COM
    Behind the scenes of the Iacon 5000 scene in Transformers One
Includes a few fun breakdowns and views of ILM Sydney.
  • BEFORESANDAFTERS.COM
    The making of Rook in Alien: Romulus
Legacy Effects and Metaphysic combined to make the character. Excerpts from befores & afters magazine in print.

At one point in Fede Alvarez's Alien: Romulus, the characters encounter a damaged android, Rook. Rook resembles the android Ash from Alien, played by Ian Holm, who passed away in 2020. With several scenes, and even dialogue, Rook would require a unique combination of a practical animatronic built and puppeteered by Legacy Effects, and visual effects augmentation by Metaphysic using machine learning techniques.

For Legacy Effects, the animatronic Rook build needed to happen fast. The studio would normally look to have four to six months to make such a thing, but here they only had two. One challenge, plainly, was that they did not have the actual actor to do a live cast or 3D scan with. "There were no existing molds of Ian from Alien," reveals Mahan. "They certainly made one because Yaphet Kotto knocks Ash's head off with a fire extinguisher. They certainly made something, but it doesn't exist. And if it does, no one wants to admit that they have it, because we searched."

Below, scroll through for behind the scenes of the Rook animatronic shoot. (Instagram embed: a post shared by Amy Byron, @shmamy_b)

Luckily, there was an existing cast of Holm from The Hobbit films, and certainly the original film from which to reference. "That cast of Ian was done many years after Alien, of course," notes MacGowan, "so all we could get from that really was the placement of his features. What we did do was make two clay portraits of his face. Andy Bergholtz and Jason Matthews did those, and then we scanned these sculptures. It was only a half face, so we scanned it and then Scott Patton digitally re-sculpted the whole thing."

The Rook animatronic was then ultimately built as a creature effect that could be puppeteered. The sets had to be constructed so that the team could be hidden underneath or allow for the choreography via slots in a table when Rook is shown crawling. The animatronic also featured a less-damaged right arm that a puppeteer could perform, and then a left damaged arm that was an animatronic puppet arm. "The whole body was actually a life cast of my body," says MacGowan, "that was then re-sculpted with all the damage and it was all put together."

Part of the performance is the delivery of lines, and for this an actor was cast and his voice recorded. Legacy Effects used the voice to program the moves onto their Rook animatronic for playback on set. This became the basis of the character, with enhancements made by Metaphysic for eyes and mouth movement, resulting in a hybrid practical/digital approach.

"It's pretty satisfying to bring back that character," reflects Mahan. "It wasn't easy. I think it's a very admirable attempt to resurrect somebody who's no longer with us to be in a movie again. I mean, if you would've told us when we were walking out of the theater having seen Ash in Alien that someday we were going to make a replica of him in a different movie, I wouldn't have believed it. It's very cool."

The VFX side of Rook

"Fede said to me, 'It needs to start as a puppet,'" shares VFX supervisor Eric Barba. "He said, 'It's a broken android,' so it didn't have to be perfect. It had gone through some hell, half its body's missing, part of its face is going to be missing, but we're going to have to augment it probably if we don't get it right in camera."

(issue #22, Alien: Romulus)

"I fell back on what I know of head replacement and recreating CG," continues Barba, who worked on groundbreaking digital human productions such as The Curious Case of Benjamin Button and TRON: Legacy. "Quite honestly, I thought I'd moved away from doing that because it's excruciating. I used to joke with people that I had property in the Uncanny Valley, and it's really difficult to get rid of. No one wants to live there, and when you finally move out of there, you really don't want to go back. And so I said, 'Look, we're going to make the best puppet we possibly can. We'll put a headcam on our actor that we'll cast, we'll get his performance and we'll get the audio from that performance. On the day, we'll play that back for the cast so that's what they're reacting to.' But it just means you have to have all those things done ahead of time and be happy with those choices. It's easier said than done, but that's exactly what we did."

As noted, Legacy Effects delivered a Rook animatronic puppet for use on set for filming in Budapest. The plan, then, was to augment the puppet's movements digitally. "Our puppet was never going to look photorealistic from its mouth movements," advises Barba. "We wanted the stuff coming out of its side, too. Initially, we settled on a 3D approach, but that approach became time consuming and costly, and we were on a modest budget and a shortened back-end post schedule."

"Fede felt strongly about the deepfake technology," adds Barba. "I actually brought a wonderful artist into post, Greg Teegarden. I said, 'Look, I want you to do deepfake just on the eyes for our preview screenings and let's see.' We were very lucky that we got the studio on board and we pulled the original 4K scan of Alien, of all the Ian Holm photography. We started building a model, and we used that model to do the initial director's cut. We had something there other than the puppet. And I can't tell you how exciting that was when we first saw stuff. ILM also did a test and it brought that puppet alive, and Fede felt even more strongly about how we should do this."

To finalize the Rook shots, knowing that budget and timeline were critical, Barba then called upon his former boss Ed Ulbrich, now chief content officer & president of production at Metaphysic, which has broken into the machine learning and generative AI space, including with digital humans. Says Barba: "I was super excited about what they could offer, and I said, 'Well, let's do a test and show Fede.' And that's what we did, and that's what led us to using Metaphysic, which really helped us solve a lot of problems."

"They have amazing AI tools that you can't do with just a deepfake or even without more 3D trickery," says Barba. "They could re-target our eyelines. They could add blinks, they could make adjustments from the head-cam footage. They wrote software to drive our solve, and then they could dial in or out the performance if Fede wasn't quite happy with it. Metaphysic was able to give us those tools, and I think they did a great job. We threw them a lot of curve balls and changes."

One particularly challenging aspect of Rook was the many lighting conditions the android appears in, as well as being displayed on black and white monitors on occasion. "The thing that surprised me the most was how well the monitor shots worked right out of the box," comments Barba. "Fede's mantra was going back to the analog future. Everything needed to have that look."

To get the look, the director sought out a specific JVC camera that had been used on Alien (1979). "Fede loved the look of the head-cam shots and monitor shots," notes Barba, "especially that burning trail you see sometimes in 1980s music videos. He said, 'Ah, we've got to match that.' So we did. We literally got that camera and we started shooting with it in principal photography. And then it broke! It lost its ability to focus. Everything started becoming soft. We were in Budapest and it was the only one we could find, and no one knew how to fix it. So, we ended up shooting it on other cameras and then Wylie Co. matched the look and did all the screens to keep it concise and cohesive throughout. They did a great job making that look work."

Relating also to those monitor shots of Rook was the fact that the animatronic had been filmed without a CCTV-like camera positioned in the frame, that is, without something that would show how a monitor shot of Rook would be possible in the first place. So, a camera was added in via visual effects. And the artist responsible for that work was none other than the director, Alvarez. (It's worth looking back at Alvarez's own early days in VFX and directing at his YouTube page, something he discussed in detail at the recent VIEW Conference.) Below, from VIEW Conference, a shot of the Rook animatronic without the camera in place, and one where it has been added to the scene.

Go further into Alien: Romulus in the print magazine.
  • BEFORESANDAFTERS.COM
    Video to 3D Scene tech showcased in Wonder Animation beta release
It's part of the AI toolset from Wonder Studio that lets you film and edit sequences with multiple cuts and various shots, and then reconstruct the scene in 3D space. Wonder Dynamics, an Autodesk company, has launched the beta of Wonder Animation, a new tool in the Wonder Studio suite that transforms video sequences into 3D-animated scenes. Capable of handling multiple cuts and complex shots, Wonder Animation reconstructs these scenes in a 3D space, and then makes them fully editable. It's now available to Wonder Studio users. You can find more info in Autodesk's blog post and in the video below.
  • BEFORESANDAFTERS.COM
    Gladiator II SFX supervisor Neil Corbould convinced Ridley Scott to reimagine the rhino
Special effects supervisor Neil Corbould found some old storyboards from the abandoned rhino sequence in the first Gladiator film and showed them to Ridley Scott, who decided to include the rhino in Gladiator II.
  • BEFORESANDAFTERS.COM
    On The Set Pic: Uprising
Credit: Lee Jae-hyuk/Netflix
  • BEFORESANDAFTERS.COM
    The visual effects of Percy Jackson and the Olympians
Visual effects supervisor Erik Henry on using ILM StageCraft, and on the many creatures of the show. Today on the befores & afters podcast, we're chatting to visual effects supervisor Erik Henry about the Disney+ series Percy Jackson and the Olympians. It's a show with a multitude of creatures, and also one that has utilized ILM's StageCraft LED volume and related tech for filming. Erik goes into detail about how various creatures were filmed on set with stuffies or partial make-up effects and bucks, and then about how the vendors created the final CG versions. Some of those vendors were ILM, MPC, Raynault FX, Storm Studios, Hybride and MARZ. Check out some shot breakdown stills below:
• Chimera previs
• Chimera background plate
• Chimera fire element
• Chimera final comp from MPC
• Minotaur motion base
• Minotaur animation dev
• Minotaur final shot by ILM
  • BEFORESANDAFTERS.COM
The making of U2's Vertigo
A newly released behind-the-scenes doco showcases BUF's work for the music video.
  • BEFORESANDAFTERS.COM
    How miniatures were made on Alien: Romulus
Miniature effects supervisor Ian Hunter and Pro Machina teamed up for the film. An excerpt from befores & afters magazine in print.

As well as creatures, Alien: Romulus features a number of space environments, the Renaissance research station, the Corbelan hauler, the Weyland-Yutani Echo space probe and other spacecraft. Some of these elements were initially considered as effects tasks that could be handled with miniatures, as VFX supervisor Eric Barba relates. "We really wanted to shoot as much as we could as miniatures, but at some point the budget and number of days you have to shoot pokes up. And then also the action we wanted to stage didn't lend itself to an easy motion control shoot with miniatures."

(Instagram embed: a post shared by PRO MACHINA, @promachina)

Ultimately, the Corbelan and the probe were built in miniature by Pro Machina Inc., with Ian Hunter as miniature effects supervisor. "We shot half a dozen shots, but in the end were able to use just a few shots of the Corbelan model," states Barba. "The probe was built practically, but it was entirely digital in the film. The thing is, we got such great models to use from amazingly talented model makers, and that gave us exactly what the CG team then had to do to match them."

In terms of the work by Pro Machina, Gillis and his co-founders, Camille Balsamo-Gillis and Reid Collums, partnered with Hunter to build the Corbelan hauler and the Weyland-Yutani Echo space probe as models for the film. Pro Machina came about from a desire to, Gillis explains, have under one roof the ability to build all sorts of miniatures, as well as the creature effects and props. "I invited Ian to come in as a freelance VFX supervisor and also keep working with Camille, who had been his producer on several projects already."

(issue #22, Alien: Romulus)

"When I told Fede that we also build miniatures and that I had two-time Oscar winner Ian Hunter, Fede's eyes lit up. I mean, it was not just me recommending Ian. I don't want to take credit for that because Ian's work stands on its own. In my opinion, he is the premier miniature effects and VFX creator. So, they took it from there. I provided the space and the structure, but it's Ian and Camille who run the show."

As noted, the two miniatures, the Corbelan and the probe, were built and then filmed for a number of shots, some of them against LED walls for light interaction. "Those two models were scanned and used as the basis for the digital models that ILM created," describes Gillis. "I think that ILM did a spectacular job with them. They're very tactile looking. Of course, the foundation of them are the actual miniatures, which were so great. I liked the approach, where we started with something practical."

(Instagram embed: a post shared by PRO MACHINA, @promachina)

"In fact, I think we need more of the hand-off happening with models because they still have a tremendous amount to offer. Our practical work is a 120-year-old craft, and it is real. So, let's use it where we can. Let's use the right tool for the right moment. I just hope the fans appreciate the hand-off, because I don't want to diminish anyone's art, I want to enhance. That's what the goal always is."

Go further into the film in the print magazine.
  • BEFORESANDAFTERS.COM
Roadtesting Rokoko's Smartgloves and Coil Pro
Matt Estela fires up this motion capture kit from Rokoko for a test run of their Smartgloves and Coil Pro.

You may know Matt Estela from his Houdini activities, or his incredible CG resource CGWiki. Matt is currently a Senior Houdini Artist at Google and previously worked in VFX and animation at Animal Logic, Dr. D Studios and Framestore CFC. Matt likes tinkering with new tech, so I asked him to road test Rokoko's Smartgloves and Coil Pro, two motion capture offerings from the company. Here's how Matt broke down the tools. (And, yes, befores & afters ON THE BOX series is back!)

TL;DR: it's good

It captures fingers nicely, position tracking with the Coil Pro works great, calibration is fast, the capture software is easy to use and exports very clean FBXs. It's a little pricey as a full package, but worth it all things considered, and Rokoko support is great.

My Background

While I'm known as a minor Houdini celebrity in very small social circles, I actually started in 3D as a character animator in 1999/2000. It took about a year to realize I didn't have the patience or dedication for it, and I moved into more technical roles, but 24 years later I still love and appreciate quality animation. My move into tech plus my background as a failed animator meant that when Ian offered these gloves to review, I jumped at the chance.

Tech Background

Broadly, mocap tech falls into several categories:
• Dedicated optical
• Dedicated IMU
• Machine learning
• Adaptation of smartphones and VR headsets

At its core, mocap needs to know where a joint is in 3D space. Optical uses multiple cameras to identify bright dots on a suit, triangulates where those dots are based on all those cameras, and then calculates an absolute position for each dot. While optical is very accurate, it is also very expensive; these systems require special high speed cameras, ideally as many as possible, with associated dedicated infrared lighting, which all need to be carefully calibrated in a dedicated performance space.

IMU (Inertial Measurement Unit) systems like Rokoko's don't directly solve the absolute position of joints, but calculate it from acceleration. Cast your mind back to high school physics, and remember how position, velocity and acceleration are linked: velocity is a change in position over time, and acceleration is a change in velocity over time. IMU sensors measure acceleration, and you can run those high school equations in reverse; use acceleration to get velocity, use velocity to get position. Because IMUs are self-contained they don't require cameras, meaning they don't suffer from the occlusion issues of optical systems. While IMU systems are not as accurate as optical, they are substantially cheaper.

Machine learning has been a recent addition to the space, where models guess the pose of a human based on training data. They produce adequate results for real time use, but achieving the quality required for film and games requires offline processing in the cloud, which can be a concern for some.

The final category is adapting smartphones and VR headsets. Both have cameras and IMU sensors on board, and both increasingly feature on-board machine learning for hand tracking and pose estimation. Quality is variable, and capture is limited to motions that can be comfortably done while holding a phone or wearing a headset.
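To make that double integration concrete, here is a toy sketch in Python. This is not Rokoko's solver; the sample rate, bias and noise figures are illustrative assumptions. It shows how a tiny accelerometer bias, integrated twice, grows into exactly the kind of positional drift described in the next section.

```python
import random

# Toy IMU dead reckoning: recover position by integrating acceleration twice.
# A small constant bias (0.01 m/s^2, an arbitrary illustrative value) grows
# quadratically into position error, which is why raw IMU tracking drifts.
DT = 1.0 / 100.0   # 100 Hz sample rate (assumed)
BIAS = 0.01        # constant sensor bias, m/s^2 (assumed)
NOISE = 0.05       # random sensor noise amplitude, m/s^2 (assumed)

velocity = 0.0     # m/s; the hand starts at rest
position = 0.0     # m; the true position never changes in this test

for step in range(100 * 10):  # simulate 10 seconds of a perfectly still hand
    true_accel = 0.0          # the hand never actually moves
    measured = true_accel + BIAS + random.uniform(-NOISE, NOISE)
    velocity += measured * DT     # acceleration -> velocity
    position += velocity * DT     # velocity -> position

# After 10 s, a "still" hand has apparently moved roughly
# 0.5 * BIAS * t^2 = 0.5 m, purely from integration error.
print(f"apparent drift after 10s: {position:.3f} m")
```

That quadratic error growth is why, in the tests below, clapping hands end up intersecting, why the Smartsuit leans on biomechanical constraints to compensate, and why the Coil Pro adds an absolute positional reference on top of the IMUs.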
Smartgloves

In 2020 Rokoko launched the Smartgloves, one of the first commercial systems to offer hand tracking at a reasonable price point without requiring the skills of a dedicated motion capture facility. They also offered the ability to integrate with the Smartsuit to provide an all-in-one solution for body and hand mocap.

I had the chance to test these gloves shortly after launch. My experience with mocap at that point was a handful of optical systems for some university research projects, and dabbling with some smartphone systems for facial capture and early VR apps for hand and head position capture. This put me in an interesting space; I hadn't tried any IMU systems, and so was judging the gloves based on experience with the optical body capture and VR hand capture systems mentioned above.

I tested them for a couple of weeks, and my initial verdict was "oh, they're ok I guess". The gloves did exactly what they were designed to do, capture fingers, but as someone who talks with their hands a lot, my expectation was that the gloves would capture full hand gestures, which, if you think about it, means understanding what the wrists and elbows are doing for full Italian-style gesticulation silliness. Further, because I was only wearing the gloves (and clothes, c'mon), it was natural to try and focus on hand-centric animation: clapping, typing, steepling fingers etc. Again, the gloves in their original format aren't really meant to do this. Think about the limitation of IMU: there's no way to know where the hands are relative to each other, and the sensors can't detect if you're really perfectly still or moving veerrryyy slowly at a constant velocity.

This all manifests as drift. Do a clap, for example, and very quickly the hands end up intersecting each other. Hands on a desk will slide away, the overall body pose starts to rotate, and so on. If your needs are broad body gestures, maybe this is fine, especially for Vtubers and similar fields where high accuracy isn't an issue. At its core, IMU on its own is incapable of the accuracy needed to capture hand gestures. Again back to high school physics: that process of acceleration -> velocity -> position is affected by sensor accuracy and the limits of real time calculation. The numbers aren't perfect, the sensors aren't perfect, meaning results drift. There are ways to compensate for this; for example, the Smartsuit understands the biomechanics of a human skeleton to make educated guesses of where the feet should be and how knees should bend, and if paired with the gloves, this can drastically improve the quality of the hand tracking. But without the suit, and without other sensor data, two-handed gestures would always be difficult. Rokoko themselves of course know about the limitations of IMU, and had plans to augment this.

Coil Pro

Fast forward a few years, and Rokoko released the Coil Pro. This uses another technology, EMF (electromagnetic fields), in conjunction with IMU to calculate worldspace positions. It promised results like the worldspace positions of optical, without the occlusion issues of optical, and especially without the cost of optical. Rokoko mentioned this was coming soon back in 2020, time passed, I forgot. In 2024 they got in touch again; unsurprisingly, getting it to market took longer than expected. They asked if I'd be interested in trying again, and of course I was.

Setup

The Coil arrived, about the size of a small guitar practice amp. Special mention has to be made of the unboxing process, an amusing bit of showmanship. The install process was pretty straightforward: connect it via USB to your computer to register it, then disconnect it.
It doesn't require a permanent connection to your computer, only power (which is also delivered via USB), so it's easy to move to a convenient location.

A new pair of Smartgloves also arrived with the Coil; they needed to be registered and have firmware updates. This took longer than expected, mainly because I'm an idiot who didn't read the manual carefully enough; the gloves need to be updated one at a time. Special shout-out to Rokoko support, who were very helpful and logged in to my machine remotely to identify issues. Everyone at Rokoko pointed out I wasn't getting special treatment; this is the level of service they offer to all their customers.

Once the gloves were updated and registered, the final setup step was how you bind the gloves to your avatar within the Rokoko software. By default the gloves float free in virtual 3D space, which worked, but the results felt a little drifty and strange. Again my dumb fault for not reading the manual: support advised me to bind the gloves to a full body avatar, despite not having a full Smartsuit. Suddenly the result was a lot more accurate. My understanding is that when linked this way, the software can use biomechanics to make better estimates of the wrist and arm positions, leading to a much more accurate result.

In use

With everything set up, I was impressed at how invisible the process became. Previous mocap tests with optical and smartphone/VR headset systems constantly reminded me of their limitations; occlusion with optical will guess strangely, and ML systems will often make a plausible but incorrect guess of limb locations. With the Smartgloves and Coil, I never got these glitches; it just feels like an on-screen mirror of my actions. Calibration is very straightforward: hit a neutral pose, hold it for 3 seconds, done. Calibration for optical systems has taken a lot longer. Once calibrated, hit record, do a motion, then stop recording. You can review the action immediately, and re-record if you choose. Exporting to FBX is very easy, and the files loaded into Houdini without issues.
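As a rough illustration of that last step, here is a minimal sketch of pulling an exported take into a Houdini session via its Python API. The file path is a hypothetical example, and this assumes Houdini's documented hou.hipFile.importFBX call; it must be run from Houdini's Python shell rather than a plain interpreter.

```python
import hou  # Houdini's Python module; only available inside a Houdini session

# Hypothetical path to a take exported from Rokoko Studio.
fbx_path = "/path/to/takes/newsreader_take01.fbx"

# importFBX brings the FBX skeleton and its baked animation into the
# scene under /obj, ready for retargeting or rendering.
result = hou.hipFile.importFBX(fbx_path)
print("FBX import result:", result)
```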
Example: Testing a laptop

Many times I've had ideas for little animations, but I'd get 10% into the process, get bored, and stop. Similarly, I'd have ideas for stuff that I might film (I was one of the dweebs who got one of the first video-capable DSLRs thinking I'd be the next Gondry), but again the effort to finish something was just too much. Once the gloves were set up and I could see my avatar on screen, I started testing scenarios in real time: how cartoony could I move my arms, how did the body adjust based on the hand positions, how well was typing captured. Quickly I improvised a scenario where testing a laptop goes wrong, and the tester reacts and panics. I could let it record, try a few different takes, and play them back. It was the ideal blend of animation output, but with the spontaneity of improv and acting. The limitations of doing full body capture with only gloves led to some fun problem solving. How would the character enter and exit? I couldn't capture walking easily, but maybe they'd be on a segway? Again I could test quickly, export a clip, blend it and the previous take in Houdini, be inspired, and try something else. Here's the end result:

Example: Newsreader

A friend was curious about the gloves, so I asked what sort of motion they might record. He said a newsreader talking to camera, shuffling papers, that sort of thing. Out of curiosity I timed how long it took from getting the suggestion to having it play back in Houdini; it was 5 minutes. 5 minutes! Amazingly, 2 minutes of that time was waiting for a Rokoko software update. The result was perfectly usable, glitch free, no need for cleanup. That's pretty amazing. What surprised me with this test was how the Rokoko software animated the body, even though I was only recording motion via the Smartgloves. The software uses the hand data to estimate what the rest of the body is doing; it's possible to split off the hands in a DCC, but not only was the body estimation pretty good, it had way more character than I expected.

Comparing to alternatives

Full disclosure: as stated earlier, I'm not a mocap person, so what follows is the result of some quick Google/YouTube research. The main competitors appear to be Manus and Stretchsense. A key differentiating factor is that Manus and Stretchsense are designed to be used with another mocap system, while Rokoko are pushing a unified body+gloves package. As such this makes direct comparisons a little tricky. All three systems track fingers, but to get accurate hand positions where collisions matter, all need augmentation: Rokoko via the Coil, Manus and Stretchsense from an optical system like an OptiTrack. If the Manus or Stretchsense gloves are paired with an IMU system like Xsens, their ability to track claps and other two-handed gestures will be limited.

Cost is relevant here too: the Smartgloves and Coil combination is cheaper than either of the top-of-the-line options from Manus or Stretchsense, and those two options would still require another system to do accurate positional tracking. There are analogies to be made here to the Mac vs PC world; Rokoko are pushing a single Apple-style ecosystem, while the others are modular and designed to work with a few different systems.

Moving away from dedicated systems, there's the option of using a Quest headset to track hands. The Glycon app is cheap and cheerful, but suffers the same issues of occlusion; if the cameras on the headset can't see your fingers, it will guess what your fingers are doing, often incorrectly. The location of the cameras means your most neutral hands-by-sides idle pose is not tracked well. Further, while a mocap suit+gloves setup is designed to handle extreme motion, a VR headset is not, so you're limited to gestures you can do comfortably and safely while wearing a high tech shoebox on your face.

The final alternative is keyframing hand and finger animation manually. Hand animation is invisible when done well, but draws attention to itself when done poorly. Like faces, our brains are so tuned to the behaviour of hands that we spot bad hand animation immediately. To get hand animation comparable to the results I captured, even for relatively simple idle and keepalive animations, would take hours to keyframe. If you require lots of hand animation, that time (and artist cost) adds up quickly. As a very rough matrix of options, the full list looks like this:

Other thoughts

It was interesting chatting with Jakob Balslev, the CEO of Rokoko. It reminded me of the old adage about the difference between theory and practice: in theory there is no difference, but in practice there is. The basic theory of using IMU and EMF for motion capture makes sense, but the engineering work required to make it happen, to get to manufacture, to hit a price point, is huge. Hardware is way harder to develop than most of us take for granted. Jakob quipped, "we would probably have never started if we knew how hard it would be, but now we are glad we did!"
Comparing to alternatives

Full disclosure: as stated earlier, I'm not a mocap person, so what follows is the result of some quick Google/YouTube research.

The main competitors appear to be Manus and StretchSense. A key differentiating factor is that Manus and StretchSense are designed to be used with another mocap system, while Rokoko is pushing a unified body-plus-gloves package. As such, direct comparisons are a little tricky. All three systems track fingers, but to get accurate hand positions where collisions matter, all need augmentation: Rokoko via the Coil, Manus and StretchSense via an optical system like OptiTrack. If the Manus or StretchSense gloves are paired with an IMU system like Xsens, their ability to track claps and other two-handed gestures will be limited.

Cost is relevant here too: the Smartgloves and Coil combination is cheaper than either of the top-of-the-line options from Manus or StretchSense, and those two options would still require another system to do accurate positional tracking. There are analogies to be made here to the Mac vs PC world; Rokoko is pushing a single Apple-style ecosystem, while the others are modular and designed to work with a few different systems.

Moving away from dedicated systems, there's the option of using a Quest headset to track hands. The Glycon app is cheap and cheerful, but suffers the same issues of occlusion; if the cameras on the headset can't see your fingers, it will guess what your fingers are doing, often incorrectly. The location of the cameras means your most neutral hands-by-sides idle pose is not tracked well. Further, while a mocap suit-plus-gloves setup is designed to handle extreme motion, a VR headset is not, so you're limited to gestures you can do comfortably and safely while wearing a high-tech shoebox on your face.

The final alternative is keyframing hand and finger animation manually. Hand animation is invisible when done well, but draws attention to itself when done poorly. Like faces, our brains are so tuned to the behaviour of hands that we spot bad hand animation immediately. To get hand animation comparable to the results I captured, even for relatively simple idle and keepalive animations, would take hours to keyframe. If you require lots of hand animation, that time (and artist cost) adds up quickly.

As a very rough matrix of options, the comparison boils down to:

- Rokoko Smartgloves + Coil: unified ecosystem, accurate worldspace hand positions, cheapest of the dedicated options.
- Manus / StretchSense: modular top-end gloves, but accurate positions require pairing with an optical system like OptiTrack.
- Quest + Glycon: cheap and cheerful, limited by camera occlusion and by what you can safely do in a headset.
- Manual keyframing: no hardware cost, but hours of artist time for anything beyond the simplest gestures.

Other thoughts

It was interesting chatting with Jakob Balslev, the CEO of Rokoko. It reminded me of the old adage about the difference between theory and practice: in theory there is no difference, but in practice there is. The basic theory of using IMU and EMF for motion capture makes sense, but the engineering work required to make it happen, to get to manufacture, to hit a price point, is huge. Hardware is way harder to develop than most of us take for granted. Jakob quipped, "we would probably never have started on it if we knew how hard it would be, but now we are glad we did!"

It was also interesting to hear how lessons from each product informed the next, so the gloves are better than the suit, and the Coil is better than both. The tricky part is they're all meant to work together, an interesting balancing act. Rokoko definitely seem to love the work they do, and are constantly refining their products with both software and firmware updates.

Conclusion

As I said at the start, it's good. It solves many of the issues that exist with older or cheaper hand setups, while avoiding the cost of more advanced setups. I was impressed that while all mocap gloves are expected to track only fingers and maybe some wrist rotation, I was able to get some fun and plausible full-body mocap with very expressive arm animation. If your mocap needs are largely finger- and hand-based, and occlusion issues with AI or Quest setups have bothered you, the Smartgloves and Coil are an ideal solution.

Bonus: testing with the full suit at UTS ALA

I'd been chatting with friends at UTS ALA (the University of Technology Sydney Animal Logic Academy) who had the Smartsuit and Smartgloves. As far as we could tell, the studio space is made of magnets, iron filings and Van de Graaff machines; as a result, the system never worked as well as they hoped. Alex Weight, the creative lead at ALA, is a very experienced film character animator and director, and found that while the system might have been OK for, say, a VTuber, it wasn't at the level he needed for previs; hands would drift through each other too easily, and legs would kick out at strange angles, no matter how many calibration or Wi-Fi adjustments they made.

Rokoko were happy for me to pop down with the Coil and test. Given their previous results, the team at ALA were a little sceptical, but after the Coil was set up, the difference was astonishing. Practically no drift, and worldspace positions of the hands were remarkably clean. We got increasingly ambitious with tests: holding props, picking up a chair, leaning a hand on a desk, all worked exactly as expected. I know that Rokoko are working on a big firmware upgrade for the suit that will improve results with the Coil further still.

Do you have a product (hardware or software) that you'd like to see in befores & afters On The Box? Send us an email about it.

The post Roadtesting Rokoko's Smartgloves and Coil Pro appeared first on befores & afters.
iClone 8.5 Free Update: SIM Builder with Prop Interaction & Smart Accessory
Digital Twin and Crowd Simulation with Smart Environment Interaction.

The automation of animated crowds has been widely used in the movie and entertainment industries. As the demand for digital twins, machine learning, and AI training grows, a new industrial revolution is emerging, particularly in areas like autonomous driving, smart surveillance, factory automation, and intelligent consumer devices. These fields require sensors or cameras not only to see but also to understand the world around them, especially in recognizing human behaviors. To achieve this, there is a growing need for applications that can generate realistic scenarios for AI training purposes, taking into account various angles, lighting conditions, and occlusions. This development is bringing the automation of 3D crowds to a new stage, with a higher level of interaction and realism.

As a leader in 3D character animation, Reallusion has combined its expertise in character generation and automated animation, transforming iClone into a simulation platform. With the latest release of iClone 8.5, users can rapidly build live environments, from crowd simulation to world interaction, with the ability to automatically load and manipulate accessories.

iClone 8.5 New Release

iClone 8.5 introduces two key innovations: World Interaction and the Smart Accessory system. Building on core features like Motion Director and Crowd Sim, these enhancements empower the creation of dynamic environments. In these interactive spaces, 3D characters can explore, operate props, and seamlessly load and manipulate accessories through interactive triggers or motion files.

WORLD INTERACTION: Engaging Environments with Interactive MD Props and Intuitive Controls

MD Props present advanced features designed to complement iClone Crowd SIM, marking a significant leap forward in the ways 3D actors interact with their virtual environments.

Give Life to Props

MD Prop Tools give iClone characters the ability to interact with 3D environments. By replacing the proxy components with custom 3D models, MD props manifest into visceral objects that 3D actors engage with. All manner of interactive behavior can be generated from just five templates representing the core of MD TOOLS.

Intuitive Radial Menu

MD Prop enhances the user experience through its intuitive Radial Menu system, facilitating easy access to customizable, multi-level command groups and assignable hotkeys. Additionally, MD Prop introduces Self Interaction, enabling 3D actors to perform gender-specific and profile-adaptive actions such as using a phone or smoking, enhancing realism.

Action List & Concurrent Behaviors

Action Lists support the chaining of multiple motions to extend action sequences. When the scene simulation is paused, individual Action Lists can be assigned to every character in the scene. Upon playback, all the appointed characters will move simultaneously according to their own set of instructions.
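As a toy illustration of that idea (plain Python, not iClone's actual API), an Action List amounts to a per-character queue of chained motions that all play out concurrently:

```python
from dataclasses import dataclass, field

@dataclass
class ActionList:
    """A per-character queue of chained motions."""
    motions: list = field(default_factory=list)

    def chain(self, motion):
        self.motions.append(motion)
        return self  # allow fluent chaining

# Assign an individual Action List to every character while "paused".
actors = {
    "busker": ActionList().chain("walk_to_spot").chain("play_guitar"),
    "passerby": ActionList().chain("sit_on_bench").chain("use_phone"),
}

# On "playback", all appointed characters move simultaneously, each
# following its own instructions.
max_len = max(len(a.motions) for a in actors.values())
for step in range(max_len):
    for name, action_list in actors.items():
        if step < len(action_list.motions):
            print(f"step {step}: {name} -> {action_list.motions[step]}")
```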
Prop Customization

Customizing an MD template tool is straightforward: simply replace the proxy objects with a 3D model of your choice. Once the prop's appearance is finalized, you can assign new animations or adjust the actor's associative movements with minimal tweaks. This includes repositioning reach-target positions to account for hand placements and varying arm lengths, as well as adjusting the look-at point to focus the character on specific features of the prop.

Object Animation

MD Prop offers robust tools for animating props themselves. Creators can freely move, rotate, and scale any 3D item, with additional options for object morphing and particle effects. The Morph Animator in iClone allows for intricate morph animations, while PopcornFX enables particle effects for added realism. Additionally, texture UVs can be adjusted and scrolled to create dynamic visuals, and video textures can be used to mimic interactive slideshows or film sequences.

Reactive Crowd Behavior

Beyond just triggering interactive behavior from individual actors, the MD Prop system enables autonomous crowd behavior. Crowd characters no longer roam aimlessly; they can now gravitate toward points of interest and even explore various props around the scene. Groups or individuals can find a spot to sit, visit the vending machine, or gather around a busker show, unleashing endless possibilities for dynamic crowd interactions.

Smooth Motion Transitions

iClone's Turn to Stop and Multiple Entry functionality simulates natural human behavior in approaching and engaging with a target prop. This includes slowing to a stop, making a turn, and approaching with measured pace, distance, and orientation. MD Props take care of the rest with smooth and natural interactive motions that are tailored to the gender and characteristics of the interfacing actor.

Multi-Platform Export & Render

Exporting these creations is seamless, with iClone Auto Setup plugins available for 3ds Max, Maya, Blender, Unreal Engine, and Unity. Live Link support also allows for synchronization with Unreal Engine and NVIDIA Omniverse, providing a two-way workflow that's essential for modern animation and game development pipelines.

Learn more about World Interaction

SMART ACCESSORY: Easy Motion Editing and Automatic Accessory Attachment

Smart Accessory is a cutting-edge system that dramatically streamlines the process of editing motions and integrating accessories in animation projects.

MotionPlus Pairing with Dynamic Accessories

The iClone MotionPlus format seamlessly integrates facial performance and accessory metadata for complex animations like cycling and skateboarding, where precision in motion and accessory alignment is essential. MotionPlus automates accessory attachment, randomizes models and materials, and ensures perfect synchronization between character movement and accessory interaction.

Creating Smart Accessories

The Smart Accessory system offers robust features for creating and customizing accessories. With Motion-Accessory Pairing, animators can assign multiple accessories to specific motions, restoring default accessories or assigning new ones as needed. This flexibility allows for highly personalized animations, where characters can interact with various accessories in a lifelike manner.

Creating Animated Accessories

The ability to synchronize human motion with accessory animations adds a new layer of realism to animated projects. The Smart Accessory system simplifies this process, allowing for seamless data handling and enhanced visual fidelity.

Flexible Motion Controls

One of the most powerful features of the Smart Accessory system is its support for a wide range of motion controls, which are essential for crowd simulation and interactive animations made with Motion Director.

Learn more about Smart Accessory
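To make the Motion-Accessory Pairing idea described above concrete, here is a toy model (again plain Python, not Reallusion's MotionPlus format) of a motion carrying accessory metadata, with model variants randomized on each application:

```python
import random

# Toy stand-in for MotionPlus-style metadata: each motion names the
# accessory variants it expects to have attached.
MOTION_PAIRINGS = {
    "ride_bicycle": ["bicycle_red", "bicycle_blue", "bicycle_rusty"],
    "skateboard_push": ["skateboard_worn", "skateboard_new"],
}

def apply_motion(character, motion, rng=random):
    """Attach a randomized accessory variant alongside the motion."""
    accessory = rng.choice(MOTION_PAIRINGS[motion])
    return f"{character}: '{motion}' with '{accessory}' attached"

print(apply_motion("crowd_actor_07", "ride_bicycle"))
print(apply_motion("crowd_actor_12", "skateboard_push"))
```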
ACTORCORE Library of 3D Assets

Discover a vast collection of high-quality 3D content in ActorCore. From mocap animations and hand-keyed motion to fully rigged characters, accessories, and props: everything you need to enhance your scenes is in store. Why wait? Take your 3D simulations to the next level with this extensive content library today.

MD Prop Expansion Packs: Indoor & Outdoor Interaction

The STAYING AT HOME and DOWN THE STREET expansion packs provide numerous ways to showcase the versatility of MD Props. Each prop includes both male and female animation sets, offering unique gender-specific performances. Easily swap the MD Prop placeholders with your custom models or adjust the animations to fit different interactive scenarios.

The iClone 8.5 Grand Release is a free update for all iClone 8 owners! New users can download the Free Trial to experience advanced virtual simulation with intuitive character controls, motion editing, and Smart Accessories.

Brought to you by Reallusion: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.

The post iClone 8.5 Free Update: SIM Builder with Prop Interaction & Smart Accessory appeared first on befores & afters.
    Starting a startup in the world of VFX training
Urban Bradesko, founder of DoubleJump Academy, discusses launching a VFX training company and what it took to get there.

Today on the befores & afters podcast we're chatting to Urban Bradesko, CEO and founder of DoubleJump Academy. DoubleJump is a VFX and CG learning platform that leans heavily into Houdini and Unreal Engine.

I was interested in talking to Urban about the Academy: not necessarily about specific courses, but about the challenges in starting up a VFX training company. So, we talk about getting started, why he thinks DoubleJump is different from what's already there, what's tricky in the current VFX climate, crafting a community in this industry and, yes, how to deal with AI.

The post Starting a startup in the world of VFX training appeared first on befores & afters.
Watch Barnstorm's VFX breakdown for Deadpool & Wolverine
The post Watch Barnstorm's VFX breakdown for Deadpool & Wolverine appeared first on befores & afters.
Why don't we just put a facehugger on top of a radio-controlled car and drive it around?
The wide variety of practical facehuggers made by Wētā Workshop for Alien: Romulus. An excerpt from befores & afters magazine in print.

The central poster for Alien: Romulus features one of the human characters being dramatically hugged by a facehugger, the film franchise's parasitoid, multi-limbed alien creature. It certainly gave a clue as to what audiences could expect in the Fede Álvarez movie. What the audience would eventually see was a whole host of facehuggers menacing visitors to the abandoned Weyland-Yutani research station, the Renaissance.

The on-set facehuggers (of which there were several varieties, including ones that scamper along floors and walls, others that could enter a human host via the mouth and a long proboscis, and even ones that were remote-controlled cars) were realized by Wētā Workshop for the film, under creative lead Richard Taylor. Wētā Workshop also built the F44AA pulse rifle used by Rain (Cailee Spaeny) in the film. Here, Rob Gillies, general manager of manufacture, and Cameron May, a supervisor of robotics and animatronics, explore with befores & afters how Wētā Workshop brought the practical facehuggers to life, and manufactured the pulse rifle.

issue #22 Alien: Romulus

The facehugger build

Drawing upon concept designs for the facehuggers established by production, Wētā Workshop set about a build methodology for the many varieties needed for the film. "What was apparent on Romulus was that the facehuggers are all through it," declares Gillies. "We ended up delivering 73 facehuggers to the show, which is an incredible number. They ranged from animatronic facehuggers to what we call 'comfort huggers' that you'd wear on your face with breathing mechanisms, to static prop facehuggers. There were all these different variations and iterations that were called out in the script that we then developed a build list of. From there we could design and build these creatures specifically to the needs of the show. A lot of these breakdowns of the specific gags and builds were masterminded by Joe Dunckley, one of our manufacturing art directors."

The build would be driven primarily around both practical and aesthetic considerations, as May points out. "We paid specific attention, for example, around what the knuckle joints look like. How were they going to be big enough so that we could actually practically make these things work? How is the skin going to interact with the mechanisms underneath? We were actively thinking about those things and as we were trying to refine the design aesthetic around it, we were trying to already formulate a plan for how we were going to build these things and turn them into practical puppets so we didn't back ourselves into a corner."

3D printing and the generation of mass molds for large-scale casting reproductions allowed Wētā Workshop to produce so many of the critters. "To get the product out wasn't the heart of the challenge," notes Gillies. "For us, it was ensuring that the facehuggers actually looked lifelike and could actually wrap around someone's head or breathe with the performer's body. That was actually the true tricky part."

To ensure that occurred, it was vital for Wētā Workshop to break down the specific gags that the facehuggers would be required to perform. "We visually broke those down and mapped those out," outlines May. "We said, 'Right, that's going to have a movable joint over here and this is going to have this type of control. And we're going to have rods that are going to go on here.
Or, this is going to have this type of digital mold that we're going to use to create a silicone cast from.' Even though there's such a complex array of them, we were able to break those things down so we had a nice structure in terms of how we were going to approach them. That ended up working really well."

In general, the facehuggers were crafted with an aluminum interior armature, 3D-printed nylon joints and silicone skin, with different additional materials used depending on whether the creatures were animatronic or more static. "Even though the facehuggers' movements are quite different, a lot of their end joints were identical just to keep a design language that was quite consistent amongst them," says May.

Go much further in-depth on the facehuggers in the print magazine article.

The post Why don't we just put a facehugger on top of a radio-controlled car and drive it around? appeared first on befores & afters.