Befores & Afters
A brand new visual effects and animation publication from Ian Failes.
Recent Updates
  • On The Set Pic: Deadpool & Wolverine
    beforesandafters.com
    This new pic comes from an upcoming book from Marvel Studios showcasing images from the set.
  • Behind the titles for Severance s2
    beforesandafters.com
    Oliver Latta from extraweg.studio has posted this breakdown. You can also see lots of behind the scenes at Behance.
  • Watch CGEV's VFX breakdown for The Substance
    beforesandafters.com
    A whole range of invisible effects work, make-up effects enhancements, and more. Note: this breakdown contains nudity.
  • On The Set Pic: Avatar: Fire and Ash
    beforesandafters.com
    Go behind the scenes. (L-R) Stephen Lang and director James Cameron on the set of 20th Century Studios' AVATAR: FIRE AND ASH. Photo by Mark Fellman. © 2024 20th Century Studios. All Rights Reserved.
  • Behind the scenes of that crazy plane fight, plane crash and parachute jump in Back in Action
    beforesandafters.com
    "I'm going to be merciless when it comes to the CG parachute."

    Seth Gordon's Back in Action contains a dramatic aerial sequence featuring a plane crash in the mountains, an avalanche and a thrilling parachute jump, all within a matter of minutes. It occurs as CIA agents Matt (Jamie Foxx) and Emily (Cameron Diaz) are ambushed on the plane for a key device they hold, and must fight off their attackers and then escape the crashing aircraft.

    Interestingly, the initial version of the sequence was to take place on a train. Locations were scouted in London and previs was produced for that version of the scene. However, when another Netflix film also featured a train action moment, it was decided to switch to the plane approach.

    "We previs'd the plane shots with MPC's visualization team," outlines production visual effects supervisor Erik Nash. "We didn't previs all the interior action because that was done as stuntvis by the stunt team. The previs and stuntvis then helped us work out how to shoot everything."

    Behind the scenes on the set of Back In Action. Cr. John Wilson/Netflix © 2024.

    Nash narrowed in, at first, on the parachute jump side of the sequence, since the visual effects supervisor is himself a trained parachutist (he had previously lent his expertise in that area at Digital Domain on the skydiving scene in Iron Man 3). "I got to talking to second unit director J.J. Perry, who is former airborne military, and he said, 'You know what? You do this. This is right up your alley. You worry about the parachute part of it.'"

    Looking to film as much of that parachute section practically as possible, Nash worked out that the unusual orientation of the jump could be done; in the film, it is effectively a tandem jump where Matt and Emily are facing each other holding on, rather than one behind the other. "Having seen it done a few times and knowing you can do it face-to-face, I knew we could stage it so they would survive," relates Nash.

    The next step was to establish where to film a jump. It needed to be a snowy, Alps-type environment. Production secured permissions to film in Slovenia, which featured the desired mountainous location. "We found a small town called Bovec that is a ski resort but also has a grass airstrip where they do skydiving from in summer," says Nash. "There was also this grass landing area which was a meadow with this very steep granite mountain face that had snow all over it. It was perfect."

    BTS: (L to R) Jamie Foxx as Matt and Cameron Diaz as Emily on the set of Back In Action. Cr. John Wilson/Netflix © 2024.

    Parachutist Dave Emerson cast two stunt performers for the jump, Yolanda Lee and Christian Botakwame. Nash was then part of the helicopter shoot over Bovec to film the jump. "We had a Shotover helicopter camera rig with a long lens on it. We had an even longer lens on a ground-based camera, and we had a second parachutist with a helmet-mounted camera who jumped with our stunt players. And then we had a drone camera, too. We did two jumps in one afternoon and got tons of amazing footage, and it worked better than I ever could have dreamed. No visual effects were applied to any of that footage other than the first shot, where we tie in the avalanche and the explosion from where the jet cratered in. Ironically, I get a kick out of doing stuff that doesn't involve visual effects every now and then."

    Close-ups of Foxx's and Diaz's characters did involve two bluescreen inserts. "I thank Seth in my head for not playing a bunch of close-ups, because they're often really hard to do," shares Nash. "I made sure that we could shoot these outside. I did not want to fake daylight on a sound stage."

    Similarly, Nash was adamant that the practical parachute seen in these close-ups also not appear fake. "I've seen a lot of these types of bluescreen parachuting shots just not work out. So, I had pitched an idea of doing it rigged off a flatbed truck so that they're actually moving through space, not hanging static in place. However, it was logistically and budgetarily prohibitive. We ended up doing it the old-fashioned way by hitting them with a big rush of air. There were only two of those shots, and I thought they turned out pretty well."

    Part of the crash, from the film's trailer.

    "Thankfully they're short shots," continues Nash. "It's really tough to fake all of the complex dynamics of that kind of shot in terms of, okay, what is the camera platform that's photographing these? So, we have to imagine that they're traveling through space at 30 miles an hour. If the camera's in any proximity, then really you're implying that it's another parachutist that shot it. I think sometimes when these shots fail, it's because the camera is doing something it can't do. But to really shoot these close-ups, there's not a lot you can do if you had to shoot them for real."

    The actual deployment of the parachute was achieved digitally. "I had a heart-to-heart with Malte Sarnes, MPC's visual effects supervisor for this sequence, and I told him upfront, 'We have to do a CG parachute for the deployment and for the landing, and it has to be great.' This was because we couldn't shoot these parts with a real parachute. I said, 'I'm going to be merciless when it comes to the CG parachute. I'm going to be a hard-ass.' I've seen so many unconvincing to downright bad CG parachutes over the years, and it just bugs me to no end."

    "I think one of the places where CG parachutes often break down is that there's not enough transmitted light," Nash adds. "It's all reflected. If you hold the fabric up to a bright sky, you can almost see through it, and there's multiple layers. So you've got the upper layer affecting the light hitting the lower layer, but they're both semi-translucent."

    (L to R) Cameron Diaz as Emily and Jamie Foxx as Matt in Back In Action. Cr. Courtesy of Netflix © 2024.

    Nash arranged for the props department to send the practical parachute to the assets team at MPC in London for them to examine. "I said, 'Take it outside, have all your asset guys feel the fabric, hold the fabric up to the light, see what it's like.' We also had about 20 minutes of actual parachute jump footage from flying around in that valley. I said, 'Here, this is all the reference you could ever ask for.'"

    What's more, Nash arranged for free-fall camera operator Andy Ford to deliver some parachute deployment footage of unfolding and inflating for further reference. "What he did was take his helmet-mounted camera, which normally faces forward, and put it on backwards, so that when he jumped out of the plane and opened the chute, the camera was pointed in the direction of the parachute unfurling. He did several of those and then gave all of that to MPC. It let the team see all the cloth dynamics, which are incredibly noisy and erratic and random."

    One thing to note, says Nash, is that free-fall cameramen typically pack their parachutes to open slowly, because it is easier on the neck with all that camera weight on the head. "The issue was, while we got great reference, it took way too long for the parachute to inflate in terms of what we needed to see in our scene. They pull the chute and get yanked out of the plane, so it had to happen really quickly. So, I did a re-timed, cut-down version of the deployment, and it all paid off because I think MPC nailed it."

    Fight on a plane

    Prior to the parachute jump, and the actual crash, a fight ensues on the plane, pitting Matt and Emily against a rogue crew. The fight even continues once the plane hits the side of the mountain and ends up sliding down the snowy slope. Interior scenes were filmed on a large gimbal that held a plane mock-up. "It could roll 360 degrees," advises Nash. "The original sequence as shot was actually quite a bit longer. There was a whole stretch in the middle where the plane was rolling down the mountain like a pencil on a sloped table. In the finished sequence, the plane rolls up on its side and then rolls back."

    (L to R) Jamie Foxx as Matt and Cameron Diaz as Emily in Back In Action. Cr. Courtesy of Netflix © 2024.

    For background environments, production was able to film mountains, skies and clouds from a helicopter, as well as with a drone. Says Nash: "We went up the ski lift to the top of the mountain, had one of those snow cats that could get us a certain distance away from all of the ski gear, and then we had a drone up there that we shot plates with. We weren't able to utilize as much of that plate photography as I had hoped, partly because the lighting conditions were constantly changing on the top of this mountain."

    Ultimately, the bulk of the exterior shots, including the avalanche, were realized as fully CG environments by MPC. "We used some of the plate photography to build some of these environments," describes Nash. "We'd pick some of the plate photography, which maybe didn't do what it needed to do in terms of camera moves, and then we said, 'This is the lighting condition that we want to build into all our CG environments.' We did use plate photography for all the air-to-air shots preceding impact with the mountain. I think there were four or five of those, including a sunset shot that Seth absolutely loved. And I'm like, 'Oh, but there's no sunset in any of the other shots?' I think you get away with it and it's a gorgeous shot."

    Behind the scenes on the set of Back In Action. Cr. John Wilson/Netflix © 2024.

    The jet itself was CG and had to match a practical jet that was filmed for a hangar scene of the characters boarding the aircraft. In that hangar, however, the jet was plain white. When it was modelled and textured and placed into the white-ish mountainous environments, the plane was deemed too nondescript. "So," recounts Nash, "I did a quick little Google search and looked for a simple two-color stripe scheme that we then put on our CG jet, which meant we then had to add it to the practical jet in the hangar in a handful of shots for continuity."

    As the plane slides down the mountain, a further consideration became the amount of damage to represent on the digital fuselage. "We wanted to imply that there was a lot of scraping and denting going on outside the plane while we're inside covering the fight," states Nash. "So, we went to 11 on the plane-damage dial and had to track the progression of it all. What helped was that I had done a CG Air Force One for Iron Man 3, and in reality that is the cleanest plane ever! Which is hard to do convincingly in CG. It does not look real. So even there we had to add dirt and oil streaks on the wings."

    BTS: Jamie Foxx as Matt on the set of Back In Action. Cr. Parrish Lewis/Netflix © 2024.

    The inside of the plane, as it crashed, became a messier and messier environment of ice, snow and debris. "All of it was simulated," notes Nash. "We did some early tests with the special effects team but found it was really hard to control and get enough airflow through the airplane set that the stuff would enter the gash and travel all the way out the far end and not settle somewhere in the middle. So it became all-CG. That was another thing where we did a first pass and the director said, 'More.' Okay, second pass. 'More.' There were some things like napkins and paper that were practical. Ironically, the thing that we kept seeing over and over were these red napkins, and we wound up painting most of them out because they were so distracting!"
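    Nash's point about transmitted light can be made concrete with a small radiometric sketch: a canopy is two stacked layers of semi-translucent fabric, so a believable render needs the light that passes through, not just the light that bounces off. The Python below is a minimal illustration of that reasoning, not MPC's shader; the per-layer reflectance and transmittance values are hypothetical placeholders, not measurements of the prop chute.

```python
# A minimal sketch of why a double-layer canopy reads as partly see-through:
# each layer both reflects and transmits, and the two layers inter-reflect.
# Summing the infinite bounce series gives closed-form totals.

def two_layer_canopy(R: float, T: float) -> tuple[float, float]:
    """Total reflectance/transmittance of two identical thin layers.

    A single layer reflects R and transmits T (absorbing 1 - R - T).
    Light bouncing between the two layers forms the geometric series
    1 + R^2 + R^4 + ... = 1 / (1 - R^2), which folds the bounces in.
    """
    denom = 1.0 - R * R
    T_total = T * T / denom              # light that makes it through both layers
    R_total = R + (T * T * R) / denom    # front reflection plus re-emerging light
    return R_total, T_total

# Hypothetical ripstop-nylon-like values: quite reflective, yet translucent.
R, T = 0.45, 0.35
R2, T2 = two_layer_canopy(R, T)
print(f"two layers: reflect {R2:.2f}, transmit {T2:.2f}")
# A shader that only models reflection (T = 0) loses the transmitted
# fraction, which is exactly the sky glow Nash says sells a real canopy.
```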
  • Breaking Hardware Limits: A Render Farm Solution for Redshift Users
    beforesandafters.com
    It always starts with a deadline. Eye-catching VFX shots, lifelike animation, breathtaking architectural visualizations that need to be rendered at the highest quality, but time is running out. You push your local workstation to the limit, listening to the fans whir at full speed as your GPU strains to keep up. Hours pass, and you're still waiting. Then, a crash. Your machine just can't take it anymore.

    This is the reality that many 3D artists face daily. The demand for high-quality, photorealistic rendering has never been greater, but local machines often fall short of the power required. That is where online render farms like iRender come in: a solution that eliminates the barriers of hardware limitations, giving artists the power they need without the burden of costly upgrades.

    Rendering a C4D scene with Redshift on iRender's 6 x RTX 4090 server.

    The Struggles of Local Rendering

    Rendering locally is a battle against time and hardware.

    • The Hardware Bottleneck: It doesn't matter if you have a high-end GPU or a workstation with maxed-out RAM; complex 3D scenes will eventually push your hardware to its limits. High-resolution textures, volumetric lighting, and intricate simulations require massive processing power, and even the most powerful local machines can slow to a crawl.

    • The Never-Ending Upgrade Cycle: Just when you think you've built the perfect workstation, software advancements and hardware improvements make it obsolete. Newer versions of Redshift demand more VRAM, more processing power, and faster GPUs, leaving artists in a constant cycle of expensive upgrades.

    • The Cost of Time: Time is money, and nowhere is this more true than in the 3D industry. Rendering on a local machine means waiting, sometimes for hours or even days, to complete a single animation or still frame. Missed deadlines, slow iteration cycles, and frustration become the norm rather than the exception.

    • Heat, Wear, and Power Consumption: Prolonged rendering sessions put enormous stress on GPUs, leading to overheating, component wear and tear, and high electricity costs. A workstation running at full power overnight can significantly spike your energy bills, which makes local rendering not just time-consuming but expensive.

    These are the challenges that cloud render farms like iRender are solving. They eliminate hardware bottlenecks, reduce rendering times, and give artists access to enterprise-level computing power without the enterprise-level cost.

    iRender: A New Era of Cloud Rendering for Redshift

    iRender isn't just an ordinary render farm; it's an Infrastructure-as-a-Service (IaaS) platform that gives artists full control over dedicated high-end GPU servers. Unlike traditional render farms, iRender allows you to interact with your projects in real time, just as if you were working on your own machine, but significantly faster.

    Key Benefits of iRender for Redshift Users

    • High-Speed Rendering with High-End GPUs: With access to up to 8 x RTX 4090 or RTX 3090 GPUs, iRender delivers lightning-fast rendering times, letting artists go from hours to minutes without compromising quality. Whether it's an animation sequence or a high-resolution product visualization, speed is no longer a barrier.

    • Flexible, Cost-Effective Pricing: No more spending thousands on hardware upgrades. iRender operates on a pay-as-you-go model, ensuring that you only pay for what you use. For long-term projects, rental plans offer even better pricing and custom hardware configurations tailored to your needs.

    • Scalability Without Limits: Need more power? iRender's scalable infrastructure means you can instantly upgrade from a single-GPU setup to an 8 x RTX 4090 / RTX 3090 powerhouse, ensuring that you never run into hardware limitations, no matter the project size.

    • Full Creative Control Over Your Workflow: Unlike traditional render farms that process jobs in a queue, iRender gives artists full remote desktop access to their dedicated GPU servers. This means you can pause and resume renders at any time, make last-minute adjustments without starting over, install any Redshift version or software (Cinema 4D, Blender, Maya, Houdini, etc.), and work as if you were on your own local machine, just with far more power.

    • Reliability & 24/7 Professional Support: Rendering failures can be a nightmare. With iRender's optimized high-performance servers, crashes are minimized, and artists can count on stable, uninterrupted rendering sessions. Plus, 24/7 expert support ensures that help is always available when you need it.

    What's Coming in 2025?

    In short, iRender will give you the competitive edge you need in 2025:

    • Best performance-to-price ratio for Redshift users
    • Full control over powerful GPU servers (no queue waiting)
    • Perfect for both freelancers and large studios
    • Seamless remote access & real-time rendering

    The industry is moving faster than ever, and 2025 is going to be a game-changing year for all. iRender continues to lead the way in innovation, aiming to be first to provide cutting-edge solutions that make rendering faster, easier, and more affordable. One of its recent innovations is the Staking (Stake to Earn) feature, which lets artists stake unused iRender Points and earn bonuses.

    With the latest NVIDIA RTX 5090 launch, can you guess what exciting upgrades are coming up next at iRender? And here's a little secret: iRender is also expanding with a brand-new data center in South Korea. Stay tuned!

    Render Redshift with iRender today and never let hardware limitations hold you back again!

    Brought to you by iRender: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
  • CGA Belgrade speaker preview: Hristo Velev
    beforesandafters.com
    The founder of Bottleship VFX will be speaking on his studio, developments in virtual production and AI agent tech.

    CGA Belgrade 2025 is coming up on 10 and 11 April; get your tickets here! In this special preview interview, befores & afters talks to one of the speakers from the conference, Hristo Velev, about his session at the event.

    Velev is a founding partner at Bottleship VFX, a boutique studio specializing in effects sims like water, destruction and fire. He previously worked at Pixomondo, Screen Scene, and Scanline VFX on films including Iron Man 3. Here he discusses the history of his own studio, getting into virtual production and AI, and what he'll present at CGA Belgrade.

    b&a: Tell me a little about the history of Bottleship VFX and some of the main recent projects the studio has delivered?

    Hristo Velev: We started in 2013 as a simulation-specialized VFX house. We worked on a globally diverse feature film slate over the years, added creature and environment capacity to our portfolio, and organically grew to about 20 artists. It's a very technically minded team, squeezing out productivity by developing in-house tools that automate as much as possible. Lately, with the industry slump post-COVID, we added software development and virtual production to our offerings.

    Comandante.

    b&a: Yes, you've dived into virtual production in recent times. What kinds of things have you been doing and looking at in this area?

    Hristo Velev: We started a few years ago on an Italian feature called Comandante, led by Kevin Haug and Dave Stump. They had the idea of doing virtual production without greenscreens and LED stages, by tracking the camera in real time, rendering the virtual set in Unreal, using AI to roto out the foreground, and compositing for the director to review at the end of the day, then moving quickly to post to present a complete version the next day.

    We joined as the post house on set, but our responsibilities grew until we were contributing at most points in the workflow. It was a great start, and we delivered 36 shots while still on set, but that was just the early days. In 2024, with key technologies advancing, we updated our platform to use real-time raytraced rendering of a USD set by Chaos Vantage, real-time AI roto, and real-time compositing.

    CLAROS.

    This is a killer app now, which we call CLAROS: you can shoot virtual production anywhere, and instantly jump to full-featured post-production. We showcased it in an experimental short film called 'Out of a Dream' that you can check out on the site at clarosvp.com, and we'll unveil it at a showroom in Sofia on April 29, and at FMX, May 6-9.

    b&a: How about AI? Tell me what you've been experimenting with and releasing with cairos.ai?

    Hristo Velev: While there's some cool AI in Claros, the one in Cairos is maybe even more fun. It lets you speak to a virtual actor and direct it. It works by letting an AI agent access a semantically encoded database of motion descriptions that is fed by our in-house mocap team, producing about 600 motions a week.

    Cairos in action.

    Conversing with you, the agent produces a list of animations that is then passed to an animation sequencer, which splices them together, retargets to your character, and renders the animation in your browser. When you're happy, you request a download and get a package that you can use downstream in your pipeline. We're in an early phase, working with first adopters to build it up to their needs, so we can open up to the public in a few months.

    b&a: Can you give a very short preview of what you'll be talking about at CGA Belgrade?

    Hristo Velev: I'll give the audience a tour of both Cairos and Claros: the tech stack, how they have evolved on top of traditional VFX tools, incorporating the new wave of AI, real-time rendering and other cutting-edge tech.

    Find out more at https://cgabelgrade.com
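    Velev's description of Cairos, an agent querying a semantically encoded database of motion descriptions and handing an ordered clip list to a sequencer, maps onto a standard embedding-retrieval pattern. The sketch below is a hedged illustration of that pattern only: the embed() function is a crude stand-in for a real text-embedding model, and the motion entries are invented examples, not Cairos data.

```python
# Hedged sketch of semantic motion retrieval: embed text, rank clips by
# cosine similarity, and hand an ordered clip list to a sequencer.
import zlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    # Stand-in embedding: hash character trigrams into a fixed-size vector.
    # A real system would use a learned sentence-embedding model instead.
    v = np.zeros(dim)
    for i in range(len(text) - 2):
        v[zlib.crc32(text[i:i + 3].encode()) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

MOTION_DB = {  # clip name -> description (hypothetical entries)
    "walk_casual_01": "relaxed walk, arms swinging loosely",
    "run_sprint_03":  "fast sprint, leaning forward",
    "jump_land_02":   "jump forward and land in a crouch",
}
DB_VECS = {name: embed(desc) for name, desc in MOTION_DB.items()}

def direct(prompt: str, top_k: int = 2) -> list[str]:
    """Return the clip names best matching a spoken/typed direction."""
    q = embed(prompt)
    ranked = sorted(DB_VECS, key=lambda name: -float(q @ DB_VECS[name]))
    return ranked[:top_k]  # a sequencer would splice and retarget these

print(direct("sprint forward then jump"))
```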
  • The Evolution of 3D Gaussian Splatting in Blender: A Look at the Latest 3DGS Render Addon Update
    beforesandafters.com
    This article explores the latest update of the 3DGS Render Blender Addon, which expands the functionality of 3D Gaussian Splatting in Blender for point cloud workflows. Key points include:

    • Mesh-to-Gaussian Splats conversion: The new release enables conversion of .OBJ files into 3DGS .PLY format, allowing for exclusive point-cloud-based editing of existing mesh models.
    • Exportable Face Edits: Face edits can now be exported, ensuring 3DGS objects retain mesh adjustments for other software or collaborative work.
    • Exportable Transforms: Object transformations like scaling, rotation, and position are preserved during export, addressing a previous limitation.
    • 3DGS Painting and Texturing: Painting and texturing for 3DGS objects are introduced, providing creative freedom and remaining intact through rendering and export.
    • Baking (Experimental): The new baking feature, based on Blender's node bake system, can reduce rendering or playback times for heavy scenes.
    • Notable Minor Improvements: These include 3DGS UV generation, optimized editing workflows, independent LQ/HQ and color edits, a new import method, scene refresh, and a revamped UI.
    • Free and Open-Source: The addon remains free and open-source, promoting community-driven development for wider adoption of 3D Gaussian Splatting.

    3D Gaussian Splatting (3DGS) continues to gain momentum as a compelling approach for visualizing, editing, and animating point clouds in Blender. Now, the developers of the 3DGS Render Blender Addon have unveiled a major update that significantly expands the functionality of this workflow, offering new ways to convert, paint, bake, and export 3DGS objects. In this article, we'll dive deep into the new features, explain how they benefit artists, and touch on why they mark a significant milestone in Blender's ongoing integration of 3D point cloud workflows.

    A Quick Primer on the 3DGS Render Blender Addon

    For those who may be new to the concept, 3D Gaussian Splatting (3DGS) involves representing 3D objects as a constellation of splat points or ellipsoids. This method allows for ultra-fast rendering and editing of point-cloud-based geometry, making it ideal for dense scene visualizations. Over time, 3DGS has evolved to support more features typically found in polygon-based workflows, such as crop editing and render exports. While past versions of the 3DGS Render Addon set the groundwork, this latest update significantly broadens what you can achieve in Blender with point cloud data.

    Key Enhancements

    1. Mesh-to-Gaussian Splats: One of the major highlights of the new release is the ability to convert .OBJ files into 3DGS .PLY format. This streamlined process transforms existing mesh models into 3DGS objects, unlocking exclusive point-cloud-based editing and processing methods. It's particularly useful for applying specialized effects to conventional meshes, as well as unifying file formats across entire scenes.

    2. Exportable Face Edits: Until now, only edits to the point cloud data could be exported via the addon. In this release, you can also export face edits, which makes a big difference when refining or cleaning up geometry. Being able to carry those mesh adjustments out of Blender ensures your 3DGS objects remain true to the changes you've made, whether you need them for other software or for collaborative workflows.

    3. Exportable Transforms: Another limitation that has been addressed is the handling of object transformations: scaling, rotation, and position. With this update, all applied transforms remain when exported, so you no longer lose this crucial information when moving your 3DGS objects to external tools.

    4. 3DGS Painting and Texturing: This update introduces painting and texturing for 3DGS objects, allowing you to color them with a direct brush or image-based textures. These enhancements remain intact through rendering and export, providing a new layer of creative freedom often missing in point-cloud-based workflows.

    5. Baking (Experimental): The new baking feature focuses on performance by locking in modifier effects, rather than recalculating every frame. Built on Blender's node bake system, it can significantly reduce rendering or playback times for heavy scenes, although it remains experimental. Large-scale projects may see considerable benefits but should also be mindful of potential stability issues.

    Notable Minor Improvements

    • 3DGS UV Generation: Automatically create UV maps on import, simplifying shading and animation.
    • All editing modifiers can be added and re-added: Optimized workflows for iterative editing; modifiers can be used multiple times.
    • Independent Low/High Quality and Color Edits: Individual material and color settings are now available for each 3DGS object, rather than one global configuration.
    • New Import Method: Removes reliance on external dependencies, eliminating the warning banner in Blender Preferences.
    • Scene Refresh: Re-initializes scene and object properties, aiding file transfers and preventing setup issues.
    • Revamped UI: Improves mode-switching and provides performance tips, enhancing workflow efficiency.

    Takeaways and Next Steps

    With the addition of mesh conversion, exportable face edits, enhanced transform capabilities, painting and texturing options, and experimental baking, the 3DGS Render Blender Addon demonstrates significant progress in integrating point-cloud techniques with standard 3D workflows. These improvements streamline the user experience, removing external dependencies, refining the interface, and providing greater flexibility for editing and rendering.

    Notably, the addon remains free and open-source, reflecting the developers' belief that breakthroughs in 3DGS result from iterative adaptations and community-driven collaboration. By sharing the project openly, contributors can refine point-cloud workflows and move 3D Gaussian Splatting closer to a widely adopted technique across diverse industries.

    Check out and download the addon for free on Blender Market and GitHub. See KIRI Engine's official update release video.

    Brought to you by KIRI Innovations: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
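    To make the mesh-to-splats idea concrete, here is a hedged Python sketch of the format change at its simplest: .OBJ vertex positions rewritten as a point-cloud .PLY. The addon's actual converter is presumably more sophisticated (sampling faces and fitting full Gaussian attributes such as scale, rotation and SH color); the single opacity property below is an invented placeholder, not the addon's schema.

```python
# Illustrative OBJ-vertices-to-point-cloud-PLY conversion, not the addon's code.

def obj_vertices(path: str):
    """Yield (x, y, z) for each 'v' line in a Wavefront .OBJ file."""
    with open(path) as f:
        for line in f:
            if line.startswith("v "):           # vertex position lines only
                yield tuple(map(float, line.split()[1:4]))

def write_point_ply(vertices, path: str) -> None:
    """Write positions as an ASCII .PLY point cloud with a placeholder opacity."""
    verts = list(vertices)
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(verts)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property float opacity\nend_header\n")
        for x, y, z in verts:
            f.write(f"{x} {y} {z} 1.0\n")       # fully opaque placeholder splat

# Usage (hypothetical file names):
# write_point_ply(obj_vertices("model.obj"), "model_points.ply")
```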
  • Snow White puppeteer breaks down the process for on-set dwarf performance
    beforesandafters.com
    Check out Robin Guiver's great Insta post. Movement and puppetry expert Robin Guiver has posted some fun behind-the-scenes photos from Snow White, showcasing how the seven dwarfs were crafted for the film. It involved some incredible on-set puppetry and choreography (then, of course, MPC animated the dwarfs as CG characters). Highly recommend checking out the Instagram post showcasing the dwarf work and animals.
  • Metaphysic's neural HMC performance injection tech
    beforesandafters.com
    We chat to Jo Plaete from Metaphysic about the use of its neural HMC performance injection tech, including for the character Rook in Alien: Romulus.

    Welcome to the brand new AI and machine learning in VFX season of episodes on the befores & afters podcast. This season is all about where AI and machine learning are being used right now in visual effects. Today I'm joined by Jo Plaete, chief innovation officer and visual effects supervisor from Metaphysic.

    On the podcast: Ian Failes (left) is joined by Metaphysic's Jo Plaete.

    You'll likely be familiar with Metaphysic's work on three big projects from last year: Furiosa, where they worked on the Bullet Farmer; Alien: Romulus, where they helped craft Rook; and Here, where they de-aged (and aged) Tom Hanks and Robin Wright, among several other characters. For those projects, Metaphysic utilized its Neural Performance Toolset, including its Neural Editing and Animation tools.

    Go in-depth on Alien: Romulus in issue #22 of befores & afters magazine.

    For Rook in Alien: Romulus, in particular, Metaphysic also relied on something it coined 'neural HMC performance injection', where the performance of an actor captured in an HMC (head-mounted camera) was also part of the mix of delivering that character. That's what we focus on today, as part of Metaphysic's approach to AI-driven facial animation. Listen in, above.
  • That all-greenscreen version of Sin City rocked my world 20 years ago
    beforesandafters.com
    Celebrating the twentieth anniversary of the Robert Rodriguez film. I remember getting the two-disc DVD collector's edition of Sin City and watching a featurette on there called 'The Movie in High-Speed Green Screen'. It was exactly that: director Robert Rodriguez presented just the raw elements shot at his Troublemaker Studios on greenscreen, sped up about 800 times to form a 10-minute take on the movie.

    This was an amazing thing to see. Along with the DVD's other informative featurettes, I learnt a lot. I certainly wish other filmmakers did something similar on these kinds of films. Of course, Wes Ball recently released the ENTIRE version of his Kingdom of the Planet of the Apes with the raw performance capture plates (called 'Inside the Lens: The Raw Cut') on Blu-ray, and I highly recommend it. Please, let's make this a thing again.

    In the meantime, this week marks the 20th anniversary of Sin City, a film that capitalized appropriately on the digital backlot style of filmmaking and made it work perfectly for Frank Miller's source material. A bevy of effects studios contributed to the film, including KNB EFX Group, Troublemaker Digital Studios, Hybride, CafeFX and The Orphanage. Congrats to all of them, and thank you Robert Rodriguez for bringing Sin City to life, and for being willing to explain and reveal the process.
  • New video sheds light on mocap used for Transformers One
    beforesandafters.com
    Motion capture done at ILM was used to help craft virtual story reels. Read more about the film in issue #25 of befores & afters magazine.
  • On The Set Pic: Snow White
    beforesandafters.com
  • Watch ReDefine's VFX breakdown for Those About To Die
    beforesandafters.com
    Horses, crocodiles and lions!
  • Watch Absolute's VFX breakdown for the one-shot Adolescence series
    beforesandafters.com
    The studio delivered 80 shots across the four episodes, including invisible reflection and shadow removals, environment clean-ups, and helping to enable the camera-through-the-window shot. Watch the breakdown here at Absolute's site.
  • Golaem crowd tools have now been added to the Autodesk M&E Collection
    beforesandafters.com
    Watch the video to find out more.
  • We asked them to render out gameplay via nine different cameras
    beforesandafters.com
    How The Penguin's driving plates included footage from driving around in the video game Gotham Knights. An excerpt from issue #29 of befores & afters magazine.

    The Penguin features a vast amount of driving scenes, which VFX helped to complete via plate shoots and even a virtual driving solution based on a video game. "We really tried to distinctly say we're going from point A to point B, to be clear about the geography," remarks visual effects supervisor Johnny Han. "I said, 'Let's buy the biggest maps we can get.' We put the five boroughs of New York on the wall. We knew we were going to shoot somewhere in these five boroughs. This is before we even knew exactly where our shooting locations would be. But we wanted to be ready to know where to shoot paths for our driving plates."

    Get issue #29 in print at Amazon from your local store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands. Or grab the digital edition at Patreon: https://www.patreon.com/beforesandafters/shop/issue-29-penguin-digital-1313436. You can also subscribe to the DIGITAL MAGAZINE tier at Patreon: https://www.patreon.com/c/beforesandafters

    The can-do attitude of the on-set VFX team came into play as a preparation stage before final driving plates would be filmed, as Han explains. "Our PA had a production rental car, and all we did was strap an Insta360 camera on top. Then we waited until it was about 5pm in New York, when it got dark in wintertime, and we tried out some routes. Basically we were shooting 360-degree video. Erin Sullivan, our VFX editor, put it all together, and then we'd show everyone. She cut it to time, with text at the bottom that represented the dialogue to get the pacing right. We would get sign-off from executive producer Craig Zobel and Lauren in terms of tone, neighborhood and speed of car. We even acted out parts of the scenes. Like, there's a moment in the script where they would stop at a red light and Oz throws a phone out the window. So, we actually integrated those beats into our driving footage."

    When it came time to shoot the real plates, PlatePros was enlisted to drive those predetermined paths and shoot with their multi-camera array. The resulting plates were then played back at Carstage's Long Island City facility for scenes of actors inside vehicles.

    Then, in addition to these live-action plates where New York locations stood in for Gotham, some further driving scenes were realized virtually. "For some scenes," says Han, "we needed a bit more of a richer Gotham, something that felt a little bit more like we're in the city. It can be too hard sometimes to get plates in Manhattan because it's just too crowded. So, we had this idea related to the video game Gotham Knights."

    The idea was to take the open-world game in the Batman ecosystem, which, like HBO, fell under the Warner Bros. Discovery group of companies, and collaborate on rendered-out driving plates that would again be played back at Carstage. "We got a PlayStation 5 and took screengrabs of certain moments as we drove along in the open world," explains Han. "Then we asked Warner Bros. Games to render out gameplay through these driving paths via nine different cameras, as if they were driving plates. They had never done anything like this, and thus were so excited and eager to contribute."

    That process involved some reverse-engineering: starting with the multi-camera array live-action results from PlatePros, and then replicating the lens values, height off the ground and angles for the game footage. "We ran into an interesting problem where," describes Han, "there were some non-deterministic aspects of the game, which is typical of game engines. Some of the traffic lights are randomized, or people crossing the streets are randomized. Or FX like steam from pipes: if you played it twice, it's never exactly the same. We also had them turn off the lens flares from lamp posts, as those were the kinds of things that would happen optically once we shot the actual scenes."

    One major benefit of the approach was that the plates could be orchestrated into a perfect loop. "Since it's a game," notes Han, "we could link right back to the start of the path, so that you had this endless driving plate. In usual driving plates, this is actually often a problem. You never know when you're going to hit the end of your footage from traditional driving plates. You might be in the middle of an important line but then you see the background switch. That was one nice thing we avoided."
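    As a rough illustration of the nine-camera game-plate setup Han describes, the sketch below fans nine camera headings around the vehicle so the full surround is covered for any window the editors cut to. The even 40-degree spacing and the focal length are assumptions for illustration only; the real values were reverse-engineered from the PlatePros array and are not given in the article.

```python
# Hypothetical nine-camera driving-plate rig configuration (not the
# production setup): evenly fanned yaw headings covering 360 degrees.
from dataclasses import dataclass

@dataclass
class PlateCamera:
    name: str
    yaw_deg: float        # heading relative to the car's forward axis
    focal_mm: float = 24  # placeholder lens value, not from the article

def nine_camera_array() -> list[PlateCamera]:
    # Nine cameras at 40-degree steps tile the full surround, so a
    # background plate exists behind every vehicle window.
    return [PlateCamera(f"cam_{i}", yaw_deg=i * 40.0) for i in range(9)]

for cam in nine_camera_array():
    print(cam)
```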
  • First look video for The Legend of Ochi goes behind the scenes of on-set puppetry
    beforesandafters.com
    A24 released the behind the scenes video.
  • Wētā FX used its Loki state machine for muscle simulations on Red Hulk in Captain America: Brave New World
    beforesandafters.com
    Plus, the challenge of realizing the color RED on screen.

    Today on the befores & afters podcast, I'm joined by Wētā FX VFX supervisor Dan Cox and animation supervisor Sidney Kombo-Kintombo, who break down in a lot of detail all the challenges in bringing Red Hulk to life in Captain America: Brave New World.

    A couple of really interesting things they mention: Wētā FX actually took the original Marvel design for Red Hulk and re-modeled and sculpted it to have more Harrison Ford traits, especially the eyes. Wētā FX also had to find ways to distinguish Red Hulk from previous versions of the green Hulk in past films: where that character was like a gorilla, their Red Hulk was originally more like a bear, and then a honey badger in how relentless it became.

    It was also the first time Ford has ever done motion capture on a film, although ultimately Red Hulk was one of the most keyframed characters the studio has done in a long time. Wētā FX looked to a real bodybuilder, brought in for reference, to see specifically how the muscles would move. It was the first film in which the VFX studio used its Loki state machine for muscle simulation. They also built a new muscle set for their generic 'gen-man' to accommodate more defined muscle fibers. The result was a lot of detail and a lot less shot-sculpting: only around 7% of the final shots.

    Another fun thing the pair share is about transformations of Harrison Ford's character into Red Hulk. Wētā FX developed all kinds of different approaches, including breaking bones and skin, just in case they were needed in the film, although it was ultimately a little more subtle than that. Finally, we talk about the challenges of realizing a red character on screen, even to the point of giving accurate red color details to Legacy Effects, which made a reference bust of Red Hulk for shooting.

    Check out the podcast above, and this progression of images showcasing the work.
  • See the original mocap Kid Cosmo test in this new video from The Electric State
    beforesandafters.com
    A test for Kid Cosmo was devised with a little girl. The full video breaks down the motion capture and VFX for the film.
  • Watch this new Squid Game 2 VFX breakdown from Gulliver Studios
    beforesandafters.com
    Shows how large scenes were filmed, and the digital visual effects behind them.
  • Scanline VFX releases new Making of Senna featurette
    beforesandafters.com
    Behind the scenes on the shooting, virtual production and digital VFX work in the series.
  • It was very visceral for the actor to see that light
    beforesandafters.com
    How a new kind of safe flash gun for interactive light, linked to the camera, was invented for The Penguin. An excerpt from issue #29 of befores & afters magazine.

    Many of the blood hits in The Penguin are, of course, inflicted via gunshots. A recent heightening of set safety in relation to firearms has meant that prop guns with charges, and therefore muzzle flashes, are now less used. But in the series, guns were a key story point, and many scenes would be filmed in dark environments. The concern from visual effects supervisor Johnny Han was that there would be little or no interactive light generated on set, since the usual muzzle flashes would not be acquired.

    "In my experience," breaks down Han, "it often ends up a lot more expensive in post to add in interactive light, because artists have to get in there and start painting in light around actor faces and clothing. It's very delicate, fine artist work. Frankly, a lot of movies don't put that interactive light in, and it really doesn't look good, in my opinion."

    The dilemma of interactive on-set lighting for gunshots forced Han to think differently about the issue. "It was the third day of shooting. I brought my photography strobe from home. I got a sound sensor, which is used for sports photography, where it will be triggered by the sound of a ball hitting a baseball bat. We had a prop gun that still went 'pop, pop, pop'. So I figured, let's try the strobe and the sound sensor. We did it for a scene, and it worked! Everyone was like, 'This is so cool.' It was very visceral, not just for us, but for the actor to see that light, and for all parties involved. It just made such a tactile, visceral experience."

    Still, there were some initial challenges, notes Han. When a flash was utilized (and this is also a concern with real muzzle flashes), it could sometimes only hit half the frame because of the action of the camera shutter. "You'll see this on paparazzi footage, where you see bands of light," outlines Han. "This was just not a good look. You had to do multiple takes and get lucky. Also, the lights were always off camera, which didn't always make sense, depending on where the gun was located."

    Original plate with flash gun. Final shot by ReDefine.

    Ultimately, the promise of some real (and useful) interactive lighting was there, and it led to developing what became known as the Phase Synced Flash-Gun System. "We worked with props on this," says Han. "We created our own identical models of the guns. I went through dozens of different light bulbs until I found the brightest thing that, if you charged it with extra oomph from a capacitor, could really give a bright camera-strobe flash. What we ended up doing was developing our own guns with our own custom electronics to create as bright a photography flash at the tip of the gun as possible."

    The flash guns worked by wirelessly communicating with a base device dubbed the 'phaser', which was kept in sync with the camera and timecode-synced. "We could phase-tune the flash until it would cover the frame fully with no banding, and consistently so," describes Han. "So it became, then, a great collaboration between camera and lighting. Camera had to get on board to sync their cameras. And lighting helped colorize the light to the desired look with gels."

    ReDefine handled gunshot hits, adding in the actual fiery muzzle flash cloud, tweaking the color, and adding in other elements like smoke, sparks and shells.

    You can get issue #29 of the magazine in PRINT or DIGITAL.
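    The banding problem Han describes, and why phase tuning fixes it, comes down to shutter timing: with a rolling shutter, a short flash only covers the whole frame if it lands inside every scanline's exposure window. The back-of-envelope Python sketch below illustrates that constraint with hypothetical numbers; it is not the production phaser's logic.

```python
# Why an unsynced strobe gives bands on a rolling-shutter camera, and what
# "phase tuning" has to achieve. All timing values here are hypothetical.

def flash_covers_frame(readout_ms: float, exposure_ms: float,
                       flash_start_ms: float, flash_len_ms: float) -> bool:
    """True if every scanline's exposure window contains the flash.

    Scanline i starts exposing at t = (i / N) * readout_ms and closes
    exposure_ms later. The flash must begin after the last line opens
    (t = readout_ms) and end before the first line closes (t = exposure_ms),
    which is only possible at all when exposure_ms > readout_ms + flash_len_ms.
    """
    last_line_opens = readout_ms
    first_line_closes = exposure_ms
    return (flash_start_ms >= last_line_opens and
            flash_start_ms + flash_len_ms <= first_line_closes)

# 24 fps with a 180-degree shutter gives ~20.8 ms exposure; assume ~10 ms
# sensor readout and a 1 ms strobe.
print(flash_covers_frame(10.0, 20.8, 12.0, 1.0))  # True: full-frame flash
print(flash_covers_frame(10.0, 20.8, 0.5, 1.0))   # False: banding
```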
  • Watch this VFX breakdown for The Electric State
    beforesandafters.com
    Go behind the scenes with Netflix, and also check out this post from Wonder Dynamics: "One of the most tech-intensive films to date, The Electric State, seamlessly blended traditional and modern VFX tools to create a world where humans and robots collide. We joined the project after production wrapped, with actors already filmed in mocap suits. At that time, …" pic.twitter.com/86Zq8f0aEk — Wonder Dynamics (@WonderDynamics), March 20, 2025
  • Watch BlueBolt's VFX breakdown for Nosferatu
    beforesandafters.com
    Go behind the scenes.
  • Issue #29 of befores & afters magazine is a full issue on the VFX of The Penguin
    beforesandafters.com
    All about the latest print magazine, PLUS how to subscribe to the DIGITAL edition!

    Issue #29 of befores & afters magazine in print is now out! It covers the HBO series The Penguin. With production visual effects supervisor Johnny Han and several VFX vendors, we look behind the scenes at the visual effects effort to depict the low and high points of Gotham featured in the series, correlating the look of the series with that of The Batman via lens choices, plus key moments such as a seawall flooding flashback, various blood and gore scenes, the approach to driving shots, and cosmetic VFX work.

    Also, befores & afters magazine now has a DIGITAL EDITION! You can access this exclusively on the befores & afters Patreon, either by buying individual issues or by subscribing to the DIGITAL EDITION tier (see the Membership options). Under that tier, you'll be sent a new issue every time one is released!

    Meanwhile, the magazine will still ALWAYS be in print. Find issue #29 at your local Amazon store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands.
  • An exclusive look at the sessions coming to CGA Belgrade 2025
    beforesandafters.com
    All about the upcoming computer graphics and arts conference, plus how to get in early for tickets.

    befores & afters is proud to be a media partner for CGA Belgrade 2025. The conference is taking place on April 10th and 11th, 2025 in Belgrade, Serbia, and covers a wide area of creative industries topics: animation, games, visual effects, technology, AI and more. Sessions are presented in English. We have an exclusive look at the program for CGA Belgrade. Check it out below.

    MAIN STAGE: KNOW ALL — spotlights on new game, film and animation releases, and new tech presentations. Companies presenting: Nebius (AI video content creation), Wonder Dynamics (movie: The Electric State), Untold Studios, Woodblock, V House Animation (animated series: Agent 023), Golaem (history of crowd systems), Archangel Studio (game: Bleak Faith: Forsaken), Onyx Studio (game: South of Midnight)

    MAIN STAGE: PANELS
    • Changing industry paradigm in a wake of AI (Wonder Dynamics, Nebius, 3Lateral/Epic Games, Golaem)
    • How to work with brands in delivering real value with CGI (Telekom, DAT, ika)
    • Creative Moxie: The Rise of Young Digital Creators

    TECH STAGE: KNOW HOW — looks at new tools and workflows
    • Maya to Unreal Engine USD Workflows (Autodesk)
    • Modeling, rigging, animation and VFX in Bifrost (Autodesk)
    • Procedural world building with Houdini and Unreal (Crater Studio)
    • Get smart with Mari: How to create and reuse Smart Materials (Foundry)
    • The Crowds of Game of Thrones (Golaem/Autodesk)
    • Gaussian splatting in video production (Yandex)

    MASTERCLASSES — 2 hr deep dives into specific industry topics
    • Fortnite Ecosystem: The Next Frontier of Global Branding & Marketing
    • Motion Capture on spot (Centroid)
    • Art of Combat Design (Sperasoft)
    • Interactive and Procedural Environmental Effects (EBB Software)
    • How to properly negotiate for a raise, as an Artist (Steamroller Animation)

    CGA BOARDROOM TALKS — 1.5 hr invitation-only discussions on different industry topics
    • Open-source formats (USD) (Autodesk, SideFX)
    • Houdini User Group
    • Enhancing the role of producers in VFX (Crater Studio, Digitalkraft)
    • Brand marketing in the Metaverse (ika)
    • Legal & security challenges in AI (Moravčević Vojnović i Partneri)
    • DoPs in the VFX: Framing the virtual landscape

    LIST OF SPEAKERS
    Dragana Stamenković (Lead Animator @ Steamroller Animation), Djordje Stojiljkovic (Freelancer/Director of Photography), Mirko Božović (Senior Game Designer @ Sperasoft), Bojana Simić (People Operations Manager @ Materriya Talent Development), Paul Ringue (VFX Content Creator & Producer @ Foundry), Nicolas Chaverou (Principal Technical Product Manager @ Autodesk), Damjan Mitrevski (CEO @ V House Animation), Branimir ugi (Founder and programme director @ Art 365 / CIM forum), Roland Reyer (Technical Sales Specialist @ Autodesk), Luka Budia (VFX Artist @ Ebb Software), Aleksandra Todorović (Senior Producer @ Woodblock), John Paul Giancarlo (Technical Sales Specialist @ Autodesk), Igor Kovačević (Director @ Centroid Serbia), Timon Tomašević (Motion Capture Technician @ Centroid Serbia), Ivica Milarić (Academy of Arts, Novi Sad), Bogdan Amidžić (Technical Director @ Crater Studio)

    HOW TO GET TICKETS TO CGA BELGRADE
    • Regular tickets are 110 EUR until March 28th.
    • A bundle of 5 tickets is 500 EUR until March 28th.
    • Last-minute tickets are 150 EUR.
    Head to the ticket website to grab your tickets now!
  • How Cosmo in The Electric State was made
    beforesandafters.com
    Watch Netflix's new featurette here.
  • Does Bridget Jones: Mad About the Boy have VFX? It sure does
    beforesandafters.com
    Check out Framestore's breakdown of its invisible effects in the film.
  • Why planes flying at hundreds of miles an hour are really tough to pull off in VFX
    beforesandafters.com
    Making a dogfight.

    Today on the befores & afters podcast, we're chatting to Digital Domain about Marvel's new film Captain America: Brave New World, with visual effects supervisor Hanzhi Tang and digital effects supervisor Ryan Duhaime. Digital Domain was principally responsible for the Celestial Island encounter, which includes a very dynamic dogfight featuring Captain America and Falcon.

    In the podcast we talk about how DD took in the original asset of Tiamut from Eternals and actually shrunk it down a little. We also talk about building digital ocean and sky assets, plus a new cloud shader, for the sequence, and going from previs, which DD handled, through to the final VFX of the dogfight and flying scenes. All while dealing with planes that very quickly move away from the point of origin in Maya scenes, making them a tricky task to light and render.

    Check out the previous coverage of Brave New World here at befores & afters, too.
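    The far-from-origin problem is a classic floating-point precision issue: float32 carries roughly seven significant digits, so at around 100 km from the origin the gap between representable positions grows to roughly centimeter scale, which shows up as jitter in lighting and rendering. A common remedy (not necessarily DD's exact pipeline) is to re-anchor the scene to the hero object each frame; a hedged Python sketch:

```python
# Hedged sketch of re-centering a shot on the hero plane each frame so the
# renderer works with small, precise coordinates near the origin.
import numpy as np

def recenter_to_hero(hero_path: np.ndarray,
                     others: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """Subtract the hero's per-frame position from every object's path.

    hero_path: (frames, 3) world positions of the hero plane.
    others:    name -> (frames, 3) world positions of other scene objects.
    Returns hero-relative paths, which stay near the origin all shot long.
    """
    return {name: path - hero_path for name, path in others.items()}

# Hypothetical shot: hero covers 150 m per frame along X (hundreds of mph-ish).
frames = np.arange(100, dtype=np.float64)[:, None]
hero = np.hstack([frames * 150.0, np.zeros_like(frames), np.zeros_like(frames)])
wingman = hero + np.array([30.0, 5.0, -12.0])   # flying in loose formation
local = recenter_to_hero(hero, {"wingman": wingman})
print(local["wingman"][0], local["wingman"][-1])  # small, stable coordinates
```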