Befores & Afters
A brand new visual effects and animation publication from Ian Failes.
Recent Updates
  • CGA Belgrade speaker preview: Hristo Velev
    beforesandafters.com
    The founder of Bottleship VFX will be speaking on his studio, developments in virtual production and AI agent tech.

    CGA Belgrade 2025 is coming up on 10 and 11 April; get your tickets here! In this special preview interview, befores & afters talks to one of the speakers from the conference, Hristo Velev, about his session at the event.

    Velev is a founding partner at Bottleship VFX, a boutique studio specializing in effects sims like water, destruction and fire. He previously worked at Pixomondo, Screen Scene and Scanline VFX on films including Iron Man 3. Here he discusses the history of his own studio, getting into virtual production and AI, and what he'll present at CGA Belgrade.

    b&a: Tell me a little about the history of Bottleship VFX and some of the main recent projects the studio has delivered?

    Hristo Velev: We started in 2013 as a simulation-specialized VFX house. We worked on a globally diverse feature film slate over the years, added creature and environment capacity to our portfolio, and organically grew to about 20 artists. It's a very technically minded team, squeezing out productivity by developing in-house tools that automate as much as possible. Lately, with the industry slump post-COVID, we added software development and virtual production to our offerings.

    [Image: Comandante.]

    b&a: Yes, you've dived into virtual production in recent times. What kinds of things have you been doing and looking at in this area?

    Hristo Velev: We started a few years ago on an Italian feature called Comandante, led by Kevin Haug and Dave Stump. They had the idea of doing virtual production without greenscreens and LED stages, by tracking the camera in real time, rendering the virtual set in Unreal, using AI to roto out the foreground, and compositing for the director to review at the end of the day, then moving quickly to post to present a complete version the next day. We joined as the post house on set, but our responsibilities grew until we were contributing at most points in the workflow.

    It was a great start, and we delivered 36 shots while still on set, but that was just the early days. In 2024, with key technologies advancing, we updated our platform to use real-time raytraced rendering of a USD set by Chaos Vantage, real-time AI roto, and real-time compositing.

    [Image: CLAROS.]

    This is a killer app now, which we call CLAROS: you can shoot virtual production anywhere, and instantly jump to full-featured post-production. We showcased it in an experimental short film called Out of a Dream that you can check out on the site at clarosvp.com, and we'll unveil it at a showroom in Sofia on Apr 29, and at FMX, May 6-9.

    b&a: How about AI? Tell me what you've been experimenting with and releasing with cairos.ai?

    Hristo Velev: While there's some cool AI in Claros, the one in Cairos is maybe even more fun. It lets you speak to a virtual actor and direct it. It works by letting an AI agent access a semantically encoded database of motion descriptions that is fed by our in-house mocap team, producing about 600 motions a week.

    [Image: Cairos in action.]

    Conversing with you, the agent produces a list of animations that is then passed to an animation sequencer that splices them together, retargets to your character, and renders the animation in your browser. When you're happy, you request a download and get a package that you can use downstream in your pipeline. We're in an early phase, working with first adopters to build it up to their needs, so we can open up to the public in a few months.

    b&a: Can you give a very short preview of what you'll be talking about at CGA Belgrade?

    Hristo Velev: I'll give the audience a tour of both Cairos and Claros: the tech stack, how they have evolved on top of traditional VFX tools, incorporating the new wave of AI, real-time rendering and other cutting-edge tech.

    Find out more at https://cgabelgrade.com

    The post CGA Belgrade speaker preview: Hristo Velev appeared first on befores & afters.
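Cairos' internals aren't public, but the agent-plus-motion-database pattern Velev describes (retrieve the closest motion descriptions for a prompt, then splice the clips) can be sketched in a few lines of Python. Everything here is hypothetical: the toy bag-of-words encoding stands in for real learned embeddings, and the clip names and sequencer are invented for illustration.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "semantic" encoding; a real system would use
    # learned sentence embeddings over the motion descriptions.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical motion database, fed by a mocap team's clip descriptions.
MOTION_DB = {
    "walk_forward_01": "casual walk forward at a relaxed pace",
    "run_sprint_02": "fast sprint run forward",
    "wave_hello_01": "stand and wave hello with right hand",
    "sit_down_01": "sit down slowly on a chair",
}

def retrieve_motions(prompt: str, k: int = 2) -> list[str]:
    """Rank motion clips by similarity to the director's prompt."""
    q = embed(prompt)
    ranked = sorted(MOTION_DB,
                    key=lambda m: cosine(q, embed(MOTION_DB[m])),
                    reverse=True)
    return ranked[:k]

def sequence(motions: list[str]) -> str:
    # Stand-in for the animation sequencer that splices clips together
    # and retargets them to the user's character.
    return " -> ".join(motions)

clips = retrieve_motions("walk forward and wave hello")
print(sequence(clips))
```

In a production system the retrieval step would run inside the conversational agent, and the sequencer would output retargeted animation data rather than a string.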
  • The Evolution of 3D Gaussian Splatting in Blender: A Look at the Latest 3DGS Render Addon Update
    beforesandafters.com
    This article explores the latest update of the 3DGS Render Blender Addon, which expands the functionality of 3D Gaussian Splatting in Blender for point cloud workflows. Key points include:

    - Mesh-to-Gaussian Splats conversion: The new release enables conversion of .OBJ files into 3DGS .PLY format, allowing for exclusive point-cloud-based editing of existing mesh models.
    - Exportable Face Edits: Face edits can now be exported, ensuring 3DGS objects retain mesh adjustments for other software or collaborative work.
    - Exportable Transforms: Object transformations like scaling, rotation and position are preserved during export, addressing a previous limitation.
    - 3DGS Painting and Texturing: Painting and texturing for 3DGS objects are introduced, providing creative freedom and remaining intact through rendering and export.
    - Baking (Experimental): The new baking feature, based on Blender's node bake system, can reduce rendering or playback times for heavy scenes.
    - Notable Minor Improvements: These include 3DGS UV generation, optimized editing workflows, independent LQ/HQ and color edits, a new import method, scene refresh, and a revamped UI.
    - Free and Open-Source: The addon remains free and open-source, promoting community-driven development for wider adoption of 3D Gaussian Splatting.

    3D Gaussian Splatting (3DGS) continues to gain momentum as a compelling approach for visualizing, editing, and animating point clouds in Blender. Now, the developers of the 3DGS Render Blender Addon have unveiled a major update that significantly expands the functionality of this workflow, offering new ways to convert, paint, bake, and export 3DGS objects. In this article, we'll dive deep into the new features, explain how they benefit artists, and touch on why they mark a significant milestone in Blender's ongoing integration of 3D point cloud workflows.

    A Quick Primer on the 3DGS Render Blender Addon

    For those who may be new to the concept, 3D Gaussian Splatting (3DGS) involves representing 3D objects as a constellation of splat points or ellipsoids. This method allows for ultra-fast rendering and editing of point-cloud-based geometry, making it ideal for dense scene visualizations. Over time, 3DGS has evolved to support more features typically found in polygon-based workflows, such as crop-editing and render exports. While past versions of the 3DGS Render Addon set the groundwork, this latest update significantly broadens what you can achieve in Blender with point cloud data.

    Key Enhancements

    1. Mesh-to-Gaussian Splats
    One of the major highlights of the new release is the ability to convert .OBJ files into 3DGS .PLY format. This streamlined process transforms existing mesh models into 3DGS objects, unlocking exclusive point-cloud-based editing and processing methods. It's particularly useful for applying specialized effects to conventional meshes, as well as for unifying file formats across entire scenes.

    2. Exportable Face Edits
    Until now, only edits to the point cloud data could be exported via the addon. In this release, you can also export face edits, which makes a big difference when refining or cleaning up geometry. Being able to carry those mesh adjustments out of Blender ensures your 3DGS objects remain true to the changes you've made, whether you need them for other software or for collaborative workflows.

    3. Exportable Transforms
    Another limitation that has been addressed is the handling of object transformations: scaling, rotation, and position. With this update, all applied transforms remain when exported, so you no longer lose this crucial information when moving your 3DGS objects to external tools.

    4. 3DGS Painting and Texturing
    This update introduces painting and texturing for 3DGS objects, allowing you to color them with a direct brush or image-based textures. These enhancements remain intact through rendering and export, providing a new layer of creative freedom often missing in point-cloud-based workflows.

    5. Baking (Experimental)
    The new baking feature focuses on performance by locking in modifier effects, rather than recalculating every frame. Built on Blender's node bake system, it can significantly reduce rendering or playback times for heavy scenes, although it remains experimental. Large-scale projects may see considerable benefits but should also be mindful of potential stability issues.

    Notable Minor Improvements

    - 3DGS UV Generation: Automatically create UV maps on import, simplifying shading and animation.
    - Re-addable editing modifiers: Workflows are optimized for iterative editing, and modifiers can be used multiple times.
    - Independent Low/High Quality and Color Edits: Individual material and color settings are now available for each 3DGS object, rather than one global configuration.
    - New Import Method: Removes reliance on external dependencies, eliminating the warning banner in Blender Preferences.
    - Scene Refresh: Re-initializes scene and object properties, aiding file transfers and preventing setup issues.
    - Revamped UI: Improves mode-switching and provides performance tips, enhancing workflow efficiency.

    Takeaways and Next Steps

    With the addition of mesh conversion, exportable face edits, enhanced transform capabilities, painting and texturing options, and experimental baking, the 3DGS Render Blender Addon demonstrates significant progress in integrating point-cloud techniques with standard 3D workflows. These improvements streamline the user experience, removing external dependencies, refining the interface, and providing greater flexibility for editing and rendering.

    Notably, the addon remains free and open-source, reflecting the developers' belief that breakthroughs in 3DGS result from iterative adaptations and community-driven collaboration. By sharing the project openly, contributors can refine point-cloud workflows and move 3D Gaussian Splatting closer to a widely adopted technique across diverse industries.

    Check out and download the addon for free on Blender Market and GitHub. See KIRI Engine's official Update Release Video.

    Brought to you by KIRI Innovations: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.

    The post The Evolution of 3D Gaussian Splatting in Blender: A Look at the Latest 3DGS Render Addon Update appeared first on befores & afters.
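The addon's actual converter isn't shown in the article, and real 3DGS .PLY files carry many extra per-splat attributes (scale, rotation, spherical-harmonic color coefficients), but the core idea of mesh-to-point-cloud conversion can be sketched in plain Python. The .OBJ parsing and the minimal ASCII PLY header are standard; everything else is an illustrative simplification.

```python
def obj_vertices(obj_text: str) -> list[tuple[float, float, float]]:
    """Parse 'v x y z' lines from a Wavefront .OBJ string."""
    verts = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            verts.append(tuple(float(p) for p in parts[1:4]))
    return verts

def to_ascii_ply(verts) -> str:
    """Write vertices as a minimal ASCII .PLY point cloud.

    A real 3DGS exporter would declare extra properties per splat
    (opacity, scale, rotation, SH coefficients) in this header.
    """
    header = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(verts)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
    ]
    body = [f"{x} {y} {z}" for x, y, z in verts]
    return "\n".join(header + body)

# A tiny triangle mesh in .OBJ form; face connectivity is discarded,
# which is exactly what makes the result a pure point cloud.
obj = """\
v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""
print(to_ascii_ply(obj_vertices(obj)))
```

Subsampling or resampling the mesh surface (rather than taking vertices verbatim) would give a denser, more splat-friendly cloud; the vertex-only version above is just the smallest working example.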
  • Snow White puppeteer breaks down the process for on-set dwarf performance
    beforesandafters.com
    Check out Robin Guiver's great Insta post. Movement and puppetry expert Robin Guiver has posted some fun behind-the-scenes photos from Snow White, showcasing how the seven dwarfs were crafted for the film. It involved some incredible on-set puppetry and choreography (then, of course, MPC animated the dwarfs as CG characters). Highly recommend checking out the Instagram post showcasing dwarf work and animals.

    [Instagram embed: a post shared by Robin Guiver (@robinguiver).]

    The post Snow White puppeteer breaks down the process for on-set dwarf performance appeared first on befores & afters.
  • Metaphysic's neural HMC performance injection tech
    beforesandafters.com
    We chat to Jo Plaete from Metaphysic about the use of its neural HMC performance injection tech, including for the character Rook in Alien: Romulus.

    Welcome to the brand new AI and machine learning in VFX season of episodes on the befores & afters podcast. This season is all about where AI and machine learning are being used right now in visual effects. Today I'm joined by Jo Plaete, chief innovation officer and visual effects supervisor from Metaphysic.

    [Image: On the podcast, Ian Failes (left) is joined by Metaphysic's Jo Plaete.]

    You'll likely be familiar with Metaphysic's work on three big projects from last year: Furiosa, where they worked on the Bullet Farmer; Alien: Romulus, where they helped craft Rook; and Here, where they de-aged (and aged) Tom Hanks and Robin Wright, among several other characters. For those projects, Metaphysic utilized its Neural Performance Toolset, including its Neural Editing and Animation tools.

    Go in-depth on Alien: Romulus in issue #22 of befores & afters magazine.

    For Rook in Alien: Romulus, in particular, Metaphysic also relied on something it coined 'neural HMC performance injection', where the performance of an actor captured in an HMC was also part of the mix of delivering that character. That's what we focus on today, as part of Metaphysic's approach to AI-driven facial animation.

    Listen in, above.

    The post Metaphysic's neural HMC performance injection tech appeared first on befores & afters.
  • That all-greenscreen version of Sin City rocked my world 20 years ago
    beforesandafters.com
    Celebrating the two-decade anniversary of the Robert Rodriguez film. I remember getting the two-disc DVD collector's edition of Sin City and watching a featurette on there called 'The Movie in High-Speed Green Screen.' It was exactly that: director Robert Rodriguez presented just the raw elements shot at his Troublemaker Studios on greenscreen, sped up about 800 times to form a 10-minute take on the movie.

    This was an amazing thing to see. Along with the DVD's other informative featurettes, I learnt a lot. I certainly wish other filmmakers did something similar on these kinds of films. Of course, Wes Ball recently released the ENTIRE version of his Kingdom of the Planet of the Apes with the raw performance capture plates (called 'Inside the Lens: The Raw Cut') on Blu-ray, and I highly recommend it. Please, let's make this a thing again.

    In the meantime, this week marks the 20th anniversary of Sin City, a film that capitalized on the digital backlot style of filmmaking and made it work perfectly for Frank Miller's source material. A bevy of effects studios contributed to the film, including KNB EFX Group, Troublemaker Digital Studios, Hybride, CafeFX and The Orphanage. Congrats to all of them, and thank you Robert Rodriguez for bringing Sin City to life, and for being willing to explain and reveal the process.

    The post That all-greenscreen version of Sin City rocked my world 20 years ago appeared first on befores & afters.
  • New video sheds light on mocap used for Transformers One
    beforesandafters.com
    Motion capture done at ILM was used to help craft virtual story reels. Read more about the film in issue #25 of befores & afters magazine. The post New video sheds light on mocap used for Transformers One appeared first on befores & afters.
  • On The Set Pic: Snow White
    beforesandafters.com
    The post On The Set Pic: Snow White appeared first on befores & afters.
  • Watch ReDefine's VFX breakdown for Those About To Die
    beforesandafters.com
    Horses, crocodiles and lions! The post Watch ReDefine's VFX breakdown for Those About To Die appeared first on befores & afters.
  • Watch Absolute's VFX breakdown for the one-shot Adolescence series
    beforesandafters.com
    The studio delivered 80 shots across the four episodes, including invisible reflection and shadow removals, environment clean-ups, and helping to enable that 'camera through the window' shot. Watch the breakdown here at Absolute's site. The post Watch Absolute's VFX breakdown for the one-shot Adolescence series appeared first on befores & afters.
  • Golaem crowd tools have now been added to the Autodesk M&E Collection
    beforesandafters.com
    Watch the video to find out more. The post Golaem crowd tools have now been added to the Autodesk M&E Collection appeared first on befores & afters.
  • We asked them to render out gameplay via nine different cameras
    beforesandafters.com
    How The Penguin's driving plates included footage from driving around in the video game Gotham Knights. An excerpt from issue #29 of befores & afters magazine.

    The Penguin features a vast amount of driving scenes, which VFX helped to complete via plate shoots and even a virtual driving solution based on a video game. "We really tried to really distinctly say we're going from point A to point B, to be clear about the geography," remarks visual effects supervisor Johnny Han. "I said, 'Let's buy the biggest maps we can get.' We put on the wall the five boroughs of New York. We knew we were going to shoot somewhere in these five boroughs. This is before we even knew exactly where our shooting locations would be. But we wanted to be ready to know where to shoot paths for our driving plates."

    Get issue #29 in print at Amazon from your local store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands. Or grab the digital edition at Patreon: https://www.patreon.com/beforesandafters/shop/issue-29-penguin-digital-1313436. You can also subscribe to the DIGITAL MAGAZINE tier at Patreon: https://www.patreon.com/c/beforesandafters

    The can-do attitude of the on-set VFX team came into play as a preparation stage before final driving plates would be filmed, as Han explains. "Our PA had a production rental car, and all we did was strap an Insta360 camera on top. Then we waited until it was about 5pm in New York, when it got dark in wintertime, and we tried out some routes. Basically we were shooting 360 degree video. Erin Sullivan, our VFX editor, put it all together, and then we'd show everyone. She cut it to time with text at the bottom that represented the dialogue to get the pacing right. We would get sign-off from executive producer Craig Zobel and Lauren in terms of tone, neighborhood and speed of car. We even acted out parts of the scenes. Like, there's a moment in the script where they would stop at a red light and Oz throws a phone out the window. So, we actually integrated those beats into our driving footage."

    When it came time to shoot the real plates, PlatePros was enlisted to drive those predetermined paths and shoot with their multi-camera array. The resulting plates were then played back at Carstage's Long Island City facility for scenes of actors inside vehicles.

    Then, in addition to these live-action plates where New York locations stood in for Gotham, some further driving scenes were realized that were virtual. "For some scenes," says Han, "we needed a bit more of a richer Gotham, something that felt a little bit more like we're in the city. It can be too hard sometimes to get plates in Manhattan because it's just too crowded. So, we had this idea related to the video game, Gotham Knights."

    The idea was to take the open world game in the Batman ecosystem (which, like HBO, fell under the Warner Bros. Discovery group of companies) and collaborate in terms of rendered-out driving plates that would again be played at Carstage. "We got a PlayStation 5 and took screengrabs of certain moments as we drove along in the open world," explains Han. "Then we asked Warner Bros. Games to render out gameplay through these driving paths via nine different cameras, as if they were driving plates. They had never done anything like this, and thus were so excited and eager to contribute."

    That process involved some reverse-engineering: starting with the multi-camera array live-action results from PlatePros, and then replicating the lens values, height of the ground and angles for the game footage. "We ran into an interesting problem where," describes Han, "there were some non-deterministic aspects of the game, which is typical of game engines. Some of the traffic lights are randomized or people crossing the streets are randomized. Or, FX like steam from pipes: if you played it twice, it's never exactly the same. We also had them turn off the lens flares from lamp posts, as those were the kinds of things that would happen optically once we shot the actual scenes."

    One major benefit of the approach was that the plates could be orchestrated into a perfect loop. "Since it's a game," notes Han, "we could link in right to the start of the path, so that you had this endless driving plate. In usual driving plates, this is actually often a problem. You never know when you're going to hit the end of your footage from traditional driving plates. You might be in the middle of an important line but then you see the background switch. That was one nice thing we avoided."

    The post We asked them to render out gameplay via nine different cameras appeared first on befores & afters.
  • First look video for The Legend of Ochi goes behind the scenes of on-set puppetry
    beforesandafters.com
    A24 released the behind-the-scenes video. The post First look video for The Legend of Ochi goes behind the scenes of on-set puppetry appeared first on befores & afters.
  • Wētā FX used its Loki state machine for muscle simulations on Red Hulk in Captain America: Brave New World
    beforesandafters.com
    Plus, the challenge of realizing the color RED on screen.

    Today on the befores & afters podcast, I'm joined by Wētā FX VFX supervisor Dan Cox and animation supervisor Sidney Kombo-Kintombo, who break down in a lot of detail all the challenges in bringing Red Hulk in Captain America: Brave New World to life.

    A couple of really interesting things they mention include the fact that Wētā FX actually took the original Marvel design for Red Hulk and re-modeled and sculpted it to have more Harrison Ford traits, especially the eyes. Wētā FX also had to find ways to distinguish this character from previous versions of the green Hulk in past films: where that character was like a gorilla, their Red Hulk was originally more like a bear, and then a honey badger in how relentless it became.

    It was also the first time Ford has ever done motion capture on a film, although ultimately Red Hulk was one of the most keyframed characters done in a long time at the studio. Wētā FX looked to a real bodybuilder brought in for reference to see specifically how the muscles would move. It was the first film in which the VFX studio used its Loki state machine for muscle simulation. They also built a new muscle set for their generic 'gen-man' to accommodate more defined muscle fibers. The result was a lot of detail and a lot less shot-sculpting: only around 7% of the final shots.

    Another fun thing the pair share is about the transformations of Harrison Ford's character into Red Hulk. Wētā FX developed all kinds of different approaches, including breaking bones and skin, just in case they were needed in the film, although it was ultimately a little more subtle than that. Finally, we talk about the challenges of realizing a red character on screen, even to the point of giving accurate red color details to Legacy Effects, which made a reference bust of Red Hulk for shooting.

    Check out the podcast above, and this progression of images showcasing the work.

    [Click to view slideshow.]

    The post Wētā FX used its Loki state machine for muscle simulations on Red Hulk in Captain America: Brave New World appeared first on befores & afters.
  • See the original mocap Kid Cosmo test in this new video from The Electric State
    beforesandafters.com
    A test for Kid Cosmo was devised with a little girl. The full video breaks down the motion capture and VFX for the film. The post See the original mocap Kid Cosmo test in this new video from The Electric State appeared first on befores & afters.
  • Watch this new Squid Game 2 VFX breakdown from Gulliver Studios
    beforesandafters.com
    Shows how large scenes were filmed, and the digital visual effects behind them. The post Watch this new Squid Game 2 VFX breakdown from Gulliver Studios appeared first on befores & afters.
  • Scanline VFX releases new Making of Senna featurette
    beforesandafters.com
    Behind the scenes on the shooting, virtual production and digital VFX work in the series. The post Scanline VFX releases new Making of Senna featurette appeared first on befores & afters.
  • It was very visceral for the actor to see that light
    beforesandafters.com
    How a new kind of safe flash gun for interactive light, linked to the camera, was invented for The Penguin. An excerpt from issue #29 of befores & afters magazine.

    Many of the blood hits in The Penguin are, of course, inflicted via gunshots. Recent heightening of set safety in relation to firearms has meant that prop guns with charges (and therefore muzzle flashes) are now less used. But in the series, guns were a key story point, and many scenes would be filmed in dark environments. The concern from visual effects supervisor Johnny Han was that there would be little or no interactive light generated on set, since the usual muzzle flashes would not be acquired.

    "In my experience," breaks down Han, "it often ends up a lot more expensive in post to add in interactive light, because artists have to get in there and start painting in light around actor faces and clothing. It's very delicate, fine artist work. Frankly, a lot of movies don't put that interactive light in and it really doesn't look good, in my opinion."

    The dilemma of interactive on-set lighting for gunshots forced Han to think differently about the issue. "It was the third day of shooting. I brought my photography strobe from home. I got a sound sensor, which is used for sports photography, where it will be triggered by the sound of a ball hitting a baseball bat. We had a prop gun that still went 'Pop, pop, pop.' So I figured, let's try the strobe and the sound sensor. We did it for a scene, and it worked! Everyone was like, 'This is so cool.' It was very visceral, not just for us, but for the actor to see that light and for all parties involved. It just made such a tactile, visceral experience."

    Still, there were some initial challenges, notes Han. When a flash was utilized (and this is also a concern with real muzzle flashes) it can sometimes only hit onto half the frame because of the action of the camera shutter. "You'll see this on paparazzi footage where you see bands of light," outlines Han. "This was just not a good look. You had to do multiple takes and get lucky. Also, the lights were always off camera, which didn't always make sense, depending on where the gun was located."

    [Images: Original plate with flash gun. Final shot by ReDefine.]

    Ultimately, the promise of some real (and useful) interactive lighting was there, and led to developing what became known as the Phase Synced Flash-Gun System. "We worked with props on this," says Han. "We created our own identical models of the guns. I went through dozens of different light bulbs until I found the brightest thing that, if you charged it with extra oomph from a capacitor, you could really get a bright camera strobe flash. What we ended up doing was developing our own guns with our own custom electronics to create as bright a photography flash at the tip of the gun as possible."

    The flash guns worked by wirelessly communicating with a base device dubbed the 'phaser', which was also in sync with the camera and timecode-synced. "We could phase tune the flash until it would cover the frame fully with no banding, and consistently so," describes Han. "So it became, then, a great collaboration between camera and lighting. Camera had to get on board to sync their cameras. And lighting helped colorize the light to the desired look with gels."

    ReDefine handled gunshot hits, adding in the actual fiery muzzle flash cloud, tweaking the color, and adding in other elements like smoke, sparks and shells.

    You can get issue #29 of the magazine in PRINT or DIGITAL.

    The post It was very visceral for the actor to see that light appeared first on befores & afters.
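The banding Han describes has a simple geometric cause: with a rolling shutter, each sensor row exposes during a slightly later time window, so a flash shorter than the readout lights only some rows. A back-of-the-envelope model (illustrative numbers only, not production values, and ignoring flash rise time) shows why phase-tuning the flash against the shutter fixes it:

```python
def flash_covers_frame(flash_start_ms: float, flash_dur_ms: float,
                       exposure_ms: float, readout_ms: float) -> bool:
    """True if every sensor row's exposure window overlaps the flash.

    Simplified rolling-shutter model: row r (normalized 0..1) exposes
    during [r * readout, r * readout + exposure]; the flash is lit
    during [flash_start, flash_start + flash_dur].
    """
    # Worst cases are the first row (flash must not fire after its
    # exposure ends) and the last row (flash must still be lit when
    # that row starts exposing).
    first_row_ok = flash_start_ms <= exposure_ms
    last_row_ok = flash_start_ms + flash_dur_ms >= readout_ms
    return first_row_ok and last_row_ok

# Example: ~20.8 ms exposure (1/48 s shutter at 24 fps), 10 ms readout,
# 0.2 ms strobe pop.
print(flash_covers_frame(0.0, 0.2, 20.8, 10.0))   # fired at frame start: banding
print(flash_covers_frame(10.0, 0.2, 20.8, 10.0))  # phase-tuned: covers all rows
```

The second call succeeds because a short flash delayed past the last row's exposure start falls inside every row's window, which is essentially what phase-tuning against camera sync achieves.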
  • Watch this VFX breakdown for The Electric State
    beforesandafters.com
    Go behind the scenes with Netflix, and also check out this post from Wonder Dynamics: "One of the most tech-intensive films to date, The Electric State, seamlessly blended traditional and modern VFX tools to create a world where humans and robots collide. We joined the project after production wrapped, with actors already filmed in mocap suits. At that time," pic.twitter.com/86Zq8f0aEk (Wonder Dynamics, @WonderDynamics, March 20, 2025)

    The post Watch this VFX breakdown for The Electric State appeared first on befores & afters.
  • Watch BlueBolt's VFX breakdown for Nosferatu
    beforesandafters.com
    Go behind the scenes. The post Watch BlueBolt's VFX breakdown for Nosferatu appeared first on befores & afters.
  • Issue #29 of befores & afters magazine is a full issue on the VFX of The Penguin
    beforesandafters.com
    All about the latest print magazine, PLUS, how to subscribe to the DIGITAL edition!

    Issue #29 of befores & afters magazine in print is now out! It covers the HBO series The Penguin. With production visual effects supervisor Johnny Han and several VFX vendors, we look behind the scenes at the visual effects effort to depict the low and high points of Gotham featured in the series, correlating the look of the series with that of The Batman via lens choices, plus key moments such as a seawall flooding flashback, various blood and gore scenes, the approach to driving shots, and cosmetic VFX work.

    Also, befores & afters magazine now has a DIGITAL EDITION! You can access this exclusively on the befores & afters Patreon, either by buying individual issues, or by subscribing to the DIGITAL EDITION tier (see the Membership options). Under that tier, you'll be sent a new issue every time one is released!

    Meanwhile, the magazine will still ALWAYS be in print. Find issue #29 at your local Amazon store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands.

    The post Issue #29 of befores & afters magazine is a full issue on the VFX of The Penguin appeared first on befores & afters.
  • An exclusive look at the sessions coming to CGA Belgrade 2025
    beforesandafters.com
    All about the upcoming computer graphics and arts conference. Plus, how to get in early for tickets.

    befores & afters is proud to be a media partner for CGA Belgrade 2025. The conference is taking place on April 10th and 11th, 2025 in Belgrade, Serbia, and covers a wide range of creative industries topics: animation, games, visual effects, technology, AI and more. Sessions are presented in English. We have an exclusive look at the program for CGA Belgrade. Check it out below.

    MAIN STAGE: KNOW ALL
    This will consist of spotlights on new game, film and animation releases, and new tech presentations. Companies presenting: Nebius (AI video content creation), Wonder Dynamics (movie: The Electric State), Untold Studios, Woodblock, V House Animation (animated series: Agent 023), Golaem (history of crowd systems), Archangel Studio (game: Bleak Faith: Forsaken), Onyx Studio (game: South of Midnight).

    MAIN STAGE: PANELS
    - Changing industry paradigm in a wake of AI (Wonder Dynamics, Nebius, 3Lateral/Epic Games, Golaem)
    - How to work with brands in delivering real value with CGI (Telekom, DAT, ika)
    - Creative Moxie: The Rise of Young Digital Creators

    TECH STAGE: KNOW HOW
    This will consist of looks at new tools and workflows.
    - Maya to Unreal Engine USD Workflows (Autodesk)
    - Modeling, rigging, animation and VFX in Bifrost (Autodesk)
    - Procedural world building with Houdini and Unreal (Crater Studio)
    - Get smart with Mari: How to create and reuse Smart Materials (Foundry)
    - The Crowds of Game Of Thrones (Golaem/Autodesk)
    - Gaussian splatting in video production (Yandex)

    MASTERCLASSES
    These will consist of 2 hr deep dives into specific industry topics.
    - Fortnite Ecosystem: The Next Frontier of Global Branding Marketing
    - Motion Capture on spot (Centroid)
    - Art of Combat Design (Sperasoft)
    - Interactive and Procedural Environmental Effects (EBB Software)
    - How to properly negotiate for a raise, as an Artist (Steamroller Animation)

    CGA BOARDROOM TALKS
    These are 1.5 hr invitation-only discussions on different industry topics.
    - Open-source formats (USD) (Autodesk, SideFX)
    - Houdini User Group
    - Enhancing the role of producers in VFX (Crater Studio, Digitalkraft)
    - Brand marketing in the Metaverse (ika)
    - Legal & security challenges in AI (Moravevi Vojnovi i Partneri)
    - DoPs in the VFX: Framing the virtual landscape

    LIST OF SPEAKERS
    Dragana Stamenkovi (Lead Animator @ Steamroller Animation), Djordje Stojiljkovic (Freelancer/Director of Photography), Mirko Boovi (Senior Game Designer @ Sperasoft), Bojana Simi (People Operations manager @ Materriya Talent Development), Paul Ringue (VFX Content Creator & Producer @ Foundry), Nicolas Chaverou (Principal Technical Product Manager @ Autodesk), Damjan Mitrevski (CEO @ V House Animation), Branimir ugi (Founder and programme director @ Art 365 / CIM forum), Roland Reyer (Technical Sales Specialist @ Autodesk), Luka Budia (VFX Artist @ Ebb Software), Aleksandra Todorovi (Senior producer @ Woodblock), John Paul Giancarlo (Technical Sales Specialist @ Autodesk), Igor Kovaevi (Director @ Centroid Serbia), Timon Tomaevi (Motion Capture Technician @ Centroid Serbia), Ivica Milari (Academy of Arts, Novi Sad), Bogdan Amidi (Technical Director @ Crater Studio)

    HOW TO GET TICKETS TO CGA BELGRADE
    - Regular tickets are 110 EUR until March 28th.
    - A bundle of 5 tickets is 500 EUR until March 28th.
    - Last minute tickets are 150 EUR.
    Head to the ticket website to grab your tickets now!

    The post An exclusive look at the sessions coming to CGA Belgrade 2025 appeared first on befores & afters.
  • How Cosmo in The Electric State was made
    beforesandafters.com
    Watch Netflix's new featurette, which you can see here.
  • Does Bridget Jones: Mad About the Boy have VFX? It sure does
    beforesandafters.com
    Check out Framestore's breakdown of its invisible effects in the film.
  • Why planes flying at hundreds of miles an hour is really tough to pull off in VFX
    beforesandafters.com
    Making a dogfight.

Today on the befores & afters podcast, we're chatting to Digital Domain about Marvel's new film Captain America: Brave New World with visual effects supervisor Hanzhi Tang and digital effects supervisor Ryan Duhaime. Digital Domain was principally responsible for the Celestial Island encounter, which includes a very dynamic dogfight featuring Captain America and Falcon.

In the podcast we talk about how DD took in the original asset of Tiamut from Eternals and actually shrunk it down a little. We also talk about building digital ocean and sky assets, plus a new cloud shader, for the sequence, and going from previs (which DD handled) through to the final VFX of the dogfight and flying scenes, all while dealing with planes that very quickly move away from the point of origin in Maya scenes, making them a tricky task to light and render.

Check out the previous coverage of Brave New World here at befores & afters, too.
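The origin problem the podcast touches on is, at bottom, a floating-point precision issue: 32-bit floats carry a fixed number of significand bits, so the farther geometry travels from the scene origin, the coarser any positions, normals or matrices that pass through single precision become. A minimal sketch of the effect using only the Python standard library (my illustration, not something from the podcast; the helper name is made up):

```python
import struct

def float32_ulp(x: float) -> float:
    """Gap between x (rounded to float32) and the next larger float32 value."""
    # Round x to the nearest float32, then bump its bit pattern by one
    # to reach the adjacent representable value.
    base = struct.unpack("<f", struct.pack("<f", x))[0]
    bits = struct.unpack("<I", struct.pack("<f", base))[0]
    nxt = struct.unpack("<f", struct.pack("<I", bits + 1))[0]
    return nxt - base

print(float32_ulp(1.0))          # 1.1920928955078125e-07 near the origin
print(float32_ulp(1_000_000.0))  # 0.0625 a million units out
```

At film scale this bites quickly: a million units from the origin, adjacent representable float32 positions are already a sixteenth of a unit apart, which shows up as jittering geometry, shadows and motion blur. A common workaround is to keep re-centering the action near the origin rather than letting it fly away.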
  • Imageworks has posted a bunch of Red One breakdowns online
    beforesandafters.com
    Go behind the scenes.
  • Riding the Rollercoaster: How Scenario Planning Keeps Creative Businesses on Track
    beforesandafters.com
    The creative industries (film, VFX, animation, games, advertising) are a thrilling rollercoaster. One minute you're soaring high, riding the wave of a successful project or a booming market. The next, you're plummeting into the unknown, facing unexpected client changes, budget cuts, or industry-wide disruptions.

The Fundamental Challenge: Navigating Constant Change
Creative businesses (and their projects) rarely progress according to a single linear plan. When a key client pushes their timeline, when a critical team member becomes unavailable, or when scope expands mid-project, the ripple effects cascade across your entire operation. Unfortunately, most project planning tools do not address the key hypotheticals that drive your business decisions every day. They force you to either maintain multiple separate plans or make changes that overwrite your original assumptions, leaving you without a clear picture of alternatives or their implications. What creative businesses need isn't just project planning; it's the ability to model multiple scenarios as conditions inevitably change.

What is Scenario Planning?
Scenario planning is about creating what-if scenarios to explore different possible futures. It's about testing ideas, analyzing potential outcomes, and making informed decisions based on data, not guesswork.

The Old Way: Spreadsheets and Stress
For decades, creative studios have relied on spreadsheets for their planning. These trusty tools, while familiar, are often a source of stress and inaccuracy. Data entry errors, outdated information, and complex formulas that break under pressure: it's a recipe for sleepless nights and risky decisions based on hunches rather than actual data.

The New Way: Scenario Planning with Projectal
Now imagine a world where you could confidently navigate the ups and downs, predict potential pitfalls, and seize new opportunities with clarity.
That's the power of scenario planning, and Projectal is a tool that makes it a reality for creative businesses.

Why is Scenario Planning Crucial for Creative Businesses?
Visibility: Get a real-time, accurate picture of your studio's current state: projects, departments, budgets, and staff.
Agility: Respond quickly to client changes, market fluctuations, and unexpected challenges.
Confidence: Make informed decisions with data-driven insights, reducing stress and anxiety.
Collaboration: Bring your team together to contribute to strategic planning and build consensus.
Innovation: Explore new markets, test new business models, and stay ahead of the competition.

"Change is the only constant in VFX production, making scenario planning essential for success," says Imke Fehrmann, Global Head of Production at RISE Visual Effects Studios. "Projects evolve rapidly based on client needs and creative direction, so we must be nimble and prepared with adaptable scenarios that can be implemented quickly. Projectal's new scenario planning features will be a game-changer in ensuring we efficiently manage resources and schedules in our upcoming VFX projects."

How Projectal Solves the Scenario Planning Puzzle
Projectal isn't just another project and resource management tool; it's a strategic planning platform designed specifically for the complexities of creative businesses.
Here's how it empowers you:
Sandboxes: Create private workspaces for scenarios that test different ideas without affecting your live data.
Comprehensive Data: Model scenarios with all your critical data: companies, locations, staff, projects, budgets, schedules, and more.
Real-time Insights: See the impact of your decisions instantly, with dynamic reports and visualizations.
Collaboration Tools: Share scenarios with your team, gather feedback, and make collaborative decisions.
Time Machine: Capture a moment in time, allowing you to compare scenarios and track changes.
Tax Rebates: Model the impact of tax rebates on your projects and budgets.
Grow Your Business: Easily model new departments, locations, or business models.

With extensive experience at leading creative companies such as DNEG, Technicolor, and R/GA, Daniel Jurow, Founder & CEO of Sevoir Group, regularly advises his clients to add scenario planning to their management workflow. "By continuously assessing multiple what-ifs you will be more aware of the lurking risks as well as the smart opportunities that are lying beyond sight of your team's everyday reporting," he explains. "Projectal provides creative businesses with the embedded tools needed to make informed capacity, scheduling, and pricing decisions and stay ahead of the curve. It's a significant improvement over current methods."

Real-World Scenarios in Film, VFX, Animation, Games and Advertising
Managing Client Change Orders: Quickly assess the impact of changes on schedules and budgets, enabling informed negotiations with clients.
Bidding on New Projects: Model different staffing and budget scenarios to create competitive and profitable bids.
Outsourcing Work: Evaluate the cost-effectiveness and scheduling impact of outsourcing specific tasks.
Downsizing: Plan for necessary reductions while minimizing disruption and maintaining project momentum.
Opening a New Office: Compare potential locations, analyze staffing needs, and evaluate financial implications.
Mergers & Acquisitions: Model the combined entity to assess synergies and identify potential challenges.
Performance Reviews: Analyze the impact of salary increases on budgets and project costs.
Entering New Markets: Explore new opportunities by modeling different business models and market strategies.
Financial Planning: Analyze project profitability to optimize financial outcomes and ensure long-term business sustainability.

Paul Schumann, CEO at JanusKS, the developers of Projectal, added: "Scenario planning is crucial for ensuring the smooth and resilient operation of a creative business. Projectal enables management to anticipate challenges, create contingency plans through what-if modeling, and make data-driven decisions, replacing guesswork and spreadsheets. Projectal is the ideal tool for riding the rollercoaster at creative businesses."

About RISE | Visual Effects Studios
RISE | Visual Effects Studios was founded in 2007 by Sven Pannicke, Robert Pinnow, Markus Degen and Florian Gellinger in Berlin. The plan was to focus with a small, hand-picked team on German TV and feature film effects, but that plan failed. Today, over 260 artists call the award-winning company their creative home in Stuttgart, London, Munich, Cologne and Berlin, making it one of the biggest VFX studios in Europe.
RISE has become a partner in crime for directors like Tom Tykwer, Lisa Joy, Mike Flanagan, Matthew Vaughn, Guy Ritchie and Gore Verbinski, and for studios like Marvel (Eternals, Loki, WandaVision, Captain Marvel), Warner Bros. (Fantastic Beasts 3, Reminiscence), Netflix (Midnight Mass, Stranger Things), Sony Pictures (Uncharted) and Studio Canal (Gunpowder Milkshake), and also produces more and more animated features (Richard the Stork, Dragon Rider). Its sister production company RISE PICTURES develops its own original content and co-produces films (Stowaway) and series for an international audience. RISE acts as a collaborator for episodic series as well as television and feature film, from early concept to mastering: concept art and previs, every day on set, during effects production and animation, and as an advisor for the Digital Intermediate process. Our supervisors are reliable, creative partners for directors, production designers and directors of photography alike, from the first to the final step of the way.
https://www.risefx.com

About Sevoir Group
Sevoir Group is a management consultancy built specifically for creative businesses, with deep roots in VFX, animation, advertising, design, and creative technology. The firm's mission is to help studios, agencies and technology innovators achieve operational strength and sustained profitability without compromising what makes them special. Sevoir Group has decades of combined experience leading through industry disruption, technological shifts, and growth challenges, and combines state-of-the-art insights and methods with practical implementations suited to the unique dynamics of visual storytelling businesses.
https://www.sevoirgroup.com

About Projectal
Projectal is the leading project management and workflow platform for creative businesses. Leveraging AI and machine learning, Projectal streamlines departments, locations, staff, and projects, eliminating the need for spreadsheets, off-the-shelf tools, and manual steps. Projectal enhances how studios bid, budget, plan, schedule, track, and report on projects, ensuring the entire team works from the same real-time data. Scenario planning features allow in-depth what-if analysis to help deliver projects on time and within budget. Projectal is used by top creative businesses worldwide, including advertising agencies, animation companies, event and entertainment firms, game developers, sound studios, VFX studios, and virtual production teams. Projectal integrates seamlessly with tools like ShotGrid, ftrack, Kitsu, HR tools, and finance tools. Comprehensive developer APIs and documentation are available to connect and extend Projectal, fitting perfectly into any studio's workflow.
https://projectal.com

Brought to you by Projectal: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
  • Crafting the creatures of The Gorge
    beforesandafters.com
    Behind the visual effects for the mutated Hollow Men in the film.

In Scott Derrickson's The Gorge, the characters Levi (Miles Teller) and Drasa (Anya Taylor-Joy) descend into the mysterious depths of the chasm they had been put in charge of surveilling, only to soon encounter a wealth of unusual creatures and atmospheric landscapes. These Hollow Men, other creatures and a toxic fog were visual effects realized by Framestore for the film, working with production visual effects supervisor Erik Nordby.

Several parts of Framestore contributed, including Framestore's Art Department, its pre-production team (FPS) and VFX studios in different locations around the world. Here, befores & afters finds out from Melbourne-based Framestore visual effects supervisor Joao Sita about some of the specific creature challenges (Framestore's other VFX supervisors on the show were Pete Dionne and Jonathan Fawkner).

The main Alpha of the Hollow Men featured in the movie was one of Framestore's principal challenges. Production filmed a stand-in performer, actor James Marlowe, wearing make-up effects prosthetics. The characters are depicted as mutated humans who have merged with nature, often with tree-like branches and foliage growing out of them. Framestore took that on-set practical make-up and costume as a base and ultimately generated a fully CG character that could then be used to selectively add to the existing performance. The result, observes Sita, was something even scarier and creepier.

"We ended up having to replace most of the performance, really. We kept, most of the time, one eye and one cheek, along with the mouth. On the prosthetics they had a third eye, but the location was on the side of the head. When we were building the CG asset, we realized that we would get a stronger reading of the eye if it was actually lined up with the face and the other two eyes. It allowed us to see the three eyes always looking and gazing at something."

Body tracking was a key component in making this character possible, identifies Sita. "We focused on the features where the blend would happen. We would do very tight tracks for the face, shoulders and hips, making sure that those rotations and proportions were right. We even went right down to getting the body track artists to track certain expressions, since the blend would happen sometimes between areas of the plate that had a lot of performance, like the eyebrows or the side of the nose. The body track artists would do some specific shapes for us to then go into animation and drive the refinements of that."

"We use something here at Framestore which I think is a pretty neat workflow," continues Sita. "From an initial body track version, we'll swap to the asset and the proper asset rigging, and there we do an animation review already. So before animation goes in and starts altering things, we'll just look at what the base motion from the body track would be with the new asset. It helps us to more accurately be able to tell animation to focus on certain things like the hips or the way the hands are moving, for example."

One of the specific challenges Framestore faced was how much human anatomy to retain in the face of the Hollow Man. "It comes down to the specifics of it," says Sita, "in terms of working out how much human anatomy you need to read human performance. For example, we'd ask ourselves, could this branch become a human eyebrow? Would it move like an eyebrow? Is it stiffer? Does the third eye have some sort of impaired function, or does it have the same movement as the other two eyes of the actor?"

Another VFX aspect of the character involved adding in even more branches that stick out of its body. "We wanted it to feel like there was this negative space in the forearm and in the shoulders," advises Sita. "That was another interesting piece of work because we started purely with a rotomation of the character, but as soon as we swapped to the digital version of his body, it had a bulkier, bigger stance, so we had to animate it to serve the performance, but now in a different body shape."

The Alpha's branches, foliage, clothing and the unique quill on its head required a close collaboration between animation and CFX (character effects). "There were all those intertwined branches deforming and sliding against each other," explains Sita. "So, CFX was a key component on the show to get the costume looking right or the quill reading dynamic enough without looking distracting."

"We used CFX for something else, too," adds Sita. "When he is angry, there are these branches on the side of his body that expand and splay. Instead of going through the traditional route of taking that back into rigging and then rigging doing a first-pass animation test, we said, let's get CFX to work with those and find creative ways to portray this idea that his mood would affect the branches on his body."

Then there was the snake. It actually lives inside the Alpha's body. "Initially," remarks Sita, "it was a much bigger character, in so far as it goes through the body and, at some point, it reveals itself. In the final cut we ended up just with a reveal of the snake in a shot, where it mimics what the Alpha is doing and is trying to protect him."

At one point, Drasa faces off against the Alpha. The fight scene involved stunt doubles for both characters. "Interestingly," says Sita, "the prosthetics were different between the main Hollow Man performer and the stunt double. Also, with the stunt double, the prosthetics might sometimes fall off or break. So we had a few continuity challenges that weren't straightforward or linear. We had to look at all this and make calls on where to go full CG. We ended up replacing things with a unified CG version that we made, which was based off of the prosthetics worn by the main actor."

Drasa uses a fiery torch to fend off the Alpha during the fight. For that moment, Taylor-Joy held an LED light prop for interactive lighting, with Framestore adding in CG fire and more interactive elements on the CG character.

Earlier, Drasa and Levi encounter all kinds of wild creatures (more Hollow Men) and plantlife that has been mutated by toxic fog. Hollow Men on horseback were realized on set as stunt performers in gray suits on horses, which Framestore used to start its visual effects process. "We did so much work on the horses," marvels Sita. "Just all this tiny vegetation and vines growing over them. The mane was changed to vines and the skull was made to have a wood-like mutation that was really neat."

In addition to those creatures, which are also glimpsed previously attempting to escape the gorge, Framestore was responsible for a range of environments as Levi and Drasa traverse the lower gorge past biochemical facilities and mutated vegetation. Differently colored and always-moving toxic plumes were a large part of this work. Scenes of the characters escaping vertically in a Jeep up the gorge wall pursued by the creatures were filmed against bluescreen with a vehicle mock-up, and then also handled by Framestore.
  • Watch this Severance VFX breakdown from season one
    beforesandafters.com
    It comes from VFX artist David Piombino, compositing supervisor at MPC on the first season.
  • How the gorge was made in The Gorge
    beforesandafters.com
    Behind the visual effects by DNEG in the film.

The premise of Scott Derrickson's The Gorge involves two characters, Levi (Miles Teller) and Drasa (Anya Taylor-Joy), finding themselves each responsible for the surveillance of a mysterious gorge, but on opposite sides of the chasm. For scenes of the characters in their watchtowers and around the top of the gorge, DNEG was tasked by production visual effects supervisor Erik Nordby with delivering the above-gorge environment.

Production filmed the actors at Warner Bros. Studios, Leavesden. "There was a strip of that pathway where Levi walks on the western edge," outlines DNEG visual effects supervisor Anelia Asparuhova. "That pathway was practical, and the tops of the towers were practical, but everything else was CG."

DNEG relied on LiDAR scans and texture photography from Norway to build the digital gorge asset. "There was also some scanning that Erik Nordby did in Scotland that we used for the detailed level of the rocks," says Asparuhova. "To build the gorge, we reasoned it would hypothetically be located in central Europe, and so we took ideas not only from Norway gorges but also ones in Greece, Turkey and Bulgaria. That's why you see that the forests were mostly coniferous, but also with a few deciduous trees sprinkled here and there to suggest that we're in that north central part of Europe."

"We even hired a geologist in the first few weeks," adds Asparuhova. "We were really, really adamant to make sure that, even though we were driving it artistically, we didn't want to build things that couldn't exist in nature."

DNEG also delivered digital environments for scenes when the characters are in the surrounding forest, including for an action scene involving drones. "We had some scenes, for example," notes DNEG visual effects supervisor Sebastian von Overheidt, "where Drasa is hiding behind a tree. She's running through the forest. There'd be maybe two real trees behind her, but the rest is all full-CG forest. We had to figure out the forest ground, the exact species of the trees, the exact look and color of the leaves and all those things."

In addition to the gorge itself, and the surrounding foliage, DNEG was responsible for generating a thick layer of fog around the gorge's top. "For that," advises Asparuhova, "we did a few tests of what the fog was supposed to look like. The fog is almost like another character in the movie, and we went through a few different levels of density. We wanted to make sure that it wasn't too flat because obviously that is just not something very interesting to look at. But we did look at a lot of canyons with fog in them just to get some ideas of what that could look like. We wanted to make sure that it had enough motion in it, enough movement to keep it alive, but not to distract from the rest of the action."

"We had to build the gorge walls deep enough for it to not end up with any issues and any edges," continues Asparuhova. "They were quite heavy renders because we had the fog, we had the waterfall and other elements."

Close-up views of the fog, and moments that required fog interaction, had their own challenges. For instance, Levi ziplines over to the other side of the gorge. "For those shots of Levi going over the gorge," says von Overheidt, "you get quite unusual steep angles into the fog. You see the transition where the fog meets the rock, the fall-off. We would add extra simulations to make sure we got a nice fall-off and nice detail. Then later for the quadcopters that rise from the fog, we had a different kind of problem to solve, as we are looking flat across the fog and we are right on top of it."

At one point, Drasa jumps down into the gorge (and through the fog) after Levi's zipline snaps. "For that," says von Overheidt, "we had the base fog volume and then added additional cloud simulation inside the fog as she's jumping through."

For the jump itself, Anya Taylor-Joy performed the leap and then landed on a crash mat. DNEG took over Taylor-Joy's jump with a digital version of the character. "We followed the animation of her jumping off with a body track and very close shot sculpting," explains von Overheidt. "We then transition into the full CG moment, which means we're decoupled from the matchmove and we're able to do our own camera move as the dive into the wispy foggy clouds takes place."

Once out of the gorge, Levi and Drasa are pursued through the forest by quadcopter drones, which were crafted by DNEG. "We had a practical reference of a one-to-one size model of the drones that they really threw down the forest," says von Overheidt. "This was a good reference for lighting and how much dirt got kicked off. There was also some pyro going off that we could use as a reference. We then created a full clean plate of that scene, and a full CG forest behind the scene, and then had our drone assets tumble through with full FX simulation."

Drasa and Levi's actions invoke the Stray Dog protocol, which results in a nuclear blast triggered by a series of smaller explosions. DNEG looked at various pieces of footage from nuclear tests and blasts. "We had to tailor-make an explosion to fit into the gorge," describes Asparuhova. "All the explosions we had seen had been happening on flat ground, but here we had to look into the physics of it happening in a gorge. The explosion actually happens below. Normally, you would get the explosion, then you would have the shock wave, and it would obliterate everything. Our challenge was, how do we do this several hundred meters under and still make it look realistic?"

"It also included the destruction of the towers," adds Asparuhova, "which we had to break into pieces and then blow away. There were trees we had to animate, too. We had some really interesting footage from real trees showing how they bend and how they start smoking with the explosion. In the end, it was first a bunch of smoke that comes out, and then the shockwave hits and literally bends and obliterates everything in its way."

The explosion is a massive moment in the film, of course, but DNEG was also responsible for other much more subtle moments. Asparuhova identifies the times when Levi and Drasa are peering at each other through their binoculars. "For those shots, there was this subtlety required in terms of, how do you not obstruct the story? We had to show that these two are falling in love with each other, but we still had to show it as them looking through binoculars so that the audience understands what's happening. Sometimes it's the subtle things that you have to spend a lot of time and a lot of thought on, just so that it's as seamless as possible."

All images courtesy of DNEG. © 2025 Apple Inc.
  • Watch these Kraven the Hunter VFX reels
    beforesandafters.com
    From Image Engine and Rodeo FX.