• Why Ready or Not feels so real – how Unreal Engine 5 is delivering next-level immersion


    Void's art director and lead designer reveal what it takes to improve on PC perfection.
  • Into the Omniverse: World Foundation Models Advance Autonomous Vehicle Simulation and Safety

    Editor’s note: This blog is a part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advances in OpenUSD and NVIDIA Omniverse.
    Simulated driving environments enable engineers to safely and efficiently train, test and validate autonomous vehicles (AVs) across countless real-world and edge-case scenarios without the risks and costs of physical testing.
    These simulated environments can be created through neural reconstruction of real-world data from AV fleets or generated with world foundation models (WFMs) — neural networks that understand physics and real-world properties. WFMs can be used to generate synthetic datasets for enhanced AV simulation.
    To help physical AI developers build such simulated environments, NVIDIA unveiled major advances in WFMs at the GTC Paris and CVPR conferences earlier this month. These new capabilities enhance NVIDIA Cosmos — a platform of generative WFMs, advanced tokenizers, guardrails and accelerated data processing tools.
    Key innovations like Cosmos Predict-2, the Cosmos Transfer-1 NVIDIA preview NIM microservice and Cosmos Reason are improving how AV developers generate synthetic data, build realistic simulated environments and validate safety systems at unprecedented scale.
    Universal Scene Description (OpenUSD), a unified data framework and standard for physical AI applications, enables seamless integration and interoperability of simulation assets across the development pipeline. OpenUSD standardization plays a critical role in ensuring 3D pipelines are built to scale.
    NVIDIA Omniverse, a platform of application programming interfaces, software development kits and services for building OpenUSD-based physical AI applications, enables simulations from WFMs and neural reconstruction at world scale.
    Leading AV organizations — including Foretellix, Mcity, Oxa, Parallel Domain, Plus AI and Uber — are among the first to adopt Cosmos models.

    Foundations for Scalable, Realistic Simulation
    Cosmos Predict-2, NVIDIA’s latest WFM, generates high-quality synthetic data by predicting future world states from multimodal inputs like text, images and video. This capability is critical for creating temporally consistent, realistic scenarios that accelerate training and validation of AVs and robots.

    In addition, Cosmos Transfer, a control model that adds variations in weather, lighting and terrain to existing scenarios, will soon be available to 150,000 developers on CARLA, a leading open-source AV simulator. This greatly expands the broad AV developer community’s access to advanced AI-powered simulation tools.
    Developers can start integrating synthetic data into their own pipelines using the NVIDIA Physical AI Dataset. The latest release includes 40,000 clips generated using Cosmos.
    Building on these foundations, the Omniverse Blueprint for AV simulation provides a standardized, API-driven workflow for constructing rich digital twins, replaying real-world sensor data and generating new ground-truth data for closed-loop testing.
    The blueprint taps into OpenUSD’s layer-stacking and composition arcs, which enable developers to collaborate asynchronously and modify scenes nondestructively. This helps create modular, reusable scenario variants to efficiently generate different weather conditions, traffic patterns and edge cases.
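The layer-stacking workflow can be sketched with a minimal OpenUSD example (the file names and prim paths below are hypothetical, not taken from the blueprint): a variant layer sublayers the base scenario and overrides only the attributes it changes, so the base file itself is never modified.

```usda
#usda 1.0
(
    subLayers = [
        @base_scenario.usda@
    ]
)

# Override only the sky dome; every other prim still composes
# from the base scenario layer, nondestructively.
over "World"
{
    over "SkyDome"
    {
        asset inputs:texture = @overcast_sky.exr@
        float inputs:intensity = 250
    }
}
```

Because opinions in this layer are stronger than those in the sublayer, swapping weather or traffic variants is just a matter of choosing which variant layer sits on top of the same base scenario.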
    Driving the Future of AV Safety
    To bolster the operational safety of AV systems, NVIDIA earlier this year introduced NVIDIA Halos — a comprehensive safety platform that integrates the company’s full automotive hardware and software stack with AI research focused on AV safety.
    The new Cosmos models — Cosmos Predict-2, Cosmos Transfer-1 NIM and Cosmos Reason — deliver further safety enhancements to the Halos platform, enabling developers to create diverse, controllable and realistic scenarios for training and validating AV systems.
    These models, trained on massive multimodal datasets including driving data, amplify the breadth and depth of simulation, allowing for robust scenario coverage — including rare and safety-critical events — while supporting post-training customization for specialized AV tasks.

    At CVPR, NVIDIA was recognized as an Autonomous Grand Challenge winner, highlighting its leadership in advancing end-to-end AV workflows. The challenge used OpenUSD’s robust metadata and interoperability to simulate sensor inputs and vehicle trajectories in semi-reactive environments, achieving state-of-the-art results in safety and compliance.
    Learn more about how developers are leveraging tools like CARLA, Cosmos, and Omniverse to advance AV simulation in this livestream replay:

    Hear NVIDIA Director of Autonomous Vehicle Research Marco Pavone on the NVIDIA AI Podcast share how digital twins and high-fidelity simulation are improving vehicle testing, accelerating development and reducing real-world risks.
    Get Plugged Into the World of OpenUSD
    Learn more about what’s next for AV simulation with OpenUSD by watching the replay of NVIDIA founder and CEO Jensen Huang’s GTC Paris keynote.
    Looking for more live opportunities to learn more about OpenUSD? Don’t miss sessions and labs happening at SIGGRAPH 2025, August 10–14.
    Discover why developers and 3D practitioners are using OpenUSD and learn how to optimize 3D workflows with the self-paced “Learn OpenUSD” curriculum for 3D developers and practitioners, available for free through the NVIDIA Deep Learning Institute.
    Explore the Alliance for OpenUSD forum and the AOUSD website.
    Stay up to date by subscribing to NVIDIA Omniverse news, joining the community and following NVIDIA Omniverse on Instagram, LinkedIn, Medium and X.
  • BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human is that you tend to want to give it human gestures and eyebrows. Erik Kripke said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it as the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”
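The recipe Fleet describes (a fast X-axis vibration layered over a light opacity mix) can be sketched as a simple per-frame expression. This is an illustrative toy in Python, not the production setup; the frequency, amplitude and fade values are made-up assumptions.

```python
import math

def phase_params(frame, fps=24.0, jitter_hz=18.0, amp_px=6.0, fade_frames=12):
    """Per-frame X offset and opacity for a short 'phasing' transition.

    A fast sine drives the horizontal shake while opacity ramps down
    linearly, mimicking high-frequency X-axis vibration combined with
    a light opacity mix. All constants are illustrative, not from the show.
    """
    t = frame / fps
    offset_x = amp_px * math.sin(2.0 * math.pi * jitter_hz * t)
    opacity = max(0.0, 1.0 - frame / fade_frames)  # fully faded by fade_frames
    return offset_x, opacity

# Sample a few frames of the transition
for f in range(0, 13, 4):
    dx, a = phase_params(f)
    print(f"frame {f:2d}: x offset {dx:+6.2f}px, opacity {a:.2f}")
```

Keeping the transition this short is consistent with Fleet's point that a phasing effect lingered over too long starts to read as a cheesy dissolve.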

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie hallucinates as Kimiko Miyashiro goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    The moment when Splinter splits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
    #bouncing #rubber #duckies #flying #sheep
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splintersplits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor laying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, “I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’  And I get exploded with blood. 
I wanted to see what it was like, and it’s intense.” “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” —Stephan Fleet, VFX Supervisor The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a humanyou tend to want to give it human gestures and eyebrows. Erik Kripkesaid, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.” A building is replaced by a massive crowd attending a rally being held by Homelander. In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. 
Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter totter. The Deep wasin one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.” In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” —Stephan Fleet, VFX Supervisor Sheep and chickens embark on a violent rampage courtesy of Compound V with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,’ Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in green suit gently tossing a chicken. 
We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it is in this narrow corridor of fencing. When they run, I always equated it as the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes. Once injected with Compound V, Hugh Campbell Sr.develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. 
We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.” Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination. Homelanderbreaks a mirror which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.” “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. 
Comedy also helps sell visual effects.” —Stephan Fleet, VFX Supervisor Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution. A different spin on the bloodbath occurs during a fight when a drugged Frenchiehallucinates as Kimiko Miyashirogoes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.” Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4. When Splintersplits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. 
“It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.” #bouncing #rubber #duckies #flying #sheep
    WWW.VFXVOICE.COM
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG. Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of the Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. 
I wanted to see what it was like, and it’s intense.” “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” —Stephan Fleet, VFX Supervisor The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be [from Season 3], so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Eric Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that convey the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.” A building is replaced by a massive crowd attending a rally being held by Homelander. In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. 
Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography, I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.” In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” —Stephan Fleet, VFX Supervisor Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. 
We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes. Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. 
We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.” Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination. Homelander (Antony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Antony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Antony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Antony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.” “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. 
I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” —Stephan Fleet, VFX Supervisor Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution. A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.” Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4. The moment when Splinter (Rob Benedict) splits in two was achieved largely in CG. “Eric threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. 
One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
  • Monster Hunter Wilds’ second free title update brings fierce new monsters and more on June 30

    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you.

    Monster Hunter Wilds – Free Title Update 2

    In addition to what’s featured in the trailer, Free Title Update 2 will also include improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda, coming soon, which takes a deeper dive into everything arriving alongside the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last

    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available to hunters at Hunter Rank 31 or above who have cleared the “A World Turned Upside Down” main mission and the “Forest Doshaguma” side mission.

    While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities!

    Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.

    ● The Flying Wyvern, Seregios, strikes

    Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission.

    New hunter and Palico armor forged from Seregios materials awaits you!

    For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

    Hunt in style with layered weapons

    With Free Title Update 2, hunters will be able to use Layered Weapons, which lets you use the look of any weapon, while keeping the stats and abilities of another.

    To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon.

    For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

    New optional features

    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

    A new Support Hunter joins the fray

    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

    Photo Mode Improvements

    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

    Celebrate summer with the Festival of Accord: Flamefete seasonal event


    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

    Arch-Tempered Uth Duna slams down starting July 30

    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat this challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re at least HR 50 before taking on this quest.

    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

    A special collaboration with Fender

    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®.

    In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

    A new round of cosmetic DLC arrives


    Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases. 

    Free Title Update roadmap

    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

    A Monster Hunter Wilds background is added to the PS5 Welcome hub

    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.


    How to change the background: Welcome hub -> Change background -> Games

    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30


    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try.

    Happy Hunting!
    BLOG.PLAYSTATION.COM
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than building its world from photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon, one that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments inhabited by the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell agrees with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real time, giving the director, DP and other department heads a fully realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
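The camera-path recording Bell describes amounts to a structured handoff of per-frame transforms from set to post. As a hypothetical sketch only (the JSON layout, field names and file name are invented for illustration, not Disguise's actual pipeline), exporting such a track might look like:

```python
import json

def export_camera_track(samples, path):
    """Serialize per-frame camera samples recorded on set for the
    post-production team. Each sample holds a frame number, a position
    (x, y, z) and a rotation (pitch, yaw, roll)."""
    track = {"unit": "cm", "fps": 24, "frames": samples}  # assumed metadata
    with open(path, "w") as f:
        json.dump(track, f, indent=2)
    return track

samples = [
    {"frame": 1001, "position": [0.0, 0.0, 170.0], "rotation": [0.0, 90.0, 0.0]},
    {"frame": 1002, "position": [2.5, 0.0, 170.0], "rotation": [0.0, 91.0, 0.0]},
]
track = export_camera_track(samples, "shot_0420_camera.json")
print(len(track["frames"]))  # 2
```

Because the recorded track carries the exact frame range and transforms used for the live composite, a VFX vendor can reproduce the on-set camera move without re-tracking the plate.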

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” Another challenge had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    By TREVOR HOGG. Images courtesy of Warner Bros. Pictures.
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
  • In a world where violence and destruction lurk at every step, I feel the weight of solitude. The words of the director of the International Atomic Energy Agency echo in my heart: “Violence could reach unimaginable levels.” Why does diplomacy fade like a whisper in the wind? The lack of compassion leaves me empty, like a shadow wandering aimlessly. Every day is a reminder that humanity can be so cold, so distant. Hope becomes a distant echo, and sadness clings to my being.

    #Soledad #EsperanzaPerdida #Violencia #Diplomacia #Destrucción
    International Nuclear Watchdog Issues Stark Warning in Wake of US Iran Strikes
    “Violence and destruction could reach unimaginable levels” if diplomacy is not pursued, said the head of the International Atomic Energy Agency.
  • In the quiet moments, when the world feels heavy and my heart is an echo of the past, I find myself drawn into the realm of Endless Legend 2. Just like the characters that roam through its beautifully crafted landscapes, I too wander through my own desolate terrains of disappointment and solitude.

    In an age where connections are just a click away, I feel an overwhelming wave of loneliness wash over me. It's as if the colors of my life have faded into shades of grey, much like the emptiness that lingers in the air. I once believed in the promise of adventure and the thrill of exploration, but now I’m left with the haunting reminder of dreams unfulfilled. The anticipation for Endless Legend 2, scheduled for early access on August 7, is bittersweet. It stirs a deep longing within me for the days when joy was effortlessly abundant.

    Jean-Maxime Moris, the creative director of Amplitude Studios, speaks of worlds to conquer, of stories to tell. Yet, each word feels like a distant whisper, a reminder of the tales I used to weave in my mind. I once imagined myself as a brave hero, surrounded by friends who would join me in battle. Now, I sit alone, the flickering light of my screen the only companion in this vast expanse of isolation.

    Every character in the game resonates with pieces of my own soul, reflecting my fears and hopes. The intricate design of Endless Legend 2 mirrors the complexity of my emotions; beautiful yet deeply fraught with the struggle of existence. I yearn for the laughter of companions and the warmth of camaraderie, yet here I am, cloaked in shadows, fighting battles that are often invisible to the outside world.

    As I read about the game, I can almost hear the distant armies clashing, feel the pulse of a story waiting to unfold. But reality is stark; the realms I traverse are not just virtual landscapes but the silent corridors of my mind, echoing with the sounds of my own solitude. I wish I could escape into that world, to feel the thrill of adventure once more, to connect with others who understand the weight of these unspoken burdens.

    But for now, all I have are the remnants of hope, the flickering flames of what could be. And as the countdown to Endless Legend 2 continues, I can’t help but wonder if the game will offer me a reprieve from this loneliness or merely serve as a reminder of the connections I yearn for.

    #EndlessLegend2 #Loneliness #Heartbreak #GamingCommunity #Solitude
    Endless Legend 2: Our interview with Jean-Maxime Moris, creative director of Amplitude Studios' 4X
    ActuGaming.net – Announced at the start of the year, Endless Legend 2 will launch in early access on August 7 […]
  • It's time to call out the glaring flaws in the so-called "Latest Showreel" by the Compagnie Générale des Effets Visuels (CGEV). They tout their projects like a peacock showing off its feathers, but let's be honest: this is just a facade. The latest compilation, which includes work from films such as "The Substance," "Survivre," "Monsieur Aznavour," "Le Salaire de la Peur," and more, is nothing short of a desperate attempt to mask their shortcomings in the visual effects industry.

    First off, what are they thinking with the title "Mise à jour de showreel"? This isn't an update; it's a cry for help! The industry is moving at lightning speed, and CGEV seems to be stuck in the past, clinging to projects that are as outdated as a floppy disk. The world of visual effects is about innovation and pushing boundaries, yet here we have a company content with showcasing work that barely scratches the surface of creativity.

    And let’s talk about "Le Salaire de la Peur." If this is their crown jewel, then they are in serious trouble. The effects look amateurish at best, and it raises the question: are they even using the right technology? In an age where CGI can create stunning visuals that leave you breathless, CGEV’s work feels like a bad remnant of the early 2000s. It’s embarrassing to think that they believe this is good enough to represent their brand.

    Alain Carsoux, the director, needs to take a long, hard look in the mirror. Is he satisfied with this mediocrity? Because the rest of us definitely aren’t. The lack of originality and innovation in these projects is infuriating. Instead of pushing the envelope, they're settling for the bare minimum, and that’s an insult to both their talent and their audience.

    The sad reality is that CGEV is not alone in this trend. The entire industry seems to be plagued by a lack of ambition. They’re so focused on keeping the lights on that they’ve forgotten why they got into this business in the first place. It’s about passion, creativity, and daring to take risks. "Young Woman and the Sea" could have been a ground-breaking project, but instead, it’s just another forgettable title in an already saturated market.

    We need to demand more from these companies. We deserve visual effects that inspire, challenge, and captivate. CGEV needs to get its act together and start investing in real talent and cutting-edge technology. No more excuses! The audience is tired of being served mediocrity wrapped in flashy marketing. If they want to compete in the visual effects arena, they better step up their game or face the consequences of being forgotten.

    Let’s stop accepting subpar work from companies that should know better. The time for complacency is over. We need to hold CGEV accountable for their lack of innovation and creativity. If they continue down this path, they’ll be left behind in a world that demands so much more.

    #CGEV #VisualEffects #FilmIndustry #TheSubstance #Innovation
    Showreel update for CGEV: from The Substance to Le Salaire de la Peur
    La Compagnie Générale des Effets Visuels presents a compilation of its latest projects, featuring its visual effects work on The Substance, as well as Survivre, Monsieur Aznavour, Le Salaire de la Peur and Young Woman and the Sea.
  • When you think about horror films, what comes to mind? Creepy monsters? Jump scares? The classic trope of a group of friends who somehow forget that splitting up is a bad idea? Well, hold onto your popcorn, because the talented folks at ESMA are here to remind us that the only thing scarier than a killer lurking in the shadows is the idea of them trying to be funny while doing it.

    Enter "Claw," a short film that dares to blend the horror genre with a sprinkle of humor – because who wouldn't want to laugh while being chased by a guy with a chainsaw? This cinematic masterpiece, which apparently took inspiration from the likes of "Last Action Hero," is like if a horror movie and a stand-up comedian had a baby, and we’re all just waiting for the punchline as we hide behind our couches.

    Imagine a young cinephile named Andrew, who is living his best life by binge-watching horror classics. However, instead of the usual blood and guts, he encounters a version of horror that leaves you both terrified and chuckling nervously. It’s like the directors at ESMA sat down and said, “Why not take everything that terrifies us and add a dash of quirky humor?” Honestly, it’s a wonder they didn’t throw in a musical number.

    Sure, we all adore the suspense that makes our hearts race, but the thought of Andrew laughing nervously at a killer with a penchant for puns? Now that’s a new level of fear. Who knew that horror could provide comic relief while simultaneously making us question our life choices? Forget battling your demons; let’s just joke about them instead! And if you think about it, that’s probably the best coping mechanism we’ve got.

    But beware! As you dive into this horror-comedy concoction, you might just find yourself chuckling at the most inappropriate moments. Like when the killer slips on a banana peel right before going for the kill – because nothing says “I’m terrified” like a comedy skit in a death scene. After all, isn’t that the essence of horror? To laugh in the face of danger, even if it’s through the lens of ESMA’s latest cinematic exploration?

    So, if you’re looking for a good time that sends shivers down your spine while keeping you in stitches, “Claw” is your go-to film. Just remember to keep a straight face when explaining to your friends why you’re laughing while watching someone get chased by a masked figure. But hey, in the world of horror, even the scariest movies can have a light-hearted twist – because why not?

    Embrace the terror, welcome the humor, and prepare yourself for a rollercoaster of emotions with "Claw." After all, if we can’t laugh at our fears, what’s the point?

    #ClawFilm #HorrorComedy #ESMA #CinematicHumor #HorrorMovies
    ESMA subverts the clichés of horror films: tremble!
    Discover Claw, a graduation short film from ESMA that draws on the codes of horror movies to offer a revisited take on the genre. Starting from a concept reminiscent of Last Action Hero, the team concocted a […]
  • Minecraft, le film! Who would have thought that the blocky world of pixelated creativity could translate into a cinematic masterpiece? Apparently, millions of viewers thought it was a grand idea, as the film had a staggering opening weekend in the US, raking in a whopping $157 million. Yes, you read that right - more than the Super Mario Bros movie. Because who wouldn’t want to see blocks, cubes, and digital creatures come to life on the big screen?

    Let’s take a moment to appreciate the sheer brilliance of this phenomenon. Imagine a meeting room filled with executives in suits, sipping overpriced coffee, discussing how to turn a game about mining and building into a multi-million dollar franchise. “What if we add a plot?” one visionary must have suggested. “And maybe some actual characters!” shouted another. Brilliant! Because nothing screams box office hit like a narrative about crafting and survival – the quintessential human experience, am I right?

    And while we’re at it, let’s not overlook the glorious irony of a massive online leak. One might think that a film like Minecraft, which is all about building and creating, would have safeguards against such breaches. Yet here we are, in a world where fans are more adept at finding leaks than creepers are at sneaking up on unsuspecting players. It’s as if the universe itself is saying, “Why wait for the official release when you can embrace the chaos of the internet?”

    Moreover, the film’s success raises an important question: is this the pinnacle of creativity, or just a sign that Hollywood has officially run out of ideas? After all, why bother developing original content when you can simply mine from the vast experiences of gamers? There’s a certain elegance to recycling beloved franchises; the nostalgia factor alone is worth millions. Let’s just hope that the next film adaptation is as riveting as watching a character gather resources for five hours straight.

    And speaking of adaptations, let’s give a nod to the directors and writers who managed to transform a game with virtually no plot into a cinematic sensation. If these individuals can take pixelated blocks and turn them into a story that captures the hearts of millions, perhaps we should hand them the keys to the next great literary classic. Who wouldn't want to see a film based on the riveting tale of a potato?

    In conclusion, Minecraft, le film is a remarkable testament to the state of modern cinema. It embodies the essence of our times: a blend of nostalgia, creativity, and a hint of desperation. So, grab your popcorn and enjoy the show, folks! Who knows what other game adaptations await us? Maybe Tetris will be next!

    #MinecraftMovie #HollywoodAdaptations #BlockbusterSuccess #CinemaIrony #NostalgiaInFilm
    Minecraft, the Movie: massive success and online leak
    It's a smash hit! Minecraft, the film adaptation of the famous video game, arrived in US theaters this weekend, delivering the best opening of the year with estimated receipts of $157 million in the US.