• A New Last Airbender Bestiary Art Book Launching September 23

    Earlier this year, Nickelodeon announced Avatar is coming back with a new animated series called Avatar: Seven Havens, and there's a new Avatar: The Last Airbender live-action movie on the way, too, making now a good time to brush up on the lore and rich worldbuilding the franchise is known for. One way to do that is with the upcoming Beasts of the Four Nations, a 128-page hardcover bestiary offering in-universe lore and behind-the-scenes details on the wildlife and mythical creatures of both animated series. You can preorder the standard edition for $37.19 (down from $40) or secure a copy of the $80 Deluxe Edition, which includes exclusive cover art and a lithograph print. Preorders for both editions are available at Amazon, and both ship September 23.

    Written by John O'Bryan, Beasts of the Four Nations includes illustrations and information on the many fantastical beasts of The Last Airbender's world. Everything from the Air Nomads’ flying bison to Kyoshi Island’s elephant koi and the Earth Kingdom’s singing groundhogs is detailed in the book, along with commentary by Avatar and Legend of Korra creators Bryan Konietzko and Michael Dante DiMartino. The standard edition launches September 23 and is available to preorder for $37.19 (down from $40) at Amazon. A Deluxe Edition with a few extras, detailed below, will launch the same day.

    The Beasts of the Four Nations Deluxe Edition contains all the contents of the standard edition but features a few notable upgrades, like foil highlights on the cover art and a protective slipcase. The book also comes with an exclusive lithograph print depicting Cai, the cabbage merchant who appears throughout the Avatar series, and his cart pulled by two ostrich horses. You can preorder the Deluxe Edition for $80 at Amazon.

    If you want to explore more of the Avatar franchise’s visual history, you're in luck: several more official Avatar: The Last Airbender and The Legend of Korra art books are available, and some are even discounted. There's a giant Avatar: The Last Airbender - The Art of the Animated Series art book that covers all four seasons of the show, packed with concept art, design and production materials ranging from the very first sketch through to the series finale. The Legend of Korra has a multi-volume art book series as well. Each volume focuses on a specific season of the show and features creator commentaries and exclusive artwork. Standard and Deluxe Editions are available for each volume; the Deluxe Editions include slipcases, lithographs, new covers, and bonus sketches by the show's creators.

    Continue reading at GameSpot.
  • Retail Reboot: Major Global Brands Transform End-to-End Operations With NVIDIA

    AI is packing and shipping efficiency for the retail and consumer packaged goods (CPG) industries, with a majority of surveyed companies in the space reporting the technology is increasing revenue and reducing operational costs.
    Global brands are reimagining every facet of their businesses with AI, from how products are designed and manufactured to how they’re marketed, shipped and experienced in-store and online.
    At NVIDIA GTC Paris at VivaTech, industry leaders including L’Oréal, LVMH and Nestlé shared how they’re using tools like AI agents and physical AI — powered by NVIDIA AI and simulation technologies — across every step of the product lifecycle to enhance operations and experiences for partners, customers and employees.
    3D Digital Twins and AI Transform Marketing, Advertising and Product Design
    The meeting of generative AI and 3D product digital twins results in unlimited creative potential.
    Nestlé, the world’s largest food and beverage company, today announced a collaboration with NVIDIA and Accenture to launch a new, AI-powered in-house service that will create high-quality product content at scale for e-commerce and digital media channels.
    The new content service, based on digital twins powered by the NVIDIA Omniverse platform, creates exact 3D virtual replicas of physical products. Product packaging can be adjusted or localized digitally, enabling seamless integration into various environments, such as seasonal campaigns or channel-specific formats. This means that new creative content can be generated without having to constantly reshoot from scratch.
    Image courtesy of Nestlé
    The service is developed in partnership with Accenture Song, using Accenture AI Refinery built on NVIDIA Omniverse for advanced digital twin creation. It uses NVIDIA AI Enterprise for generative AI, hosted on Microsoft Azure for robust cloud infrastructure.
    Nestlé already has a baseline of 4,000 3D digital products — mainly for global brands — with the ambition to convert a total of 10,000 products into digital twins in the next two years across global and local brands.
    LVMH, the world’s leading luxury goods company, home to 75 distinguished maisons, is bringing 3D digital twins to its content production processes through its wine and spirits division, Moët Hennessy.
    The group partnered with content configuration engine Grip to develop a solution using the NVIDIA Omniverse platform, which enables the creation of 3D digital twins that power content variation production. With Grip’s solution, Moët Hennessy teams can quickly generate digital marketing assets and experiences to promote luxury products at scale.
    The initiative, led by Capucine Lafarge and Chloé Fournier, has been recognized by LVMH as a leading approach to scaling content creation.
    Image courtesy of Grip
    L’Oréal Gives Marketing and Online Shopping an AI Makeover
    Innovation starts at the drawing board. Today, that board is digital — and it’s powered by AI.
    L’Oréal Groupe, the world’s leading beauty player, announced its collaboration with NVIDIA today. Through this collaboration, L’Oréal and its partner ecosystem will leverage the NVIDIA AI Enterprise platform to transform its consumer beauty experiences, marketing and advertising content pipelines.
    “AI doesn’t think with the same constraints as a human being. That opens new avenues for creativity,” said Anne Machet, global head of content and entertainment at L’Oréal. “Generative AI enables our teams and partner agencies to explore creative possibilities.”
    CreAItech, L’Oréal’s generative AI content platform, is augmenting the creativity of marketing and content teams. Combining a modular ecosystem of models, expertise, technologies and partners — including NVIDIA — CreAItech empowers marketers to generate thousands of unique, on-brand images, videos and lines of text for diverse platforms and global audiences.
    The solution empowers L’Oréal’s marketing teams to quickly iterate on campaigns that improve consumer engagement across social media, e-commerce content and influencer marketing — driving higher conversion rates.

    Noli.com, the first AI-powered multi-brand marketplace startup founded and backed by the L’Oréal Groupe, is reinventing how people discover and shop for beauty products.
    Noli’s AI Beauty Matchmaker experience uses L’Oréal Groupe’s century-long expertise in beauty, including its extensive knowledge of beauty science, beauty tech and consumer insights, built from over 1 million skin data points and analysis of thousands of product formulations. It gives users a BeautyDNA profile with expert-level guidance and personalized product recommendations for skincare and haircare.
    “Beauty shoppers are often overwhelmed by choice and struggling to find the products that are right for them,” said Amos Susskind, founder and CEO of Noli. “By applying the latest AI models accelerated by NVIDIA and Accenture to the unparalleled knowledge base and expertise of the L’Oréal Groupe, we can provide hyper-personalized, explainable recommendations to our users.” 

    The Accenture AI Refinery, powered by NVIDIA AI Enterprise, will provide the platform for Noli to experiment and scale. Noli’s new agent models will use NVIDIA NIM and NVIDIA NeMo microservices, including NeMo Retriever, running on Microsoft Azure.
    Rapid Innovation With the NVIDIA Partner Ecosystem
    NVIDIA’s ecosystem of solution provider partners empowers retail and CPG companies to innovate faster, personalize customer experiences, and optimize operations with NVIDIA accelerated computing and AI.
    Global digital agency Monks is reshaping the landscape of AI-driven marketing, creative production and enterprise transformation. At the heart of their innovation lies the Monks.Flow platform that enhances both the speed and sophistication of creative workflows through NVIDIA Omniverse, NVIDIA NIM microservices and Triton Inference Server for lightning-fast inference.
    AI image solutions provider Bria is helping retail giants like Lidl and L’Oréal enhance marketing asset creation. Bria AI transforms static product images into compelling, dynamic advertisements that can be quickly scaled for use across any marketing need.
    The company’s generative AI platform uses NVIDIA Triton Inference Server software and the NVIDIA TensorRT software development kit for accelerated inference, as well as NVIDIA NIM and NeMo microservices for quick image generation at scale.
    Physical AI Brings Acceleration to Supply Chain and Logistics
    AI’s impact extends far beyond the digital world. Physical AI-powered warehousing robots, for example, are helping maximize efficiency in retail supply chain operations. Four in five retail companies have reported that AI has helped reduce supply chain operational costs, with 25% reporting cost reductions of at least 10%.
    Technology providers Lyric, KoiReader Technologies and Exotec are tackling the challenges of integrating AI into complex warehouse environments.
    Lyric is using the NVIDIA cuOpt GPU-accelerated solver for warehouse network planning and route optimization, and is collaborating with NVIDIA to apply the technology to broader supply chain decision-making problems. KoiReader Technologies is tapping the NVIDIA Metropolis stack for its computer vision solutions within logistics, supply chain and manufacturing environments using the KoiVision Platform. And Exotec is using NVIDIA CUDA libraries and the NVIDIA JetPack software development kit for embedded robotic systems in warehouse and distribution centers.
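    As a rough illustration of the kind of problem a GPU route-optimization solver like cuOpt tackles, here is a toy single-vehicle routing baseline in plain Python. This is a sketch of the problem shape only, not cuOpt's API; the function name and the 4-stop distance matrix are invented for the example.

        import numpy as np

        def nearest_neighbor_route(dist, start=0):
            """Greedy nearest-neighbor tour over a distance matrix.

            A deliberately naive stand-in for the fleet-scale vehicle
            routing a solver such as NVIDIA cuOpt optimizes on the GPU.
            """
            n = len(dist)
            unvisited = set(range(n)) - {start}
            route, current = [start], start
            while unvisited:
                # Always hop to the closest unvisited stop.
                nearest = min(unvisited, key=lambda stop: dist[current][stop])
                route.append(nearest)
                unvisited.remove(nearest)
                current = nearest
            return route

        # Hypothetical 4-stop delivery run starting from the warehouse (stop 0).
        dist = np.array([
            [0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 8],
            [10, 4, 8, 0],
        ])
        print(nearest_neighbor_route(dist))  # -> [0, 1, 3, 2]

    A production solver replaces the greedy hop with global optimization over whole fleets, time windows and vehicle capacities, which is where GPU acceleration earns its keep.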
    From real-time robotics orchestration to predictive maintenance, these solutions are delivering impact on uptime, throughput and cost savings for supply chain operations.
    Learn more by joining a follow-up discussion on digital twins and AI-powered creativity with Microsoft, Nestlé, Accenture and NVIDIA at Cannes Lions on Monday, June 16.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
  • Lost Records developer Don't Nod is making layoffs. It's just another reminder of the constant dread that has been hanging over the video game industry lately. Not much else to say, really. Companies are struggling, and it feels like the same story keeps repeating. Layoffs happen, and everyone just moves on.

    #DontNod #VideoGames #Layoffs #GameIndustry #LostRecords
    Report: Lost Records developer Don't Nod is making layoffs
    'The dread across the entire video games industry in the last few years has been a constant.'
  • Step Inside the Vault: The ‘Borderlands’ Series Arrives on GeForce NOW

    GeForce NOW is throwing open the vault doors to welcome the legendary Borderlands series to the cloud.
    Whether a seasoned Vault Hunter or new to the mayhem of Pandora, prepare to experience the high-octane action and humor that define the series that includes Borderlands Game of the Year Enhanced, Borderlands 2, Borderlands 3 and Borderlands: The Pre-Sequel.
    Members can explore it all before the highly anticipated Borderlands 4 arrives in the cloud at launch.
    In addition, leap into the flames and save the day in the pulse-pounding FBC: Firebreak from Remedy Entertainment on GeForce NOW.
    It’s all part of the 13 new games in the cloud this week, including the latest Genshin Impact update and advanced access for REMATCH.
    Plus, GeForce NOW’s Summer Sale is still in full swing. For a limited time, get 40% off a six-month GeForce NOW Performance membership — perfect for diving into role-playing game favorites like the Borderlands series or any of the 2,200 titles in the platform’s cloud gaming library.
    Vault Hunters Assemble
    Gear up for a world where loot is king and chaos is always just a trigger pull away. The Borderlands series is known for its wild humor, outrageous characters and nonstop action — and now, its chaotic adventures can be streamed on GeForce NOW.
    Welcome to Pandora.
    Members revisiting the classics or jumping in for the first time can start with Borderlands Game of the Year Enhanced, the original mayhem-fueled classic now polished and packed with downloadable content. The title brings Pandora to life with a fresh coat of paint, crazy loot and the same iconic humor that started it all.
    New worlds, same chaos.
    In Borderlands 2, Handsome Jack steals the show with his mix of charm and villainy. This sequel cranks up the fun and insanity with unforgettable characters and a zany storyline. For more laughs and even wilder chaos, Borderlands 3 delivers the biggest loot explosion yet, with new worlds to explore. Face off against the Calypso twins and enjoy nonstop action.
    The rise of Handsome Jack.
    The adventure blasts off with Borderlands: The Pre-Sequel, revealing how Handsome Jack became so handsome. The game throws in zero gravity, moon boots and enough sarcasm to fuel a spaceship.
    Jump in with GeForce NOW and get ready to laugh, loot and blast through Pandora, all from the cloud. With instant access and seamless streaming at up to 4K resolution with an Ultimate membership, enter the chaos of Borderlands anytime, anywhere. No downloads, no waiting.
    Suit Up, Clean Up
    The Oldest House needs you.
    Step into the shoes of the Federal Bureau of Control’s elite first responders in the highly anticipated three-player co-op first-person shooter FBC: Firebreak. Taking place six years after Control, the game is set in the Oldest House — under siege by reality-warping threats. It’s up to players to restore order before chaos wins.
    Equip unique Crisis Kits packed with weapons, specialized tools and paranatural augments, like a garden gnome that summons a thunderstorm or a piggy bank that spews coins. As each mission, or “Job,” drops players into unpredictable environments with shifting objectives, bizarre crises and wacky enemies, teamwork and quick thinking are key.
    Jump into the fray with friends and stream it on GeForce NOW instantly across devices. Experience the mind-bending action and stunning visuals powered by cloud streaming. Contain the chaos, save the Oldest House and enjoy a new kind of co-op adventure, all from the cloud.
    No Rules Included
    Score big laughs in the cloud.
    REMATCH gives soccer a bold twist, transforming the classic sport into a fast-paced, third-person action experience where every player controls a single athlete on the field.
    With no fouls, offsides or breaks, matches are nonstop and skills-based, demanding quick reflexes and seamless teamwork. Dynamic role-switching lets players jump between attack, defense and goalkeeping, while seasonal updates and various multiplayer modes keep the competition fresh and the action intense.
    Where arcade flair meets tactical depth, REMATCH is football, unleashed. Get instant access to the soccer pitch by streaming the title on GeForce NOW and jump into the action wherever the match calls.
    Time To Game
    Skirk has arrived.
    Genshin Impact’s next major update launches this week, and members can stream the latest adventures from Teyvat at GeForce quality on any device. Version 5.7 includes the new playable characters Skirk and Dahlia — as well as fresh story quests and the launch of a Stygian Onslaught combat mode.
    Look for the following games available to stream in the cloud this week:

    - REMATCH (New release on Steam, Xbox, available on PC Game Pass, June 16)
    - Broken Arrow (New release on Steam, June 19)
    - Crime Simulator (New release on Steam, June 17)
    - Date Everything! (New release on Steam, June 17)
    - FBC: Firebreak (New release on Steam, Xbox, available on PC Game Pass, June 17)
    - Lost in Random: The Eternal Die (New release on Steam, Xbox, available on PC Game Pass, June 17)
    - Architect Life: A House Design Simulator (New release on Steam, June 19)
    - Borderlands Game of the Year Enhanced (Steam)
    - Borderlands 2 (Steam, Epic Games Store)
    - Borderlands 3 (Steam, Epic Games Store)
    - Borderlands: The Pre-Sequel (Steam, Epic Games Store)
    - METAL EDEN Demo (Steam)
    - Torque Drift 2 (Epic Games Store)

    What are you planning to play this weekend? Let us know on X or in the comments below.

    What's a gaming achievement you'll never forget?
    — NVIDIA GeForce NOW (@NVIDIAGFN) June 18, 2025
  • BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of the Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human is that you tend to want to give it human gestures and eyebrows. Erik Kripke said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
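
    The handheld imperfection Fleet describes can be roughed in procedurally. Below is a minimal, hypothetical Python sketch of noise-driven camera shake offsets; the amplitudes, frequencies and band count are illustrative assumptions, not values from the production.

```python
import numpy as np

def camera_shake(num_frames, fps=24.0, amplitude_px=6.0, base_hz=2.5, seed=7):
    """Sum a few sine bands with random phases per axis: a cheap
    stand-in for handheld jitter that can be baked onto a locked-off
    CG camera before adding motion blur."""
    rng = np.random.default_rng(seed)
    t = np.arange(num_frames) / fps
    offsets = np.zeros((num_frames, 2))  # (x, y) pixel offsets per frame
    for axis in range(2):
        for band in range(1, 4):  # a few octaves of wobble
            phase = rng.uniform(0, 2 * np.pi)
            offsets[:, axis] += (amplitude_px / band) * np.sin(
                2 * np.pi * base_hz * band * t + phase)
    return offsets

print(camera_shake(48)[:4].round(2))  # first four frames of offsets
```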

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”
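
    For illustration only, here is a toy Python sketch of the two ingredients Fleet names for the phasing look, a high-frequency X-axis offset plus a light opacity mix, applied to an image array. The ghost count, shift range and opacity values are invented for the example.

```python
import numpy as np

def phase_effect(frame, max_shift_px=8, ghosts=5, opacity=0.35, seed=1):
    """Blend several randomly X-shifted copies of the plate over the
    original: horizontal vibration plus a light opacity mix, loosely
    in the spirit of the phasing look described above."""
    rng = np.random.default_rng(seed)
    out = frame.astype(np.float32).copy()
    for _ in range(ghosts):
        shift = int(rng.integers(-max_shift_px, max_shift_px + 1))
        ghost = np.roll(frame.astype(np.float32), shift, axis=1)  # X axis
        out = (1.0 - opacity) * out + opacity * ghost
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage: treated = phase_effect(plate) where plate is an HxWx3 uint8 array.
```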

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander (Anthony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    Splinter (Rob Benedict) splitting in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving with each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
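
    As a rough picture of that ingest-and-cleanup step, the sketch below uses Blender's Python API to merge stray vertices and recalculate normals on imported meshes before export to Unreal. It is a generic cleanup pass with assumed settings, not Disguise's actual pipeline code, and it must be run inside Blender.

```python
import bpy

def clean_imported_meshes(merge_dist=0.001):
    """Rough cleanup for ingested art department models: merge
    coincident vertices and make normals consistent on every mesh
    in the scene. The threshold is an illustrative default."""
    for obj in bpy.context.scene.objects:
        if obj.type != 'MESH':
            continue
        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.mode_set(mode='EDIT')
        bpy.ops.mesh.select_all(action='SELECT')
        bpy.ops.mesh.remove_doubles(threshold=merge_dist)  # merge by distance
        bpy.ops.mesh.normals_make_consistent(inside=False)
        bpy.ops.object.mode_set(mode='OBJECT')

clean_imported_meshes()
```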

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”
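
    To make that blockout idea concrete, here is a small, hypothetical Python sketch that generates a voxel-style canyon corridor as integer cube positions, the kind of coarse stand-in geometry that could be instanced in a DCC for early blocking. The dimensions and carving logic are invented.

```python
import numpy as np

def canyon_blockout(width=32, length=96, depth=8, seed=3):
    """Build a coarse voxel canyon: a noisy terrain grid with a flat
    channel carved down the middle as the flight path. Returns
    (x, y, z) cube positions."""
    rng = np.random.default_rng(seed)
    heights = depth + rng.integers(-2, 3, size=(width, length))
    channel = np.abs(np.arange(width) - width // 2) < width // 8
    heights[channel, :] = 1  # carve the corridor
    return [(x, y, z)
            for x in range(width)
            for y in range(length)
            for z in range(int(heights[x, y]))]

print(len(canyon_blockout()), "cubes in the blockout")
```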

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
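
    The naming-convention point lends itself to simple tooling. Below is a minimal, speculative Python checker for a made-up asset naming pattern; the article does not document the production's actual scheme, so the convention shown is purely illustrative.

```python
import re

# Hypothetical convention: <Level>_<Category>_<AssetName>_v<NN>,
# e.g. "MidportVillage_Prop_LavaChickenSign_v03".
PATTERN = re.compile(r"^[A-Za-z0-9]+_(Env|Prop|Light|FX)_[A-Za-z0-9]+_v\d{2}$")

def check_names(asset_names):
    """Return the names that break the convention so a level can be
    tidied before a live review session."""
    return [name for name in asset_names if not PATTERN.match(name)]

print(check_names([
    "MidportVillage_Prop_LavaChickenSign_v03",
    "midport village sign FINAL(2)",  # flagged
]))
```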

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
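
    As a simplified view of that data handoff, the sketch below writes per-frame camera transforms to JSON. It is a stand-in for illustration; a real Simulcam pipeline would more likely hand over FBX or USD through its own tooling.

```python
import json

def export_camera_track(samples, path="simulcam_track.json"):
    """Dump per-frame camera samples, (frame, position, rotation)
    tuples, to JSON so downstream vendors can rebuild the move.
    The format here is invented for the example."""
    payload = [{"frame": f, "pos": list(pos), "rot": list(rot)}
               for f, pos, rot in samples]
    with open(path, "w") as fh:
        json.dump(payload, fh, indent=2)

# Usage with fabricated sample data:
export_camera_track([(1001, (0.0, 1.6, -3.0), (0.0, 12.5, 0.0)),
                     (1002, (0.02, 1.6, -2.9), (0.0, 12.7, 0.0))])
```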

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuck and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
By TREVOR HOGG

Images courtesy of Warner Bros. Pictures.

Rather than a world constructed around photorealistic pixels, Markus Persson’s video game took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon, one that is finally adapted into a feature with the release of A Minecraft Movie. Disguise, working under Production VFX Supervisor Dan Lemmon, was brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting.

Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”

The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
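The ingest-and-cleanup pass Finlayson describes can be pictured with a few lines of Blender Python. This is a minimal sketch, not Disguise’s actual pipeline: the file paths, merge threshold and decimate ratio are all hypothetical values chosen for illustration.

    import bpy

    # Minimal sketch of the ingest-and-cleanup step; paths and values are hypothetical.
    SRC = "/ingest/art_dept/midport_village.fbx"
    DST = "/ingest/unreal/midport_village_clean.fbx"

    bpy.ops.import_scene.fbx(filepath=SRC)

    # Newly imported objects arrive selected.
    for obj in list(bpy.context.selected_objects):
        if obj.type != 'MESH':
            continue
        bpy.context.view_layer.objects.active = obj
        # Merge stray duplicate vertices and fix flipped normals from CAD-style models.
        bpy.ops.object.mode_set(mode='EDIT')
        bpy.ops.mesh.select_all(action='SELECT')
        bpy.ops.mesh.remove_doubles(threshold=0.0001)
        bpy.ops.mesh.normals_make_consistent(inside=False)
        bpy.ops.object.mode_set(mode='OBJECT')
        # Thin out dense geometry so editor reviews stay responsive.
        dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
        dec.ratio = 0.5
        bpy.ops.object.modifier_apply(modifier=dec.name)

    # Hand the cleaned geometry over to the Unreal Engine build.
    bpy.ops.export_scene.fbx(filepath=DST, use_selection=False)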
A virtual exploration of Steve’s shop in Midport Village.

Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack. Concept art served as a visual touchstone. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.”

At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.”
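The modular, convention-driven organization Finlayson describes is easy to picture with a short Unreal editor script. The sketch below is hypothetical, assuming Unreal Engine 5’s Python editor scripting; the label prefixes and folder names are invented for illustration, not Disguise’s actual conventions.

    import unreal

    # Hypothetical naming convention: the prefix on an actor's label decides
    # its World Outliner folder. Prefixes and folders are invented examples.
    PREFIX_TO_FOLDER = {
        "SM_": "Environment/StaticMeshes",
        "BP_": "Environment/Blueprints",
        "L_": "Lighting",
    }

    actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
    for actor in actor_subsystem.get_all_level_actors():
        label = actor.get_actor_label()
        folder = next((f for p, f in PREFIX_TO_FOLDER.items() if label.startswith(p)), None)
        if folder is None:
            # Flag strays so the scene stays tidy enough for live review sessions.
            unreal.log_warning(f"Naming convention violation: {label}")
        else:
            actor.set_folder_path(unreal.Name(folder))

Keeping levels sorted this way is what makes the “swap assets or adjust lighting on the fly” workflow practical when feedback arrives mid-review.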
Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
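The re-mesh-heavy Blender pass Bell mentions maps naturally onto Blender’s Remesh modifier, whose Blocks mode rebuilds a surface out of axis-aligned cubes. A minimal sketch, with an illustrative octree depth rather than a production value:

    import bpy

    # Rebuild the active mesh out of axis-aligned blocks, Minecraft-style.
    obj = bpy.context.active_object
    remesh = obj.modifiers.new(name="Remesh", type='REMESH')
    remesh.mode = 'BLOCKS'      # cube-based surface reconstruction
    remesh.octree_depth = 6     # higher depth = smaller blocks; value is illustrative
    bpy.ops.object.modifier_apply(modifier=remesh.name)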
A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

Virtually conceptualizing the layout of Midport Village.

Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell agrees: “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

An example of the virtual and final version of the Woodland Mansion.

Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

Doing a virtual scale study of the Mountainside.

Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
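Bell’s point about Simulcam recording camera movement paths for post-production can be illustrated with a toy recorder. This is not Disguise’s system; the CSV schema, the function and the sample data are all invented to show the general shape of that handoff.

    import csv

    # Toy illustration of per-frame camera data handed to post; schema is invented.
    def record_camera_track(frames, out_path="camera_track.csv"):
        """frames: iterable of (frame, (px, py, pz), (rx, ry, rz), focal_mm)."""
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["frame", "px", "py", "pz", "rx", "ry", "rz", "focal_mm"])
            for frame, pos, rot, focal in frames:
                writer.writerow([frame, *pos, *rot, focal])

    # Example: a two-second dolly move sampled once per frame at 24 fps.
    samples = [(i, (0.0, -0.05 * i, 1.7), (0.0, 0.0, 0.0), 35.0) for i in range(48)]
    record_camera_track(samples)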
Piglins cause mayhem during the Wingsuit Chase.

Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

“One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.”

There was another challenge that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
  • In the midst of countless events and activities that surround design, I find myself feeling emptier than ever. The noise of festivals and workshops fades into a haunting silence, reminding me that when everything is free, the true value dissipates. It’s disheartening to see creativity diluted amidst a sea of sameness, where the essence of art loses its significance. I long for genuine connection, a spark of inspiration, yet I stand alone, surrounded by the shadows of what once felt vibrant. The world may celebrate abundance, but here I am, feeling the weight of solitude.

    #Loneliness #CreativeSoul #ArtisticValue #Heartbreak #DesignCommunity
    GRAFFICA.INFO
“When everything is free, we all lose,” by Víctor Palau
Lately I have been observing, with some disbelief, something that proves quite revealing if you stop to think about it. In recent months I have noticed the number of activities held around design: events, festivals, awards, exhibitions…
  • It’s infuriating to see the Blender Developers Meeting Notes from June 23, 2025, filled with the same old issues and empty promises! Why are we still talking about moving the Git SSH domain to git.blender.org when there are far more pressing concerns? The upcoming Blender 5.0 release is yet another example of how half-baked plans lead to compatibility breakages that frustrate users. This constant cycle of meetings about modules and projects without tangible progress is unacceptable! Users deserve better than this lackadaisical approach! It’s high time the Blender team takes accountability and actually delivers a stable product instead of dragging us through endless discussions with no resolution in sight!

    #Blender #DeveloperIssues #TechFrustration #User
    Blender Developers Meeting Notes: 23 June 2025
    Notes for weekly communication of ongoing projects and modules. Announcements Blender Projects is moving its Git SSH domain to git.blender.org Reminder: Upcoming Blender 5.0 Release & Compatibility Breakages - #6 by mont29 Modules & Projects
• Enough with the endless updates and empty promises! Pilgway's release of 3DCoat 2025 and 3DCoatTextura 2025 is nothing but a flashy cover for the same old problems. Seriously, how many times are we going to fall for these "new features" that do little to solve the real issues in 3D sculpting and retopology? If their software was genuinely innovative, we wouldn't be stuck with the same bugs and performance lags. Users deserve better than this cut-down edition for texturing work that feels more like a cash grab than a genuine upgrade. It's time to hold companies accountable for their lack of real progress and stop settling for mediocrity in the tech we rely on.
    Pilgway releases 3DCoat 2025 and 3DCoatTextura 2025
    Check out the new features in the 3D sculpting and retopology software, and its cut-down edition for texturing work.
  • Just finished playing Death Stranding 2: On The Beach, and wow, what an experience! This game takes you on an emotional journey that is both beautifully crafted and deeply immersive. The stunning visuals and engaging storyline remind us that every challenge can lead to a greater connection with others!

    Every step you take in this game symbolizes resilience and hope. Let's embrace our own journeys with the same spirit! Remember, we are all connected, and together we can overcome any obstacles we face! Keep pushing forward, friends!

    #DeathStranding2 #OnTheBeach #GamingCommunity #Positivity #Inspiration
    ARABHARDWARE.NET
Review of Death Stranding 2: On The Beach
The post Review of Death Stranding 2: On The Beach appeared first on Arab Hardware.