• Get The BioShock And Mafia Trilogies For Just $18 In This New Deal

    If you're craving some FPS gaming this summer, you'll want to check out Humble's latest bundle deal, the 2K Classic Trilogies Mafia X Bioshock Game Bundle. The deal lets you grab each mainline entry in the BioShock and Mafia series on PC for just $18. Like other Humble Bundle deals, the company donates a portion of proceeds to charity--in this case, the bundle supports Covenant House, which provides food, shelter, immediate crisis care, and ongoing services to homeless and trafficked young people. You can also adjust the revenue split between Humble, 2K, and Covenant House before checkout.
    Humble's 2K Classic Trilogies Mafia X Bioshock Game Bundle offers a tiered payment option, and the more you pay, the more games you get. If you only want the BioShock series, you can grab the base $10 tier, which includes the Remastered Editions of BioShock 1 and 2 alongside the original BioShock Infinite. The enhanced editions of BioShock 1 and 2 included in this bundle offer improved graphics and UI, with support for up to 4K resolutions. It's also worth noting that the version of BioShock Infinite included in the bundle does not include the Season Pass DLC content, which must be purchased separately.
    Continue Reading at GameSpot
  • Retail Reboot: Major Global Brands Transform End-to-End Operations With NVIDIA

    AI is packing and shipping efficiency for the retail and consumer packaged goods (CPG) industries, with a majority of surveyed companies in the space reporting that the technology is increasing revenue and reducing operational costs.
    Global brands are reimagining every facet of their businesses with AI, from how products are designed and manufactured to how they’re marketed, shipped and experienced in-store and online.
    At NVIDIA GTC Paris at VivaTech, industry leaders including L’Oréal, LVMH and Nestlé shared how they’re using tools like AI agents and physical AI — powered by NVIDIA AI and simulation technologies — across every step of the product lifecycle to enhance operations and experiences for partners, customers and employees.
    3D Digital Twins and AI Transform Marketing, Advertising and Product Design
    The meeting of generative AI and 3D product digital twins results in unlimited creative potential.
    Nestlé, the world’s largest food and beverage company, today announced a collaboration with NVIDIA and Accenture to launch a new, AI-powered in-house service that will create high-quality product content at scale for e-commerce and digital media channels.
    The new content service, based on digital twins powered by the NVIDIA Omniverse platform, creates exact 3D virtual replicas of physical products. Product packaging can be adjusted or localized digitally, enabling seamless integration into various environments, such as seasonal campaigns or channel-specific formats. This means that new creative content can be generated without having to constantly reshoot from scratch.
    Image courtesy of Nestlé
    The service is developed in partnership with Accenture Song, using Accenture AI Refinery built on NVIDIA Omniverse for advanced digital twin creation. It uses NVIDIA AI Enterprise for generative AI, hosted on Microsoft Azure for robust cloud infrastructure.
    Nestlé already has a baseline of 4,000 3D digital products — mainly for global brands — with the ambition to convert a total of 10,000 products into digital twins in the next two years across global and local brands.
    LVMH, the world’s leading luxury goods company, home to 75 distinguished maisons, is bringing 3D digital twins to its content production processes through its wine and spirits division, Moët Hennessy.
    The group partnered with content configuration engine Grip to develop a solution using the NVIDIA Omniverse platform, which enables the creation of 3D digital twins that power content variation production. With Grip’s solution, Moët Hennessy teams can quickly generate digital marketing assets and experiences to promote luxury products at scale.
    The initiative, led by Capucine Lafarge and Chloé Fournier, has been recognized by LVMH as a leading approach to scaling content creation.
    Image courtesy of Grip
    L’Oréal Gives Marketing and Online Shopping an AI Makeover
    Innovation starts at the drawing board. Today, that board is digital — and it’s powered by AI.
    L’Oréal Groupe, the world’s leading beauty player, announced its collaboration with NVIDIA today. Through this collaboration, L’Oréal and its partner ecosystem will leverage the NVIDIA AI Enterprise platform to transform its consumer beauty experiences, marketing and advertising content pipelines.
    “AI doesn’t think with the same constraints as a human being. That opens new avenues for creativity,” said Anne Machet, global head of content and entertainment at L’Oréal. “Generative AI enables our teams and partner agencies to explore creative possibilities.”
    CreAItech, L’Oréal’s generative AI content platform, is augmenting the creativity of marketing and content teams. Combining a modular ecosystem of models, expertise, technologies and partners — including NVIDIA — CreAItech empowers marketers to generate thousands of unique, on-brand images, videos and lines of text for diverse platforms and global audiences.
    The solution empowers L’Oréal’s marketing teams to quickly iterate on campaigns that improve consumer engagement across social media, e-commerce content and influencer marketing — driving higher conversion rates.

    Noli.com, the first AI-powered multi-brand marketplace startup founded and backed by the L’Oréal Groupe, is reinventing how people discover and shop for beauty products.
    Noli’s AI Beauty Matchmaker experience uses L’Oréal Groupe’s century-long expertise in beauty, including its extensive knowledge of beauty science, beauty tech and consumer insights, built from over 1 million skin data points and analysis of thousands of product formulations. It gives users a BeautyDNA profile with expert-level guidance and personalized product recommendations for skincare and haircare.
    “Beauty shoppers are often overwhelmed by choice and struggling to find the products that are right for them,” said Amos Susskind, founder and CEO of Noli. “By applying the latest AI models accelerated by NVIDIA and Accenture to the unparalleled knowledge base and expertise of the L’Oréal Groupe, we can provide hyper-personalized, explainable recommendations to our users.” 

    The Accenture AI Refinery, powered by NVIDIA AI Enterprise, will provide the platform for Noli to experiment and scale. Noli’s new agent models will use NVIDIA NIM and NVIDIA NeMo microservices, including NeMo Retriever, running on Microsoft Azure.
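NIM microservices generally expose an OpenAI-compatible REST interface, so a retrieval-grounded request like the ones Noli's agents would issue can be sketched with nothing but the standard library. This is an illustration only: the endpoint URL and model name below are hypothetical placeholders, not details from the article, and the retrieval step (e.g. NeMo Retriever) is assumed to have already returned the passages.

```python
import json
import urllib.request

# Hypothetical endpoint and model name -- placeholders, not from the article.
NIM_URL = "http://localhost:8000/v1/chat/completions"
MODEL = "example/llm-model"

def build_chat_payload(question, retrieved_passages, model=MODEL):
    """Assemble an OpenAI-style chat payload that grounds the user's
    question in passages returned by a prior retrieval step."""
    context = "\n".join(retrieved_passages)
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Answer using only the provided context:\n" + context},
            {"role": "user", "content": question},
        ],
        "max_tokens": 256,
    }

def send(payload, url=NIM_URL):
    """POST the payload to the (assumed) OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The grounding-in-context system message is what makes the recommendation "explainable" in the sense quoted above: the model can only cite the retrieved knowledge it was handed.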
    Rapid Innovation With the NVIDIA Partner Ecosystem
    NVIDIA’s ecosystem of solution provider partners empowers retail and CPG companies to innovate faster, personalize customer experiences, and optimize operations with NVIDIA accelerated computing and AI.
    Global digital agency Monks is reshaping the landscape of AI-driven marketing, creative production and enterprise transformation. At the heart of their innovation lies the Monks.Flow platform that enhances both the speed and sophistication of creative workflows through NVIDIA Omniverse, NVIDIA NIM microservices and Triton Inference Server for lightning-fast inference.
    AI image solutions provider Bria is helping retail giants like Lidl and L’Oreal to enhance marketing asset creation. Bria AI transforms static product images into compelling, dynamic advertisements that can be quickly scaled for use across any marketing need.
    The company’s generative AI platform uses NVIDIA Triton Inference Server software and the NVIDIA TensorRT software development kit for accelerated inference, as well as NVIDIA NIM and NeMo microservices for quick image generation at scale.
    Physical AI Brings Acceleration to Supply Chain and Logistics
    AI’s impact extends far beyond the digital world. Physical AI-powered warehousing robots, for example, are helping maximize efficiency in retail supply chain operations. Four in five retail companies have reported that AI has helped reduce supply chain operational costs, with 25% reporting cost reductions of at least 10%.
    Technology providers Lyric, KoiReader Technologies and Exotec are tackling the challenges of integrating AI into complex warehouse environments.
    Lyric is using the NVIDIA cuOpt GPU-accelerated solver for warehouse network planning and route optimization, and is collaborating with NVIDIA to apply the technology to broader supply chain decision-making problems. KoiReader Technologies is tapping the NVIDIA Metropolis stack for its computer vision solutions within logistics, supply chain and manufacturing environments using the KoiVision Platform. And Exotec is using NVIDIA CUDA libraries and the NVIDIA JetPack software development kit for embedded robotic systems in warehouse and distribution centers.
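The article doesn't detail Lyric's cuOpt configuration, but the underlying problem is classic route optimization. As a toy illustration of what a route optimizer computes — this is a simple nearest-neighbor heuristic, not cuOpt's GPU solver, and it ignores real-world constraints like vehicle capacity and time windows — consider:

```python
import math

def nearest_neighbor_route(depot, stops):
    """Greedy route: starting from the depot, repeatedly visit the
    closest unvisited stop, then return to the depot. A toy stand-in
    for what solvers like cuOpt do at far larger scale with real
    constraints (capacities, time windows, fleets)."""
    route = [depot]
    remaining = list(stops)
    current = depot
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)  # close the loop back at the depot
    return route

def route_length(route):
    """Total Euclidean distance of the route."""
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))
```

Heuristics like this give fast but suboptimal answers; the appeal of GPU-accelerated solvers is searching far larger solution spaces under real operational constraints within the same time budget.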
    From real-time robotics orchestration to predictive maintenance, these solutions are delivering impact on uptime, throughput and cost savings for supply chain operations.
    Learn more by joining a follow-up discussion on digital twins and AI-powered creativity with Microsoft, Nestlé, Accenture and NVIDIA at Cannes Lions on Monday, June 16.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
  • Plug and Play: Build a G-Assist Plug-In Today

    Project G-Assist — available through the NVIDIA App — is an experimental AI assistant that helps tune, control and optimize NVIDIA GeForce RTX systems.
    NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites the community to explore AI and build custom G-Assist plug-ins for a chance to win prizes and be featured on NVIDIA social media channels.

    G-Assist allows users to control their RTX GPU and other system settings using natural language, thanks to a small language model that runs on device. It can be used from the NVIDIA Overlay in the NVIDIA App without needing to tab out or switch programs. Users can expand its capabilities via plug-ins and even connect it to agentic frameworks such as Langflow.
    Below, find popular G-Assist plug-ins, hackathon details and tips to get started.
    Plug-In and Win
    Join the hackathon by registering and checking out the curated technical resources.
    G-Assist plug-ins can be built in several ways, including with Python for rapid development, with C++ for performance-critical apps and with custom system interactions for hardware and operating system automation.
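To make the Python path concrete, here is a rough sketch of the shape such a plug-in can take. The command names and JSON message shapes below are invented for illustration — they are not the actual G-Assist protocol, which is documented in NVIDIA's sample repositories — but the core pattern of parsing a JSON command and dispatching it to a handler is common to this style of plug-in:

```python
import json
import sys

# Hypothetical handler -- the function name and response shape are
# invented for illustration, not taken from the G-Assist protocol.
def get_stream_status(params):
    streamer = params.get("streamer", "unknown")
    return {"success": True, "message": f"{streamer} is offline"}

COMMANDS = {"get_stream_status": get_stream_status}

def handle(raw):
    """Parse one JSON command and dispatch it to the matching handler."""
    try:
        cmd = json.loads(raw)
        fn = COMMANDS[cmd["func"]]
        return fn(cmd.get("params", {}))
    except (KeyError, json.JSONDecodeError) as exc:
        return {"success": False, "message": str(exc)}

def main():
    # A real plug-in exchanges messages with G-Assist over a pipe;
    # reading stdin line by line stands in for that here.
    for line in sys.stdin:
        print(json.dumps(handle(line)))
```

Keeping each command a small pure function makes the plug-in easy to unit-test before wiring it into the assistant.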
    For those who prefer vibe coding, the G-Assist Plug-In Builder — a ChatGPT-based app that allows no-code or low-code development with natural language commands — makes it easy for enthusiasts to start creating plug-ins.
    To submit an entry, participants must provide a GitHub repository, including a source code file, requirements.txt, manifest.json, config.json, a plug-in executable file and a README file.
    Then, submit a video — between 30 seconds and two minutes — showcasing the plug-in in action.
    Finally, hackathoners must promote their plug-in using #AIonRTXHackathon on a social media channel: Instagram, TikTok or X. Submit projects via this form by Wednesday, July 16.
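For orientation, the manifest.json required in the repository is where a plug-in declares the functions it exposes to G-Assist. The field names below are hypothetical and not verified against NVIDIA's actual schema — check the sample plug-ins on GitHub for the real format — but a manifest of this general shape might look like:

```json
{
  "name": "twitch-status",
  "description": "Check whether a Twitch streamer is live",
  "functions": [
    {
      "name": "get_stream_status",
      "description": "Look up a streamer's live status",
      "parameters": {
        "streamer": { "type": "string", "description": "Twitch username" }
      }
    }
  ]
}
```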
    Judges will assess plug-ins based on three main criteria: 1) innovation and creativity, 2) technical execution and integration, reviewing technical depth, G-Assist integration and scalability, and 3) usability and community impact, aka how easy it is to use the plug-in.
    Winners will be selected on Wednesday, Aug. 20. First place will receive a GeForce RTX 5090 laptop, second place a GeForce RTX 5080 GPU and third a GeForce RTX 5070 GPU. These top three will also be featured on NVIDIA’s social media channels, get the opportunity to meet the NVIDIA G-Assist team and earn an NVIDIA Deep Learning Institute self-paced course credit.
    Project G-Assist requires a GeForce RTX 50, 40 or 30 Series Desktop GPU with at least 12GB of VRAM, Windows 11 or 10 operating system, a compatible CPU, specific disk space requirements and a recent GeForce Game Ready Driver or NVIDIA Studio Driver.
    Explore open-source plug-in samples available on GitHub, which showcase the diverse ways on-device AI can enhance PC and gaming workflows.

    Popular plug-ins include:

    Google Gemini: Enables real-time search-based queries through Google Search integration and large language model-based queries using Gemini, all from the convenience of the NVIDIA App Overlay without needing to switch programs.
    Discord: Enables users to easily share game highlights or messages directly to Discord servers without disrupting gameplay.
    IFTTT: Lets users create automations across hundreds of compatible endpoints to trigger IoT routines — such as adjusting room lights and smart shades, or pushing the latest gaming news to a mobile device.
    Spotify: Lets users control Spotify using simple voice commands or the G-Assist interface to play favorite tracks and manage playlists.
    Twitch: Checks if any Twitch streamer is currently live and can access detailed stream information such as titles, games, view counts and more.

    Get G-Assist(ance)
    Join the NVIDIA Developer Discord channel to collaborate, share creations and gain support from fellow AI enthusiasts and NVIDIA staff.
    Save the date for NVIDIA’s How to Build a G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities, discover the fundamentals of building, testing and deploying Project G-Assist plug-ins, and participate in a live Q&A session.
    Explore NVIDIA’s GitHub repository, which provides everything needed to get started developing with G-Assist, including sample plug-ins, step-by-step instructions and documentation for building custom functionalities.
    Learn more about the ChatGPT Plug-In Builder to transform ideas into functional G-Assist plug-ins with minimal coding. The tool uses OpenAI’s custom GPT builder to generate plug-in code and streamline the development process.
    NVIDIA’s technical blog walks through the architecture of a G-Assist plug-in, using a Twitch integration as an example. Discover how plug-ins work, how they communicate with G-Assist and how to build them from scratch.
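At its core, the pattern that walkthrough describes is a small program that receives function calls as JSON and returns JSON responses. The sketch below is a loose illustration of that dispatch loop only: the `check_live` handler is made up, it uses stdin/stdout for simplicity, and the actual transport and message schema used by G-Assist plug-ins are defined in NVIDIA's GitHub samples, where exposed functions must match those declared in manifest.json:

```python
import json
import sys

def check_live(params: dict) -> dict:
    """Hypothetical handler; a real Twitch plug-in would call the Twitch API."""
    streamer = params.get("streamer", "unknown")
    return {"success": True, "message": f"checked live status for {streamer}"}

# Map function names (as a manifest would declare them) to handlers.
HANDLERS = {"check_live": check_live}

def dispatch(message: str) -> str:
    """Route one JSON command to its handler and return a JSON response."""
    cmd = json.loads(message)
    handler = HANDLERS.get(cmd.get("func"))
    if handler is None:
        return json.dumps({"success": False, "message": "unknown function"})
    return json.dumps(handler(cmd.get("params", {})))

if __name__ == "__main__":
    # Simplified loop: read one JSON command per line, answer on stdout.
    for line in sys.stdin:
        print(dispatch(line))
```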
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
    BLOGS.NVIDIA.COM
    Plug and Play: Build a G-Assist Plug-In Today
  • Blender Jobs for June 27, 2025

    Here's an overview of the most recent Blender jobs on Blender Artists, ArtStation and 3djobs.xyz:
    3D Generalist (Blender) for Luxury Real Estate Visualization – Collaborative, Time-Bound Project
    Looking for artist to create product demo videos - delivery within one week
    3D rigger/animator
    Blender artist
    Aquent | 3D Designer
    Art Lead
    TDA | 3D Artist
    TUEREN, [...]
    Source
  • Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30

    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you.

    Monster Hunter Wilds – Free Title Update 2

    In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last

    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above, and after clearing the “A World Turned Upside Down” main mission, and the “Forest Doshaguma” side mission.

    While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities!

    Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.

    ● The Flying Wyvern, Seregios, strikes

    Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission.

    New hunter and Palico armor forged from Seregios materials awaits you!

    For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

    Hunt in style with layered weapons

    With Free Title Update 2, hunters will be able to use Layered Weapons, which let you apply the look of one weapon while keeping the stats and abilities of another.

    To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon.

    For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

    New optional features

    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

    A new Support Hunter joins the fray

    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

    Photo Mode Improvements

    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

    Celebrate summer with the Festival of Accord: Flamefete seasonal event

    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

    Arch-Tempered Uth Duna slams down starting July 30

    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re HR 50 or above to take on this quest.

    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

    A special collaboration with Fender

    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®.

    In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

    A new round of cosmetic DLC arrives

    Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases. 

    Free Title Update roadmap

    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

    A Monster Hunter Wilds background is added to the PS5 Welcome hub

    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.

    How to change the background: Welcome hub -> Change background -> Games

    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30

    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try.

    Happy Hunting!
    BLOG.PLAYSTATION.COM
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer, Monster Hunter Wilds – Free Title Update 2, for a look at what awaits you.

    In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda, coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last

    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above who have cleared the “A World Turned Upside Down” main mission and the “Forest Doshaguma” side mission.

    While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities! Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.
● The Flying Wyvern, Seregios, strikes

Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission. New hunter and Palico armor forged from Seregios materials awaits you!

For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

Hunt in style with layered weapons

With Free Title Update 2, hunters will be able to use Layered Weapons, which let you use the look of any weapon while keeping the stats and abilities of another. To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon. For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

New optional features

Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2!
You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

A new Support Hunter joins the fray

Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

Photo Mode Improvements

Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

Celebrate summer with the Festival of Accord: Flamefete seasonal event

The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

Arch-Tempered Uth Duna slams down starting July 30

Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re at least HR 50 or above to take on this quest.
We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

A special collaboration with Fender

Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®. In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

A new round of cosmetic DLC arrives

Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more, will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases.

Free Title Update roadmap

We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

A Monster Hunter Wilds background is added to the PS5 Welcome hub

Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub.
Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.

How to change the background: Welcome hub -> Change background -> Games

Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30

With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try. Happy Hunting!
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuck and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG. Images courtesy of Warner Bros. Pictures.

Rather than a world constructed around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that is finally adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments inhabited by the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks was Disguise, working under the direction of Production VFX Supervisor Dan Lemmon.

“[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise

Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

“Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities.
“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. The Unreal scenes we created were vital tools across the production and were used for a variety of purposes, such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

A virtual exploration of Steve’s shop in Midport Village.

Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone.
Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise

Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.”

At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical.
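Disguise's in-house ingest tools are not public, so the following is a purely illustrative sketch: one recurring cleanup step when conforming imported geometry to the Minecraft aesthetic might be snapping vertices to the game's one-block grid. The function and sample values below are hypothetical stand-ins for what a DCC script (in Blender, Cinema 4D or Maya) would do at far larger scale.

```python
# Illustrative only: snap mesh vertices to a fixed voxel grid.
# Vertices are modeled as (x, y, z) float tuples; a grid size of 1.0
# mirrors Minecraft's one-block unit. Not actual production tooling.

def snap_to_grid(vertices, grid=1.0):
    """Quantize each vertex coordinate to the nearest grid corner."""
    return [tuple(round(c / grid) * grid for c in v) for v in vertices]

verts = [(0.96, 2.04, -0.49), (3.1, 0.02, 1.51)]
print(snap_to_grid(verts))  # → [(1.0, 2.0, 0.0), (3.0, 0.0, 2.0)]
```

In production this kind of quantization would run inside the DCC package rather than on raw tuples, but the arithmetic is the same.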
“A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.”

Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

“We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved.
Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise

The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks.

“I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

Virtually conceptualizing the layout of Midport Village.

Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.”

Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted.
There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

An example of the virtual and final version of the Woodland Mansion.

“Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise

Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside.
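Quick environment swaps like the one Bell describes depend on the modular levels and clean naming conventions Finlayson mentioned earlier. The show's actual convention isn't documented, so the validator below checks an invented pattern (`ENV_MidportVillage_Archway_01`) purely to illustrate how such a check might be automated.

```python
# Illustrative only: enforce a hypothetical asset naming convention of the
# form <TYPE>_<Location>_<Asset>_<two-digit id>. The pattern is invented
# for this sketch, not Disguise's real convention.
import re

NAME_RE = re.compile(r"^(ENV|PROP|LIGHT)_[A-Za-z]+_[A-Za-z]+_\d{2}$")

def check_names(names):
    """Return the asset names that violate the convention."""
    return [n for n in names if not NAME_RE.fullmatch(n)]

assets = ["ENV_MidportVillage_Archway_01", "archway_final_v2"]
print(check_names(assets))  # → ['archway_final_v2']
```

A check like this would typically run as an import-time hook in the engine or DCC tool, so violations surface before a review session rather than during one.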
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

Piglins cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

“One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.”

There was another challenge that has more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive.
I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
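Simulcam's recorded camera data is proprietary, and its format isn't described in the article. As a minimal sketch of the concept — per-frame camera transforms serialized for a post-production handoff — with field names invented for illustration:

```python
# Illustrative only: record per-frame camera transforms as JSON for a
# hypothetical handoff to post-production. Field names ("frame", "pos",
# "rot") are invented; real on-set systems use their own schemas.
import json

def record_take(frames):
    """frames: iterable of (frame_no, position_xyz, rotation_xyz) tuples.
    Returns a JSON string with one record per frame."""
    return json.dumps([
        {"frame": f, "pos": list(p), "rot": list(r)}
        for f, p, r in frames
    ])

take = record_take([(1001, (0.0, 1.7, -4.0), (0.0, 12.5, 0.0))])
print(take)
```

The value of a handoff like this is that VFX vendors receive the exact camera track used for the live composite, rather than re-solving it from the plates.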
  • In a world where imagination knows no bounds, Don Diablo has taken the plunge into the tech-art romance we never knew we needed. Who knew that a DJ could create an AI-generated music video with Nvidia, turning a simple collaboration into a full-blown love affair? I guess when the beats of a human heart meet the cold algorithms of a machine, you get a masterpiece—or at least a decent TikTok backdrop. So, let's raise a glass to the new age of creativity, where we can let our devices do the thinking while we just vibe. Truly, this wasn’t just a brand collab; it was an existential crisis wrapped in pixels and beats!

    #TechMeetsArt #AIMusicVideo #DonDiablo #Nvidia #
  • It is truly frustrating to see how AI technology like Midjourney is being used to turn images into films, as if we are living in an age of technological magic while in reality we are drowning in a pile of technical errors and social problems. What is happening? Are we all ready to accept these illusions while the loss of artistic and human identity keeps growing? Video is not just moving pictures; it is an expression of soul and creativity, not merely a product pushed out by dull algorithms. Let us be clear: this is not a step toward the future, but a step backward.
    ARABHARDWARE.NET
    Turn Photos into Films: Midjourney Unveils the Magic of AI Video
    The post Turn Photos into Films: Midjourney Unveils the Magic of AI Video appeared first on عرب هاردوير (Arab Hardware).
  • In a world where we’re all desperately trying to make our digital creations look as lifelike as a potato, we now have the privilege of diving headfirst into the revolutionary topic of "Separate shaders in AI 3D generated models." Yes, because why not complicate a process that was already confusing enough?

    Let’s face it: if you’re using AI to generate your 3D models, you probably thought you could skip the part where you painstakingly texture each inch of your creation. But alas! Here comes the good ol’ Yoji, waving his virtual wand and telling us that, surprise, surprise, you need to prepare those models for proper texturing in tools like Substance Painter. Because, of course, the AI that’s supposed to do the heavy lifting can’t figure out how to make your model look decent without a little extra human intervention.

    But don’t worry! Yoji has got your back with his meticulous “how-to” on separating shaders. Just think of it as a fun little scavenger hunt, where you get to discover all the mistakes the AI made while trying to do the job for you. Who knew that a model could look so… special? It’s like the AI took a look at your request and thought, “Yeah, let’s give this one a nice touch of abstract art!” Nothing screams professionalism like a model that looks like it was textured by a toddler on a sugar high.

    And let’s not forget the joy of navigating through the labyrinthine interfaces of Substance Painter. Ah, yes! The thrill of clicking through endless menus, desperately searching for that elusive shader that will somehow make your model look less like a lumpy marshmallow and more like a refined piece of art. It’s a bit like being in a relationship, really. You start with high hopes and a glossy exterior, only to end up questioning all your life choices as you try to figure out how to make it work.

    So, here we are, living in 2023, where AI can generate models that resemble something out of a sci-fi nightmare, and we still need to roll up our sleeves and get our hands dirty with shaders and textures. Who knew that the future would come with so many manual adjustments? Isn’t technology just delightful?

    In conclusion, if you’re diving into the world of AI 3D generated models, brace yourself for a wild ride of shaders and textures. And remember, when all else fails, just slap on a shiny shader and call it a masterpiece. After all, art is subjective, right?

    #3DModels #AIGenerated #SubstancePainter #Shaders #DigitalArt
    Separate shaders in AI 3D generated models
    Yoji shows how to prepare generated models for proper texturing in tools like Substance Painter.
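Yoji's workflow happens inside DCC tools and isn't reproduced here, but the core idea of "separating shaders" is just partitioning a mesh's faces by material index so each material can be textured on its own in a tool like Substance Painter. A toy sketch, using an invented face representation:

```python
# Illustrative only: group a mesh's faces by material index so each
# material (shader slot) becomes its own sub-mesh. The (material_index,
# vertex_ids) representation is invented for this sketch.
from collections import defaultdict

def split_by_material(faces):
    """faces: list of (material_index, vertex_ids) pairs.
    Returns {material_index: [vertex_ids, ...]} — one group per shader."""
    groups = defaultdict(list)
    for mat, verts in faces:
        groups[mat].append(verts)
    return dict(groups)

mesh = [(0, (0, 1, 2)), (1, (2, 3, 0)), (0, (1, 2, 3))]
print(split_by_material(mesh))  # → {0: [(0, 1, 2), (1, 2, 3)], 1: [(2, 3, 0)]}
```

Once faces are grouped this way, each group maps onto its own texture set, which is what lets a painting tool assign materials independently instead of treating the AI-generated model as one undifferentiated surface.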
  • Midjourney, Disney, Universal, copyright lawsuit, generative videos, famous characters, digital creativity, generative AI, digital content

    ## Introduction

    In a world where technology is evolving at a dizzying pace, we are witnessing fascinating developments that go beyond our imagination. Recently, one piece of news captured the attention of film and technology enthusiasts: Midjourney, the innovative artificial intelligence studio, has launched a new vid...
    ‘Wall-E with a gun’: Midjourney generates videos of Disney characters amid a massive copyright lawsuit