• Retail Reboot: Major Global Brands Transform End-to-End Operations With NVIDIA

    AI is packing and shipping efficiency for the retail and consumer packaged goods (CPG) industries, with a majority of surveyed companies in the space reporting that the technology is increasing revenue and reducing operational costs.
    Global brands are reimagining every facet of their businesses with AI, from how products are designed and manufactured to how they’re marketed, shipped and experienced in-store and online.
    At NVIDIA GTC Paris at VivaTech, industry leaders including L’Oréal, LVMH and Nestlé shared how they’re using tools like AI agents and physical AI — powered by NVIDIA AI and simulation technologies — across every step of the product lifecycle to enhance operations and experiences for partners, customers and employees.
    3D Digital Twins and AI Transform Marketing, Advertising and Product Design
    The meeting of generative AI and 3D product digital twins results in unlimited creative potential.
    Nestlé, the world’s largest food and beverage company, today announced a collaboration with NVIDIA and Accenture to launch a new, AI-powered in-house service that will create high-quality product content at scale for e-commerce and digital media channels.
    The new content service, based on digital twins powered by the NVIDIA Omniverse platform, creates exact 3D virtual replicas of physical products. Product packaging can be adjusted or localized digitally, enabling seamless integration into various environments, such as seasonal campaigns or channel-specific formats. This means that new creative content can be generated without having to constantly reshoot from scratch.
    Image courtesy of Nestlé
    The service is developed in partnership with Accenture Song, using Accenture AI Refinery built on NVIDIA Omniverse for advanced digital twin creation. It uses NVIDIA AI Enterprise for generative AI, hosted on Microsoft Azure for robust cloud infrastructure.
    Nestlé already has a baseline of 4,000 3D digital products — mainly for global brands — with the ambition to convert a total of 10,000 products into digital twins in the next two years across global and local brands.
    LVMH, the world’s leading luxury goods company, home to 75 distinguished maisons, is bringing 3D digital twins to its content production processes through its wine and spirits division, Moët Hennessy.
    The group partnered with content configuration engine Grip to develop a solution using the NVIDIA Omniverse platform, which enables the creation of 3D digital twins that power content variation production. With Grip’s solution, Moët Hennessy teams can quickly generate digital marketing assets and experiences to promote luxury products at scale.
    The initiative, led by Capucine Lafarge and Chloé Fournier, has been recognized by LVMH as a leading approach to scaling content creation.
    Image courtesy of Grip
    L’Oréal Gives Marketing and Online Shopping an AI Makeover
    Innovation starts at the drawing board. Today, that board is digital — and it’s powered by AI.
    L’Oréal Groupe, the world’s leading beauty player, announced its collaboration with NVIDIA today. Through this collaboration, L’Oréal and its partner ecosystem will leverage the NVIDIA AI Enterprise platform to transform its consumer beauty experiences, marketing and advertising content pipelines.
    “AI doesn’t think with the same constraints as a human being. That opens new avenues for creativity,” said Anne Machet, global head of content and entertainment at L’Oréal. “Generative AI enables our teams and partner agencies to explore creative possibilities.”
    CreAItech, L’Oréal’s generative AI content platform, is augmenting the creativity of marketing and content teams. Combining a modular ecosystem of models, expertise, technologies and partners — including NVIDIA — CreAItech empowers marketers to generate thousands of unique, on-brand images, videos and lines of text for diverse platforms and global audiences.
    The solution empowers L’Oréal’s marketing teams to quickly iterate on campaigns that improve consumer engagement across social media, e-commerce content and influencer marketing — driving higher conversion rates.

    Noli.com, the first AI-powered multi-brand marketplace startup founded and backed by the L’Oréal Groupe, is reinventing how people discover and shop for beauty products.
    Noli’s AI Beauty Matchmaker experience uses L’Oréal Groupe’s century-long expertise in beauty, including its extensive knowledge of beauty science, beauty tech and consumer insights, built from over 1 million skin data points and analysis of thousands of product formulations. It gives users a BeautyDNA profile with expert-level guidance and personalized product recommendations for skincare and haircare.
    “Beauty shoppers are often overwhelmed by choice and struggle to find the products that are right for them,” said Amos Susskind, founder and CEO of Noli. “By applying the latest AI models accelerated by NVIDIA and Accenture to the unparalleled knowledge base and expertise of the L’Oréal Groupe, we can provide hyper-personalized, explainable recommendations to our users.”

    The Accenture AI Refinery, powered by NVIDIA AI Enterprise, will provide the platform for Noli to experiment and scale. Noli’s new agent models will use NVIDIA NIM and NVIDIA NeMo microservices, including NeMo Retriever, running on Microsoft Azure.
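    The retrieval step behind such recommendations can be sketched in a few lines. The vectors, product names and scoring below are purely illustrative assumptions, not Noli's data or the NeMo Retriever API; they only show the ranking idea that a retrieval microservice accelerates at scale.

```python
import math

# Toy embeddings standing in for what a retrieval service computes;
# every product name and vector here is hypothetical.
PRODUCT_EMBEDDINGS = {
    "hydrating-serum": [0.9, 0.1, 0.2],
    "matte-lipstick":  [0.1, 0.9, 0.1],
    "repair-shampoo":  [0.2, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(query_embedding, k=2):
    """Rank products by cosine similarity to the query embedding."""
    ranked = sorted(PRODUCT_EMBEDDINGS.items(),
                    key=lambda kv: cosine(query_embedding, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# A query vector close to "hydrating-serum" should rank it first.
print(recommend([0.85, 0.15, 0.25]))
```

In production, the embeddings would come from a model served as a microservice and the search would run over millions of vectors, but the ranking principle is the same.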
    Rapid Innovation With the NVIDIA Partner Ecosystem
    NVIDIA’s ecosystem of solution provider partners empowers retail and CPG companies to innovate faster, personalize customer experiences, and optimize operations with NVIDIA accelerated computing and AI.
    Global digital agency Monks is reshaping the landscape of AI-driven marketing, creative production and enterprise transformation. At the heart of its innovation lies Monks.Flow, a platform that enhances both the speed and sophistication of creative workflows through NVIDIA Omniverse, NVIDIA NIM microservices and NVIDIA Triton Inference Server for lightning-fast inference.
    AI image solutions provider Bria is helping retail giants like Lidl and L’Oréal enhance marketing asset creation. Bria AI transforms static product images into compelling, dynamic advertisements that can be quickly scaled for use across any marketing need.
    The company’s generative AI platform uses NVIDIA Triton Inference Server software and the NVIDIA TensorRT software development kit for accelerated inference, as well as NVIDIA NIM and NeMo microservices for quick image generation at scale.
    Physical AI Brings Acceleration to Supply Chain and Logistics
    AI’s impact extends far beyond the digital world. Physical AI-powered warehousing robots, for example, are helping maximize efficiency in retail supply chain operations. Four in five retail companies have reported that AI has helped reduce supply chain operational costs, with 25% reporting cost reductions of at least 10%.
    Technology providers Lyric, KoiReader Technologies and Exotec are tackling the challenges of integrating AI into complex warehouse environments.
    Lyric is using the NVIDIA cuOpt GPU-accelerated solver for warehouse network planning and route optimization, and is collaborating with NVIDIA to apply the technology to broader supply chain decision-making problems. KoiReader Technologies is tapping the NVIDIA Metropolis stack for its computer vision solutions within logistics, supply chain and manufacturing environments using the KoiVision Platform. And Exotec is using NVIDIA CUDA libraries and the NVIDIA JetPack software development kit for embedded robotic systems in warehouse and distribution centers.
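    For a flavor of the route-optimization problem mentioned above, here is a plain nearest-neighbor heuristic. This is an illustrative sketch only, not cuOpt's GPU-accelerated solver or its API; the stop coordinates are made up.

```python
import math

def route_length(points, order):
    """Total length of a route visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[i + 1]])
               for i in range(len(order) - 1))

def nearest_neighbor_route(points, start=0):
    """Greedy heuristic: always visit the closest unvisited stop next."""
    unvisited = set(range(len(points))) - {start}
    order = [start]
    while unvisited:
        last = order[-1]
        nxt = min(unvisited, key=lambda j: math.dist(points[last], points[j]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Hypothetical warehouse stops on a 2D grid.
stops = [(0, 0), (5, 0), (1, 0), (6, 1)]
route = nearest_neighbor_route(stops)
print(route, round(route_length(stops, route), 2))
```

Greedy heuristics like this scale poorly and miss optimal tours; solvers such as cuOpt exist precisely because real depot networks need far stronger optimization under delivery-time and capacity constraints.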
    From real-time robotics orchestration to predictive maintenance, these solutions are delivering impact on uptime, throughput and cost savings for supply chain operations.
    Learn more by joining a follow-up discussion on digital twins and AI-powered creativity with Microsoft, Nestlé, Accenture and NVIDIA at Cannes Lions on Monday, June 16.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
    BLOGS.NVIDIA.COM
  • Calling on LLMs: New NVIDIA AI Blueprint Helps Automate Telco Network Configuration

    Telecom companies last year spent hundreds of billions of dollars in capital expenditures and over a trillion dollars in operating expenditures.
    These large expenses are due in part to laborious manual processes that telcos face when operating networks that require continuous optimizations.
    For example, telcos must constantly tune network parameters for tasks — such as transferring calls from one network to another or distributing network traffic across multiple servers — based on the time of day, user behavior, mobility and traffic type.
    These factors directly affect network performance, user experience and energy consumption.
    To automate these optimization processes and save costs for telcos across the globe, NVIDIA today unveiled at GTC Paris its first AI Blueprint for telco network configuration.
    At the blueprint’s core are customized large language models trained specifically on telco network data — as well as the full technical and operational architecture for turning the LLMs into an autonomous, goal-driven AI agent for telcos.
    Automate Network Configuration With the AI Blueprint
    NVIDIA AI Blueprints — available on build.nvidia.com — are customizable AI workflow examples. They include reference code, documentation and deployment tools that show enterprise developers how to deliver business value with NVIDIA NIM microservices.
    The AI Blueprint for telco network configuration — built with BubbleRAN 5G solutions and datasets — enables developers, network engineers and telecom providers to automatically optimize the configuration of network parameters using agentic AI.
    This can streamline operations, reduce costs and significantly improve service quality by embedding continuous learning and adaptability directly into network infrastructures.
    Traditionally, network configurations required manual intervention or followed rigid rules to adapt to dynamic network conditions. These approaches limited adaptability and increased operational complexities, costs and inefficiencies.
    The new blueprint helps shift telco operations from relying on static, rules-based systems to operations based on dynamic, AI-driven automation. It enables developers to build advanced, telco-specific AI agents that make real-time, intelligent decisions and autonomously balance trade-offs — such as network speed versus interference, or energy savings versus utilization — without human input.
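    The trade-off balancing described above can be illustrated with a toy optimization: a hypothetical objective that rewards throughput but charges for transmit power (a stand-in for energy use), maximized by a simple grid search. The blueprint's agents make these decisions with LLM-driven reasoning over live network data, not this heuristic; the formula and constants below are invented for illustration.

```python
def objective(power_dbm):
    """Toy reward: throughput with diminishing returns, minus an energy cost.
    The shape and constants are illustrative assumptions, not telco data."""
    throughput = 10 * (1 - 2 ** (-power_dbm / 10))
    energy_cost = 0.2 * power_dbm
    return throughput - energy_cost

def tune(lo=0.0, hi=40.0, steps=400):
    """Grid search standing in for the agent's goal-driven optimization."""
    candidates = (lo + (hi - lo) * i / steps for i in range(steps + 1))
    return max(candidates, key=objective)

best_power = tune()
print(round(best_power, 1), round(objective(best_power), 2))
```

The interesting point is that neither extreme wins: zero power yields zero throughput, and maximum power wastes energy for little extra throughput, so the optimum sits in between, which is exactly the kind of balance the text says the agents strike autonomously.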
    Powered and Deployed by Industry Leaders
    Trained on 5G data generated by BubbleRAN, and deployed on the BubbleRAN 5G O-RAN platform, the blueprint provides telcos with insight on how to set various parameters to reach performance goals, like achieving a certain bitrate while choosing an acceptable signal-to-noise ratio — a measure that impacts voice quality and thus user experience.
    With the new AI Blueprint, network engineers can confidently set initial parameter values and update them as demanded by continuous network changes.
    Norway-based Telenor Group, which serves over 200 million customers globally, is the first telco to integrate the AI Blueprint for telco network configuration as part of its initiative to deploy intelligent, autonomous networks that meet the performance and agility demands of 5G and beyond.
    “The blueprint is helping us address configuration challenges and enhance quality of service during network installation,” said Knut Fjellheim, chief technology innovation officer at Telenor Maritime. “Implementing it is part of our push toward network automation and follows the successful deployment of agentic AI for real-time network slicing in a private 5G maritime use case.”
    Industry Partners Deploy Other NVIDIA-Powered Autonomous Network Technologies
    The AI Blueprint for telco network configuration is just one of many announcements at NVIDIA GTC Paris showcasing how the telecom industry is using agentic AI to make autonomous networks a reality.
    Beyond the blueprint, leading telecom companies and solutions providers are tapping into NVIDIA accelerated computing, software and microservices to provide breakthrough innovations poised to vastly improve networks and communications services — accelerating the progress to autonomous networks and improving customer experiences.
    NTT DATA is powering its agentic platform for telcos with NVIDIA accelerated compute and the NVIDIA AI Enterprise software platform. Its first agentic use case is focused on network alarms management, where NVIDIA NIM microservices help automate and power observability, troubleshooting, anomaly detection and resolution with closed loop ticketing.
    Tata Consultancy Services is delivering agentic AI solutions for telcos built on NVIDIA DGX Cloud and using NVIDIA AI Enterprise to develop, fine-tune and integrate large telco models into AI agent workflows. These range from billing and revenue assurance and autonomous network management to hybrid edge-cloud distributed inference.
    For example, the company’s anomaly management agentic AI model includes real-time detection and resolution of network anomalies and service performance optimization. This increases business agility and improves operational efficiencies by up to 40% by eliminating human-intensive toil, overhead and cross-departmental silos.
    Prodapt has introduced an autonomous operations workflow for networks, powered by NVIDIA AI Enterprise, that offers agentic AI capabilities to support autonomous telecom networks. AI agents can autonomously monitor networks, detect anomalies in real time, initiate diagnostics, analyze root causes of issues using historical data and correlation techniques, automatically execute corrective actions, and generate, enrich and assign incident tickets through integrated ticketing systems.
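    The anomaly-detection step in such a monitoring loop can be illustrated with a simple z-score check over a metric stream. This is a toy stand-in with made-up latency samples; the platform's actual detection models are not described in the article.

```python
from statistics import mean, stdev

def detect_anomalies(samples, threshold=2.0):
    """Flag indices of samples more than `threshold` standard deviations
    from the mean -- a toy stand-in for real-time anomaly detection."""
    mu, sigma = mean(samples), stdev(samples)
    return [i for i, x in enumerate(samples) if abs(x - mu) > threshold * sigma]

# Hypothetical per-minute latency readings (ms); one obvious spike.
latency_ms = [12, 11, 13, 12, 11, 95, 12, 13]
print(detect_anomalies(latency_ms))
```

In a production loop, a flagged index would trigger the next stages the text describes: diagnostics, root-cause analysis against historical data, corrective action and automatic ticket creation.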
    Accenture announced its new portfolio of agentic AI solutions for telecommunications through its AI Refinery platform, built on NVIDIA AI Enterprise software and accelerated computing.
    The first available solution, the NOC Agentic App, boosts network operations center tasks by using a generative AI-driven, nonlinear agentic framework to automate processes such as incident and fault management, root cause analysis and configuration planning. Using the Llama 3.1 70B NVIDIA NIM microservice and the AI Refinery Distiller Framework, the NOC Agentic App orchestrates networks of intelligent agents for faster, more efficient decision-making.
    Infosys is announcing its agentic autonomous operations platform, Infosys Smart Network Assurance (ISNA), designed to accelerate telecom operators’ journeys toward fully autonomous network operations.
    ISNA helps address long-standing operational challenges for telcos — such as limited automation and high average time to repair — with an integrated, AI-driven platform that reduces operational costs by up to 40% and shortens fault resolution times by up to 30%. NVIDIA NIM and NeMo microservices enhance the platform’s reasoning and hallucination-detection capabilities, reduce latency and increase accuracy.
    Get started with the new blueprint today.
    Learn more about the latest AI advancements for telecom and other industries at NVIDIA GTC Paris, running through Thursday, June 12, at VivaTech, including a keynote from NVIDIA founder and CEO Jensen Huang and a special address from Ronnie Vasishta, senior vice president of telecom at NVIDIA. Plus, hear from industry leaders in a panel session with Orange, Swisscom, Telenor and NVIDIA.
  • Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid

    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand.
    Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation.
    At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics.
    Future use cases for AEON include:

    Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS).
    Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings.
    Part inspection, which includes checking parts for defects or ensuring adherence to specifications.
    Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners.
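    The part-inspection use case above boils down to comparing measured dimensions against a spec sheet's nominal values and tolerances. A minimal sketch of that check (the spec format, dimension names and tolerance values are hypothetical, not Hexagon's):

```python
def inspect_part(measured, spec):
    """Compare measured dimensions (mm) against nominal +/- tolerance in spec."""
    defects = []
    for dim, (nominal, tol) in spec.items():
        value = measured.get(dim)
        if value is None or abs(value - nominal) > tol:
            defects.append(dim)
    return {"pass": not defects, "defects": defects}

spec = {"length": (120.0, 0.05), "bore": (10.0, 0.02)}   # hypothetical spec sheet
result = inspect_part({"length": 120.03, "bore": 10.04}, spec)
print(result)   # bore deviates by 0.04 mm, beyond its 0.02 mm tolerance
```

    A real inspection pipeline would take the measured values from the robot's high-precision scanner rather than a hand-written dictionary, but the pass/fail decision has this shape.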

    “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.”

    Using NVIDIA’s Three Computers to Develop AEON 
    To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models.
    Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations.
    AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning.
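    Isaac Lab's actual training stack runs GPU-parallel physics with modern policy-gradient methods, but the refine-by-reinforcement idea it applies can be shown with a self-contained toy: tabular Q-learning on a one-dimensional corridor, where the agent learns by trial and error to always step toward the goal. Everything here (environment, hyperparameters) is illustrative and unrelated to the Isaac Lab API:

```python
import random

def train_corridor_policy(length=5, episodes=500, seed=0):
    """Tabular Q-learning on a 1-D corridor: reach the rightmost cell for reward 1."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(length)]          # actions: 0 = left, 1 = right
    alpha, gamma, eps = 0.5, 0.9, 0.2                # learning rate, discount, exploration
    for _ in range(episodes):
        s = 0
        while s < length - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else q[s].index(max(q[s]))
            s2 = max(0, min(length - 1, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == length - 1 else 0.0
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return [row.index(max(row)) for row in q[:-1]]   # greedy action per state

print(train_corridor_policy())  # expected: [1, 1, 1, 1] — always step right
```

    The same loop structure — act, observe reward, update the policy — is what scales up, with simulated physics supplying the observations and rewards, to locomotion and manipulation skills.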


    This simulation-first approach enabled Hexagon to fast-track its robotics development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment.
    In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation.
    “Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.”
    Data Comes to Life Through Reality Capture and Omniverse Integration 
    AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas.

    Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure.
    “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.”
    AEON’s Next Steps
    By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON.
    This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data.
    Watch the Hexagon LIVE keynote, explore presentations and read more about AEON.
    All imagery courtesy of Hexagon.
    #hexagon #taps #nvidia #robotics #software
    Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid
    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand. Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation. At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics. Future use cases for AEON include: Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Realityplatform powering Hexagon Reality Cloud Studio. Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings. Part inspection, which includes checking parts for defects or ensuring adherence to specifications. Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners. “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. 
“Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.” Using NVIDIA’s Three Computers to Develop AEON  To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models. Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations. AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning. This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment. In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation. 
“Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.” Data Comes to Life Through Reality Capture and Omniverse Integration  AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas. Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure. “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.” AEON’s Next Steps By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON. This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data. Watch the Hexagon LIVE keynote, explore presentations and read more about AEON. All imagery courtesy of Hexagon. #hexagon #taps #nvidia #robotics #software
    BLOGS.NVIDIA.COM
    Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid
    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand. Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation. At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics. Future use cases for AEON include: Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS). Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings. Part inspection, which includes checking parts for defects or ensuring adherence to specifications. Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners. “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. 
“Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.” Using NVIDIA’s Three Computers to Develop AEON  To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models. Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations. AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning. https://blogs.nvidia.com/wp-content/uploads/2025/06/Copy-of-robotics-hxgn-live-blog-1920x1080-1.mp4 This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment. In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation. 
“Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.” Data Comes to Life Through Reality Capture and Omniverse Integration  AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas. Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure. “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.” AEON’s Next Steps By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON. This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data. Watch the Hexagon LIVE keynote, explore presentations and read more about AEON. All imagery courtesy of Hexagon.
  • Step Inside the Vault: The ‘Borderlands’ Series Arrives on GeForce NOW

    GeForce NOW is throwing open the vault doors to welcome the legendary Borderlands series to the cloud.
    Whether you’re a seasoned Vault Hunter or new to the mayhem of Pandora, prepare to experience the high-octane action and humor that define the series, which includes Borderlands Game of the Year Enhanced, Borderlands 2, Borderlands 3 and Borderlands: The Pre-Sequel.
    Members can explore it all before the highly anticipated Borderlands 4 arrives in the cloud at launch.
    In addition, leap into the flames and save the day in the pulse-pounding FBC: Firebreak from Remedy Entertainment on GeForce NOW.
    It’s all part of the 13 new games in the cloud this week, including the latest Genshin Impact update and advanced access for REMATCH.
    Plus, GeForce NOW’s Summer Sale is still in full swing. For a limited time, get 40% off a six-month GeForce NOW Performance membership — perfect for diving into role-playing game favorites like the Borderlands series or any of the 2,200 titles in the platform’s cloud gaming library.
    Vault Hunters Assemble
    Gear up for a world where loot is king and chaos is always just a trigger pull away. The Borderlands series is known for its wild humor, outrageous characters and nonstop action — and now, its chaotic adventures can be streamed on GeForce NOW.
    Welcome to Pandora.
    Members revisiting the classics or jumping in for the first time can start with Borderlands Game of the Year Enhanced, the original mayhem-fueled classic now polished and packed with downloadable content. The title brings Pandora to life with a fresh coat of paint, crazy loot and the same iconic humor that started it all.
    New worlds, same chaos.
    In Borderlands 2, Handsome Jack steals the show with his mix of charm and villainy. This sequel cranks up the fun and insanity with unforgettable characters and a zany storyline. For more laughs and even wilder chaos, Borderlands 3 delivers the biggest loot explosion yet, with new worlds to explore. Face off against the Calypso twins and enjoy nonstop action.
    The rise of Handsome Jack.
    The adventure blasts off with Borderlands: The Pre-Sequel, revealing how Handsome Jack became so handsome. The game throws in zero gravity, moon boots and enough sarcasm to fuel a spaceship.
    Jump in with GeForce NOW and get ready to laugh, loot and blast through Pandora, all from the cloud. With instant access and seamless streaming at up to 4K resolution with an Ultimate membership, enter the chaos of Borderlands anytime, anywhere. No downloads, no waiting.
    Suit Up, Clean Up
    The Oldest House needs you.
    Step into the shoes of the Federal Bureau of Control’s elite first responders in the highly anticipated three-player co-op first-person shooter FBC: Firebreak. Taking place six years after Control, the game is set in the Oldest House — under siege by reality-warping threats. It’s up to players to restore order before chaos wins.
    Equip unique Crisis Kits packed with weapons, specialized tools and paranatural augments, like a garden gnome that summons a thunderstorm or a piggy bank that spews coins. As each mission, or “Job,” drops players into unpredictable environments with shifting objectives, bizarre crises and wacky enemies, teamwork and quick thinking are key.
    Jump into the fray with friends and stream it on GeForce NOW instantly across devices. Experience the mind-bending action and stunning visuals powered by cloud streaming. Contain the chaos, save the Oldest House and enjoy a new kind of co-op adventure, all from the cloud.
    No Rules Included
    Score big laughs in the cloud.
    REMATCH gives soccer a bold twist, transforming the classic sport into a fast-paced, third-person action experience where every player controls a single athlete on the field.
    With no fouls, offsides or breaks, matches are nonstop and skills-based, demanding quick reflexes and seamless teamwork. Dynamic role-switching lets players jump between attack, defense and goalkeeping, while seasonal updates and various multiplayer modes keep the competition fresh and the action intense.
    Where arcade flair meets tactical depth, REMATCH is football, unleashed. Get instant access to the soccer pitch by streaming the title on GeForce NOW and jump into the action wherever the match calls.
    Time To Game
    Skirk has arrived.
    Genshin Impact’s next major update launches this week, and members can stream the latest adventures from Teyvat at GeForce quality on any device. Version 5.7 includes the new playable characters Skirk and Dahlia — as well as fresh story quests and the launch of a Stygian Onslaught combat mode.
    Look for the following games available to stream in the cloud this week:

    REMATCH (New release on Steam, Xbox, available on PC Game Pass, June 16)
    Broken Arrow (New release on Steam, June 19)
    Crime Simulator (New release on Steam, June 17)
    Date Everything! (New release on Steam, June 17)
    FBC: Firebreak (New release on Steam, Xbox, available on PC Game Pass, June 17)
    Lost in Random: The Eternal Die (New release on Steam, Xbox, available on PC Game Pass, June 17)
    Architect Life: A House Design Simulator (New release on Steam, June 19)
    Borderlands Game of the Year Enhanced (Steam)
    Borderlands 2 (Steam, Epic Games Store)
    Borderlands 3 (Steam, Epic Games Store)
    Borderlands: The Pre-Sequel (Steam, Epic Games Store)
    METAL EDEN Demo (Steam)
    Torque Drift 2 (Epic Games Store)

    What are you planning to play this weekend? Let us know on X or in the comments below.

    What's a gaming achievement you'll never forget?
    — NVIDIA GeForce NOW (@NVIDIAGFN) June 18, 2025
  • Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30

    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you.


    Monster Hunter Wilds – Free Title Update 2

    In addition to what’s featured in the trailer, Free Title Update 2 will also bring improvements and adjustments to various aspects of the game. Be sure to check the official Monster Hunter Wilds website for an upcoming Director’s Letter from Game Director Yuya Tokuda, which takes a deeper dive into what’s coming beyond the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last


    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available to hunters at Hunter Rank 31 or above who have cleared the “A World Turned Upside Down” main mission and the “Forest Doshaguma” side mission.

    While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities!

    Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.

    ● The Flying Wyvern, Seregios, strikes


    Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available to hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission.

    New hunter and Palico armor forged from Seregios materials awaits you!

    For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

    Hunt in style with layered weapons

    With Free Title Update 2, hunters will be able to use Layered Weapons, which let you use the look of one weapon while keeping the stats and abilities of another.

    To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon.

    For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

    New optional features

    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

    A new Support Hunter joins the fray

    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

    Photo Mode Improvements

    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

    Celebrate summer with the Festival of Accord: Flamefete seasonal event


    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

    Arch-Tempered Uth Duna slams down starting July 30

    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat this challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re HR 50 or above before taking on this quest.

    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

    A special collaboration with Fender

    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®.

    In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

    A new round of cosmetic DLC arrives


    Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases. 

    Free Title Update roadmap

    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

    A Monster Hunter Wilds background is added to the PS5 Welcome hub

    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.


    How to change the background: Welcome hub -> Change background -> Games

    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30


    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try.

    Happy Hunting!
    #monster #hunter #wilds #second #free
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you. Play Video Monster Hunter Wilds – Free Title Update 2 In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features. ● The Leviathan, Lagiacrus, emerges at last View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above, and after clearing the “A World Turned Upside Down” main mission, and the “Forest Doshaguma” side mission. While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities! Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well. 
● The Flying Wyvern, Seregios, strikes View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above that have cleared the “A World Turned Upside Down” main mission. New hunter and Palico armor forged from Seregios materials awaits you! For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters! Hunt in style with layered weapons With Free Title Update 2, hunters will be able to use Layered Weapons, which lets you use the look of any weapon, while keeping the stats and abilities of another. To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon. For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon. New optional features Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! 
    BLOG.PLAYSTATION.COM
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer, Monster Hunter Wilds – Free Title Update 2, for a look at what awaits you. In addition to what’s featured in the trailer, Free Title Update 2 will also bring improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda, coming soon, for a deeper dive into what’s coming beyond the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last

    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus become available for hunters at Hunter Rank (HR) 31 or above who have cleared the “A World Turned Upside Down” main mission and the “Forest Doshaguma” side mission. While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities! Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.
    ● The Flying Wyvern, Seregios, strikes

    Shining golden bright, the flying wyvern Seregios swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission. New hunter and Palico armor forged from Seregios materials awaits you! For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

    Hunt in style with layered weapons

    With Free Title Update 2, hunters will be able to use Layered Weapons, which let you use the look of any weapon while keeping the stats and abilities of another. To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon. For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

    New optional features

    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2!
    You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

    A new Support Hunter joins the fray

    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

    Photo Mode Improvements

    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

    Celebrate summer with the Festival of Accord: Flamefete seasonal event

    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

    Arch-Tempered Uth Duna slams down starting July 30

    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re at least HR 50 to take on this quest.
    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

    A special collaboration with Fender

    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®. In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

    A new round of cosmetic DLC arrives

    Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more, will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases.

    Free Title Update roadmap

    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

    A Monster Hunter Wilds background is added to the PS5 Welcome hub

    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub.
    Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood. How to change the background: Welcome hub -> Change background -> Games

    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30

    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try. Happy Hunting!
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than building its world from photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon, one that is finally adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments inhabited by the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
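    The “clean and optimize geometry” step Finlayson describes can be illustrated with a minimal sketch: welding duplicate vertices and reindexing faces, one of the basic cleanups done to ingested models before they are built out in Unreal Engine. This is a toy pure-Python version for illustration only, not the production’s actual tooling (DCC packages like Blender and Maya do this natively):

    ```python
    def weld_vertices(verts, faces, tol=1e-4):
        """Merge vertices closer than tol and reindex faces accordingly.

        verts: list of (x, y, z) tuples; faces: list of vertex-index tuples.
        Returns (welded_verts, reindexed_faces).
        """
        seen = {}     # quantized position -> new index
        welded = []   # unique vertex list
        remap = []    # old index -> new index
        for v in verts:
            key = tuple(round(c / tol) for c in v)  # quantize for comparison
            if key not in seen:
                seen[key] = len(welded)
                welded.append(v)
            remap.append(seen[key])
        new_faces = [tuple(remap[i] for i in f) for f in faces]
        return welded, new_faces

    # Two triangles sharing an edge, but stored with duplicated corner vertices
    # (a common artifact of exported CAD/Rhino geometry).
    verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
    faces = [(0, 1, 2), (3, 4, 5)]
    v2, f2 = weld_vertices(verts, faces)  # 6 verts collapse to 4; faces now share indices
    ```

    The same idea, applied at scale by a remesh or merge-by-distance tool, is what keeps a scene light enough for real-time review.
    
    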

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
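    The “clean naming conventions” Finlayson credits for fast iteration lend themselves to simple automated checks. The rule below is entirely hypothetical (the production’s real conventions are not public); it only sketches the kind of hygiene script that keeps a modular scene from silently accumulating stray assets:

    ```python
    import re

    # Hypothetical VAD asset-naming rule: ENV_<Location>_<Asset>_<two-digit variant>,
    # e.g. "ENV_MidportVillage_Archway_01". Illustrative only.
    NAME_RULE = re.compile(r"^ENV_[A-Za-z]+_[A-Za-z]+_\d{2}$")

    def check_names(names):
        """Return the asset names that violate the convention."""
        return [n for n in names if not NAME_RULE.match(n)]

    bad = check_names(["ENV_MidportVillage_Archway_01", "archway-final-FINAL2"])
    # Only the ad-hoc name is flagged.
    ```

    Run as a pre-review pass, a check like this makes it safe to swap assets or relink components live in a session, because every reference follows a predictable pattern.
    
    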

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
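    The handover Bell describes, recording camera movement on set and passing the tracks to post, can be sketched as a minimal data-logging loop. Everything here is illustrative: the field names are invented, and real Simulcam pipelines exchange industry formats such as FBX or USD rather than ad-hoc JSON:

    ```python
    import json

    # Toy stand-in for per-frame camera data a tracking system records on set:
    # timestamp, position and rotation, serialized for handover to the VFX vendor.
    def record_frame(track, t, pos, rot_euler):
        track.append({"t": t, "pos": pos, "rot": rot_euler})

    track = []
    for frame in range(3):
        record_frame(track,
                     frame / 24.0,             # 24 fps timecode
                     [frame * 0.1, 1.6, 0.0],  # camera position (m), dollying in x
                     [0.0, 90.0, 0.0])         # rotation as Euler angles (deg)

    blob = json.dumps(track)  # the recorded path handed over to post-production
    ```

    Because the same data drives the live composite and the handover file, post receives exactly the moves the director approved on the day, which is the streamlining the article refers to.
    
    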

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuck and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
By TREVOR HOGG

Images courtesy of Warner Bros. Pictures.

Rather than a world constructed around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that has finally been adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

“[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
—Talia Finlayson, Creative Technologist, Disguise

Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

“Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”

The project provided new opportunities.
“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

A virtual exploration of Steve’s shop in Midport Village.

Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
—Laura Bell, Creative Technologist, Disguise

Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.”

At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

Flexibility was critical.
“A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.”

Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

“We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
—Talia Finlayson, Creative Technologist, Disguise

The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks.

“I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

Virtually conceptualizing the layout of Midport Village.

Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.”

Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted.
There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

An example of the virtual and final version of the Woodland Mansion.

“Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
—Laura Bell, Creative Technologist, Disguise

Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

Doing a virtual scale study of the Mountainside.
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

Piglots cause mayhem during the Wingsuit Chase.

Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

“One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.”

There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
  • Dune Awakening is one of those games that you might find interesting if you’re into survival in vast, open worlds. But honestly, it can also feel a bit tedious. The article "Tout savoir sur les classes (comment les débloquer, quelle est la meilleure…)" on ActuGaming.net dives into the various classes you can unlock in the game. Sure, it’s nice to know how to unlock them and which one might be considered the best.

    But let’s be real. It’s just another routine of figuring things out in a game that can sometimes feel like it drags on. You’ll read about the different classes, and maybe you’ll feel a spark of interest, but then again, you might just end up scrolling through your phone instead. Unlocking classes isn’t exactly thrilling, and deciding which one is the best feels like trying to choose a favorite brand of water.

    The article gives you the basics on how to unlock classes. Apparently, you have to do certain tasks or maybe complete some boring missions. The best class? Well, that’s subjective. It’s all about what you prefer, but it feels more like a chore than a choice.

    So, if you’re curious about the classes and how to unlock them, the article is there. But let’s not pretend it’s going to change your life. It’s just another piece of information in the endless sea of gaming content. If you have some time to kill, sure, give it a look. Otherwise, you could probably find something more entertaining to do.

    Anyway, if you’re into this sort of thing, you can check it out on ActuGaming.net. Just don’t expect fireworks or anything.

    #DuneAwakening #GamingClasses #OpenWorldGames #SurvivalGames #ActuGaming

  • ## Introduction

    Brand mascots have been part of marketing strategies for decades, serving as recognizable figures that embody the essence of companies and their products. From the iconic Bibendum, also known as the Michelin Man, to the energetic Benny the Bull, mascots tell stories and create connections. This article explores the journey of various brand mascots, highlighting those that have left an indelible mark, as well as those that have faded into obscurity.

## The Evolution of Brand Mascots
  • In a world where creativity reigns supreme, Adobe has just gifted us with a shiny new toy: the Firefly Boards. Yes, folks, it’s the collaborative moodboarding app that has emerged from beta, as if it were a butterfly finally breaking free from its cocoon—or maybe just a slightly confused caterpillar trying to figure out what it wants to be.

    Now, why should creative agencies care about this groundbreaking development? Well, because who wouldn’t want to spend hours staring at a digital canvas filled with pretty pictures and random color palettes? Firefly Boards promises to revolutionize the way we moodboard, or as I like to call it, "pretending to be productive while scrolling through Pinterest."

    Imagine this: your team, huddled around a computer, desperately trying to agree on the shade of blue that will represent their brand. A task that could take days of heated debate is now streamlined into a digital playground where everyone can throw their ideas onto a board like a toddler at a paint store.

    But let's be real. Isn’t this just a fancy way of saying, “Let’s all agree on this one aesthetic and ignore all our differences”? Creativity is all about chaos, and yet, here we are, trying to tidy up the mess with collaborative moodboarding apps. What’s next? A group hug to decide on the font size?

    Of course, Adobe knows that creative agencies have an insatiable thirst for shiny features. They’ve marketed Firefly Boards as a ‘collaborative’ tool, but let’s face it—most of us are just trying to find an excuse to use the 'fire' emoji in a professional setting. It’s as if they’re saying, “Trust us, this will make your life easier!” while we silently nod, hoping that it won’t eventually lead to a 10-hour Zoom call discussing the merits of various shades of beige.

    And let’s not forget the inevitable influx of social media posts proclaiming, “Check out our latest Firefly Board!” — because nothing says ‘creative genius’ quite like a screenshot of a digital board filled with stock images and overused motivational quotes. Can’t wait to see how many ‘likes’ that garners!

    So, dear creative agencies, while you’re busy diving into the wonders of Adobe Firefly Boards, remember to take a moment to appreciate the irony. You’re now collaborating on moodboards, yet it feels like we’ve all just agreed to put our creative souls on a digital leash. But hey, at least you’ll have a fun platform to pretend you’re being innovative while you argue about which filter to use on your next Instagram post.

    #AdobeFirefly #Moodboarding #CreativeAgencies #DigitalCreativity #DesignHumor
  • Hey, amazing community!

    Are you ready to embark on an incredible adventure in the mesmerizing world of Arrakis? Dune Awakening isn’t just a game; it’s a journey that will test your skills, ignite your passion, and connect you with fellow explorers who share the same burning desire for survival and discovery!

    Now, let’s talk about something crucial for your survival on this beautiful yet harsh planet—finding cobalt and carbon! These resources are essential for crafting and upgrading your gear, and they will give you the edge you need to thrive in the vast deserts of Arrakis. Imagine the thrill of uncovering these precious materials while navigating through the stunning landscapes and dodging the dangers lurking beneath the sands!

    But don’t worry, I’ve got you covered! To find cobalt and carbon, you’ll need to explore various biomes and engage with the environment. Keep your eyes peeled for specific locations where these resources are more abundant. Use your tools wisely, and remember, teamwork is key! Collaborate with friends and fellow players to maximize your resource-gathering efforts.

    As you delve deeper into the game, remember to embrace the spirit of adventure! Every challenge you face is an opportunity for growth. Whether you’re learning to navigate the treacherous dunes or mastering the art of survival, each step brings you closer to becoming a true warrior of Arrakis!

    Let’s not forget the beauty of the friendships you’ll forge along the way! The bonds created in the heat of battle and shared victories will last far beyond the game. Celebrate each achievement, no matter how small, and encourage one another to push through the tough times. Together, we can create a thriving community that uplifts and inspires!

    So grab your gear, rally your friends, and get ready to dive into the thrilling world of Dune Awakening! Your adventure is just beginning, and I can’t wait to hear all about your discoveries and triumphs! Let’s make magic happen on Arrakis!

    Remember, every great explorer started with a single step. Take yours today!

    #DuneAwakening #ArrakisAdventure #SurvivalGaming #CobaltAndCarbon #TogetherWeThrive