• All the Stars, All the Time

    Some of the largest objects in the night sky to view through a telescope are galaxies and supernova remnants, often many times larger in size than the moon but generally …
    HACKADAY.COM
  • European Robot Makers Adopt NVIDIA Isaac, Omniverse and Halos to Develop Safe, Physical AI-Driven Robot Fleets

    In the face of growing labor shortages and need for sustainability, European manufacturers are racing to reinvent their processes to become software-defined and AI-driven.
    To achieve this, robot developers and industrial digitalization solution providers are working with NVIDIA to build safe, AI-driven robots and industrial technologies to drive modern, sustainable manufacturing.
    At NVIDIA GTC Paris at VivaTech, Europe’s leading robotics companies including Agile Robots, Extend Robotics, Humanoid, idealworks, Neura Robotics, SICK, Universal Robots, Vorwerk and Wandelbots are showcasing their latest AI-driven robots and automation breakthroughs, all accelerated by NVIDIA technologies. In addition, NVIDIA is releasing new models and tools to support the entire robotics ecosystem.
    NVIDIA Releases Tools for Accelerating Robot Development and Safety
    NVIDIA Isaac GR00T N1.5, an open foundation model for humanoid robot reasoning and skills, is now available for download on Hugging Face. This update enhances the model’s adaptability and ability to follow instructions, significantly improving its performance in material handling and manufacturing tasks. The NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 open-source robotics simulation and learning frameworks, optimized for NVIDIA RTX PRO 6000 workstations, are available on GitHub for developer preview.
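    For developers who want to try the open checkpoint, a minimal Python sketch of pulling model files from Hugging Face with the huggingface_hub library is shown below; the repository id used here is an assumption for illustration only, so confirm the exact name on NVIDIA’s Hugging Face page before running it.

    from huggingface_hub import snapshot_download

    # Hypothetical repo id; verify the actual GR00T N1.5 repository name on Hugging Face.
    local_dir = snapshot_download(
        repo_id="nvidia/GR00T-N1.5-3B",  # assumed name, not confirmed by this post
        local_dir="./gr00t-n1.5",
    )
    print(f"Model files downloaded to {local_dir}")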
    In addition, NVIDIA announced that NVIDIA Halos — a full-stack, comprehensive safety system that unifies hardware architecture, AI models, software, tools and services — now expands to robotics, promoting safety across the entire development lifecycle of AI-driven robots.
    The NVIDIA Halos AI Systems Inspection Lab has earned accreditation from the ANSI National Accreditation Board (ANAB) to perform inspections across functional safety for robotics, in addition to automotive vehicles.
    “NVIDIA’s latest evaluation with ANAB verifies the demonstration of competence and compliance with internationally recognized standards, helping ensure that developers of autonomous machines — from automotive to robotics — can meet the highest benchmarks for functional safety,” said R. Douglas Leonard Jr., executive director of ANAB.
    Arcbest, Advantech, Bluewhite, Boston Dynamics, FORT, Inxpect, KION, NexCobot — a NEXCOM company, and Synapticon are among the first robotics companies to join the Halos Inspection Lab, ensuring their products meet NVIDIA safety and cybersecurity requirements.
    To support robotics leaders in strengthening safety across the entire development lifecycle of AI-driven robots, Halos will now provide:

    Safety extension packages for the NVIDIA IGX platform, enabling manufacturers to easily program safety functions into their robots, supported by TÜV Rheinland’s inspection of NVIDIA IGX.
    A robotic safety platform, which includes IGX and NVIDIA Holoscan Sensor Bridge for a unified approach to designing sensor-to-compute architecture with built-in AI safety.
    An outside-in safety AI inspector — an AI-powered agent for monitoring robot operations, helping improve worker safety.

    Europe’s Robotics Ecosystem Builds on NVIDIA’s Three Computers
    Europe’s leading robotics developers and solution providers are integrating the NVIDIA Isaac robotics platform to train, simulate and deploy robots across different embodiments.
    Agile Robots is post-training the GR00T N1 model in Isaac Lab to train its dual-arm manipulator robots, which run on NVIDIA Jetson hardware, to execute a variety of tasks in industrial environments.
    Meanwhile, idealworks has adopted the Mega NVIDIA Omniverse Blueprint for robotic fleet simulation to extend the blueprint’s capabilities to humanoids. Building on the VDA 5050 framework, idealworks contributes to the development of guidance that supports tasks uniquely enabled by humanoid robots, such as picking, moving and placing objects.
    Neura Robotics is integrating NVIDIA Isaac to further enhance its robot development workflows. The company is using GR00T-Mimic to post-train the Isaac GR00T N1 robot foundation model for its service robot MiPA. Neura is also collaborating with SAP and NVIDIA to integrate SAP’s Joule agents with its robots, using the Mega NVIDIA Omniverse Blueprint to simulate and refine robot behavior in complex, realistic operational scenarios before deployment.
    Vorwerk is using NVIDIA technologies to power its AI-driven collaborative robots. The company is post-training GR00T N1 models in Isaac Lab with its custom synthetic data pipeline, which is built on Isaac GR00T-Mimic and powered by the NVIDIA Omniverse platform. The enhanced models are then deployed on NVIDIA Jetson AGX, Jetson Orin or Jetson Thor modules for advanced, real-time home robotics.
    Humanoid is using NVIDIA’s full robotics stack, including Isaac Sim and Isaac Lab, to cut its prototyping time down by six weeks. The company is training its vision language action models on NVIDIA DGX B200 systems to boost the cognitive abilities of its robots, allowing them to operate autonomously in complex environments using Jetson Thor onboard computing.
    Universal Robots is introducing UR15, its fastest collaborative robot yet, to the European market. Using UR’s AI Accelerator — developed on NVIDIA Isaac’s CUDA-accelerated libraries and AI models, as well as NVIDIA Jetson AGX Orin — manufacturers can build AI applications to embed intelligence into the company’s new cobots.
    Wandelbots is showcasing its NOVA Operating System, now integrated with Omniverse, to simulate, validate and optimize robotic behaviors virtually before deploying them to physical robots. Wandelbots also announced a collaboration with EY and EDAG to offer manufacturers a scalable automation platform on Omniverse that speeds up the transition from proof of concept to full-scale deployment.
    Extend Robotics is using the Isaac GR00T platform to enable customers to control and train robots for industrial tasks like visual inspection and handling radioactive materials. The company’s Advanced Mechanics Assistance System lets users collect demonstration data and generate diverse synthetic datasets with NVIDIA GR00T-Mimic and GR00T-Gen to train the GR00T N1 foundation model.
    SICK is enhancing its autonomous perception solutions by integrating new certified sensor models — as well as 2D and 3D lidars, safety scanners and cameras — into NVIDIA Isaac Sim. This enables engineers to virtually design, test and validate machines using SICK’s sensing models within Omniverse, supporting processes spanning product development to large-scale robotic fleet management.
    Toyota Material Handling Europe is working with SoftServe to simulate its autonomous mobile robots working alongside human workers, using the Mega NVIDIA Omniverse Blueprint. Toyota Material Handling Europe is testing and simulating a multitude of traffic scenarios — allowing the company to refine its AI algorithms before real-world deployment.
    NVIDIA’s partner ecosystem is enabling European industries to tap into intelligent, AI-powered robotics. By harnessing advanced simulation, digital twins and generative AI, manufacturers are rapidly developing and deploying safe, adaptable robot fleets that address labor shortages, boost sustainability and drive operational efficiency.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
    See notice regarding software product information.
    BLOGS.NVIDIA.COM
  • NVIDIA and Partners Highlight Next-Generation Robotics, Automation and AI Technologies at Automatica

    From the heart of Germany’s automotive sector to manufacturing hubs across France and Italy, Europe is embracing industrial AI and advanced AI-powered robotics to address labor shortages, boost productivity and fuel sustainable economic growth.
    Robotics companies are developing humanoid robots and collaborative systems that integrate AI into real-world manufacturing applications. Supported by a $200 billion investment initiative and coordinated efforts from the European Commission, Europe is positioning itself at the forefront of the next wave of industrial automation, powered by AI.
    This momentum is on full display at Automatica — Europe’s premier conference on advancements in robotics, machine vision and intelligent manufacturing — taking place this week in Munich, Germany.
    NVIDIA and its ecosystem of partners and customers are showcasing next-generation robots, automation and AI technologies designed to accelerate the continent’s leadership in smart manufacturing and logistics.
    NVIDIA Technologies Boost Robotics Development 
    Central to advancing robotics development is Europe’s first industrial AI cloud, announced at NVIDIA GTC Paris at VivaTech earlier this month. The Germany-based AI factory, featuring 10,000 NVIDIA GPUs, provides European manufacturers with secure, sovereign and centralized AI infrastructure for industrial workloads. It will support applications ranging from design and engineering to factory digital twins and robotics.
    To help accelerate humanoid development, NVIDIA released NVIDIA Isaac GR00T N1.5 — an open foundation model for humanoid robot reasoning and skills. This update enhances the model’s adaptability and ability to follow instructions, significantly improving its performance in material handling and manufacturing tasks.
    To help post-train GR00T N1.5, NVIDIA has also released the Isaac GR00T-Dreams blueprint — a reference workflow for generating vast amounts of synthetic trajectory data from a small number of human demonstrations — enabling robots to generalize across behaviors and adapt to new environments with minimal human demonstration data.
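    The blueprint’s internals are not described in this post, but the general pattern of turning a handful of demonstrations into a much larger synthetic training set can be sketched generically. The NumPy example below is a simple noise-based augmentation, invented purely for illustration; it is not the GR00T-Dreams workflow, and the function and parameter names are hypothetical.

    import numpy as np

    def augment_demonstrations(demos, n_samples=100, noise_std=0.01, seed=0):
        # demos: list of arrays of shape (timesteps, dof) holding recorded trajectories.
        # Returns n_samples lightly perturbed copies sampled from the originals.
        rng = np.random.default_rng(seed)
        synthetic = []
        for _ in range(n_samples):
            base = demos[rng.integers(len(demos))]
            # Small Gaussian perturbation so each synthetic trajectory differs from its source.
            synthetic.append(base + rng.normal(0.0, noise_std, size=base.shape))
        return synthetic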
    In addition, early developer previews of NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 — open-source robot simulation and learning frameworks optimized for NVIDIA RTX PRO 6000 workstations — are now available on GitHub.
    Image courtesy of Wandelbots.
    Robotics Leaders Tap NVIDIA Simulation Technology to Develop and Deploy Humanoids and More 
    Robotics developers and solutions providers across the globe are integrating NVIDIA’s three computers to train, simulate and deploy robots.
    NEURA Robotics, a German robotics company and pioneer for cognitive robots, unveiled the third generation of its humanoid, 4NE1, designed to assist humans in domestic and professional environments through advanced cognitive capabilities and humanlike interaction. 4NE1 is powered by GR00T N1 and was trained in Isaac Sim and Isaac Lab before real-world deployment.
    NEURA Robotics is also presenting Neuraverse, a digital twin and interconnected ecosystem for robot training, skills and applications, fully compatible with NVIDIA Omniverse technologies.
    Delta Electronics, a global leader in power management and smart green solutions, is debuting two next-generation collaborative robots: D-Bot Mar and D-Bot 2 in 1 — both trained using Omniverse and Isaac Sim technologies and libraries. These cobots are engineered to transform intralogistics and optimize production flows.
    Wandelbots, the creator of the Wandelbots NOVA software platform for industrial robotics, is partnering with SoftServe, a global IT consulting and digital services provider, to scale simulation-first automation using NVIDIA Isaac Sim, enabling virtual validation and real-world deployment with maximum impact.
    Cyngn, a pioneer in autonomous mobile robotics, is integrating its DriveMod technology into Isaac Sim to enable large-scale, high-fidelity virtual testing of advanced autonomous operation. Purpose-built for industrial applications, DriveMod is already deployed on vehicles such as the Motrec MT-160 Tugger and BYD Forklift, delivering sophisticated automation to material handling operations.
    Doosan Robotics, a company specializing in AI robotic solutions, will showcase its “sim to real” solution, using NVIDIA Isaac Sim and cuRobo. Doosan will be showcasing how to seamlessly transfer tasks from simulation to real robots across a wide range of applications — from manufacturing to service industries.
    Franka Robotics has integrated Isaac GR00T N1.5 into a dual-arm Franka Research 3 (FR3) robot for robotic control. The integration of GR00T N1.5 allows the system to interpret visual input, understand task context and autonomously perform complex manipulation — without the need for task-specific programming or hardcoded logic.
    Image courtesy of Franka Robotics.
    Hexagon, the global leader in measurement technologies, launched its new humanoid, dubbed AEON. With its unique locomotion system and multimodal sensor fusion, and powered by NVIDIA’s three-computer solution, AEON is engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support.
    Intrinsic, a software and AI robotics company, is integrating Intrinsic Flowstate with Omniverse and OpenUSD for advanced visualization and digital twins that can be used in many industrial use cases. The company is also using NVIDIA foundation models to enhance robot capabilities like grasp planning through AI and simulation technologies.
    SCHUNK, a global leader in gripping systems and automation technology, is showcasing its innovative grasping kit powered by the NVIDIA Jetson AGX Orin module. The kit intelligently detects objects and calculates optimal grasping points. Schunk is also demonstrating seamless simulation-to-reality transfer using IGS Virtuous software — built on Omniverse technologies — to control a real robot through simulation in a pick-and-place scenario.
    Universal Robots is showcasing UR15, its fastest cobot yet. Powered by the UR AI Accelerator — developed with NVIDIA and running on Jetson AGX Orin using CUDA-accelerated Isaac libraries — UR15 helps set a new standard for industrial automation.

    Vention, a full-stack software and hardware automation company, launched its Machine Motion AI, built on CUDA-accelerated Isaac libraries and powered by Jetson. Vention is also expanding its lineup of robotic offerings by adding the FR3 robot from Franka Robotics to its ecosystem, enhancing its solutions for academic and research applications.
    Image courtesy of Vention.
    Learn more about the latest robotics advancements by joining NVIDIA at Automatica, running through Friday, June 27. 
    BLOGS.NVIDIA.COM
  • BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” The Founding Director of the Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human, you tend to want to give it human gestures and eyebrows. Erik Kripke said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter totter. The Deep was in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it as the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”
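    As a rough illustration of the composite Fleet describes (per-frame horizontal jitter plus a light opacity mix), here is a minimal, hypothetical NumPy sketch. It is a generic approximation of the idea, not the show’s actual pipeline, and the function and parameter names are invented for this example.

    import numpy as np

    def phase_composite(background, foreground, jitter_px=6, opacity=0.6, seed=0):
        # background, foreground: float arrays of shape (frames, height, width, 3) in [0, 1].
        # Returns the background with the foreground overlaid using per-frame X-axis shake.
        rng = np.random.default_rng(seed)
        out = background.copy()
        for f in range(background.shape[0]):
            # High-frequency horizontal offset, redrawn every frame.
            dx = int(rng.integers(-jitter_px, jitter_px + 1))
            shifted = np.roll(foreground[f], dx, axis=1)
            # Light opacity mix so the jittered subject reads as partly translucent.
            out[f] = (1.0 - opacity) * out[f] + opacity * shifted
        return out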

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie hallucinates as Kimiko Miyashiro goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    The moment when Splinter (Rob Benedict) splits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
  • Watch Ben Starr Date Himself In Date Everything

    Since his breakthrough role in Final Fantasy XVI, fans can't get enough of Ben Starr. Neither can he, as in a video for GameSpot, the voice actor decided to play through Date Everything by only dating his own character. The surreal and comedic sandbox dating sim lets you literally date everything, thanks to a pair of magical glasses called Dateviators, which transform everyday household objects into dateable characters with their own stories. Each one is brought to life by a huge roster of voice actors, including Ashly Burch, Matthew Mercer, Laura Bailey, Felicia Day, Steve Blum, Ashley Johnson, as well as the game's lead designer and veteran voice actor Ray Chase of Final Fantasy XV fame. Despite more than 100 dateable objects, "resident video game narcissist" Starr has opted to only date himself. Starr actually voices multiple characters, as he plays a personified door called Dorian and there are 17 variations of Dorian throughout the house. There's Front Dorian who wears a little hat, Back Dorian, which the actor said he recorded "facing away from the microphone with my hand over my mouth", as well as Trap Dorian, who happens to wear a lot less clothes than the rest. Continue Reading at GameSpot
  • Hello, amazing friends!

    Today, I’m bursting with excitement to share something truly revolutionary that’s going to change the way we think about 3D scanning! Have you ever dreamt of capturing the beauty of the world around us in stunning detail? Well, dream no more because with the EINSTAR VEGA, this dream is now a breathtaking reality!

    In collaboration with Shining 3D, the EINSTAR VEGA is not just any 3D scanner; it's an all-in-one powerhouse that opens up a realm of possibilities for artists, enthusiasts, and studios alike! For years, many of us have explored the fascinating world of 3D digitization, but access to high-quality scanning technology has often felt distant — until now!

    Imagine effortlessly scanning objects, people, and places, all while achieving remarkable precision and detail. This device is designed with passion and creativity in mind, making it perfectly suited for both seasoned professionals and those just starting their journey into the magical world of 3D scanning.

    The EINSTAR VEGA empowers you to unleash your creativity like never before. Whether you’re an artist looking to replicate your sculptures, a designer aiming to bring your visions to life, or even a small studio wanting to elevate your projects, this scanner is an absolute game-changer!

    Let’s take a moment to appreciate how this technology makes the wonders of 3D scanning accessible to everyone. It’s not just about the tools we use—it's about the dreams we can create and the stories we can tell through our art!

    With the EINSTAR VEGA, you’re not just investing in a scanner; you’re investing in your future! Imagine the joy of sharing your 3D creations with the world, inspiring others, and pushing the boundaries of what’s possible. The sky's the limit, and I believe every one of you has the potential to soar!

    So, let’s embrace this incredible innovation together! Let’s dive into the world of 3D scanning, explore our creativity, and inspire each other to reach new heights! Remember, every great journey begins with a single step, and with the EINSTAR VEGA by your side, that first step has never been easier!

    Stay inspired, dream big, and let your creativity shine!

    #EINSTARVEGA #3DScanning #Shining3D #CreativityUnleashed #Inspiration
    EINSTAR VEGA: discover this all-in-one 3D scanner in video!
    In partnership with Shining 3D. As you know, at 3DVF we love 3D digitization, and we have spent years exploring different ways to scan objects, people and places. However, for a long time, certain…
  • ‘Balls, Dice & Stickers’ Creates Carefully Planned Mayhem

    Balls, Dice & Stickers asks you to launch a ball at some dice that trigger a ton of ridiculous effects each time you hit them.

    I am not sure what I did to upset paperclips, mice, and the manifestations of the past, but they’re all here to give me a hard time unless I beat them up with some damaging dice. I won’t be rolling those dice, though. That would be a little too straightforward in this delightfully chaotic game. Instead, I’ll be launching a ball at the dice and trying to get the ball to bounce around the room, hitting the dice as much as possible before the ball pings out the bottom of the screen.

    Except THAT is also not all there is to it. Each round, you get a sticker you can apply to one of your dice. These stickers cause wildly varied effects that often build off of the other stickers. For instance, you can add a beehive to one of the dice. This can spawn a bee, which in turn will shoot needles at certain things and will like other objects. Tape adds a banana to the playing field which can provide you points. The Pub spawns a drunk driver, and that drunk driver might get caught by the police car that you spawn from landing on another die. And these dice effects all stack on top of one another as you progress through the rounds, resulting in a bustling field of dozens of bizarre, silly effects all working in tandem with one another.
    Balls, Dice & Stickers is really something to behold after you’ve got a few rounds under your belt. Describing it really doesn’t do justice to how much fun this game is once it gets rolling, so I highly recommend trying out the alpha build on itch.io. I can’t even imagine how much sillier it’s going to get in its full release.
    Balls, Dice & Stickers is available now (in an alpha format) on itch.io. You can add the future full release of the game to your Wishlist on Steam.
  • Inside Summer Game Fest 2025: How Geoff Keighley and Producers Pulled Off Event Amid Industry Layoffs, ‘GTA 6’ Delay and Switch 2 Release

    With the ongoing job cuts across the gaming industry, the shift of “Grand Theft Auto 6” from release this fall to a launch next spring, and the distraction of the first new Nintendo console in eight years, there was a chance that Summer Game Fest 2025 wouldn’t have the same allure as the annual video game showcase has had in years past. (There was also the factor of the actors strike against video game companies, which, as of June 11, has been called off by SAG-AFTRA.)

    But the gamers came out in full force for the Geoff Keighley-hosted event on June 6, which live-streamed out of the YouTube Theater at SoFi Stadium in Los Angeles.


    “Viewership was up significantly year over year,” Keighley told Variety. “Stream charts said it doubled its audience year over year for the peak concurrency to over 3 million peak concurrent viewers, which does not include China.”

    In person, both the Summer Game Fest live showcase event and its subsequent weekend Play Days event for developers and press saw “significantly higher” media creator attendance this year: more than 600 registered attendees vs. “somewhere in the 400s” in 2024, per SGF. The boost is an indicator that both the current U.S. political climate and significant changes in 2025’s game release schedule, like the delay of “Grand Theft Auto 6” until next May, didn’t affect interest in the event.

    “Things happen in the industry all the time that are big newsworthy happenings,” said Summer Game Fest producer and iam8bit co-creator Amanda White. “Switch 2 just happened and we’re here, it’s all working out, everybody’s having a great time playing games. It’s not irrelevant — it’s just part of the way things go.”

    As big a hit as the Switch 2 was with consumers upon release — selling more than 3.5 million units during the first four days after its June 5 launch — and noted multiple times during the Summer Game Fest live showcase on June 6, Nintendo’s new console was not the star of the three-day Play Days event for developers and media in Downtown Los Angeles, which ran June 7-9.

    “I have not seen a single attendee with a Switch 2 on campus,” SGF producer and iam8bit co-creator Jon M. Gibson said with a laugh. “There’s a few Switch 2s that Nintendo supplied. Some dev kits for Bandai and for Capcom. Of course, the launch happened on Thursday, so bandwidth from Nintendo is stretched thin with all the midnight launches and stuff. But they’re really supportive and supply some for some pre-release games, which is exciting.”

    Some big video game publishers such as EA, Take-Two and Ubisoft skipped this year’s SGF, eliminating potential splashy in-show hits for eagerly anticipated games like “Grand Theft Auto 6.” But SGF still managed a few big moments, like the announcement and trailer release for “Resident Evil Requiem.” Gibson and White attribute that reveal and other moments like it to the immense trust the festival has managed to build up with video game publishers in just a few years.

    “We are very proud of our ability to keep the trust of all the publishers on campus,” Gibson said. “Six years into SGF as a whole, four years into Play Days, we’re very good. Because we have to print everything ahead of time, too. So there are lots of unannounced things that we’re very careful about who sees what. We have vendors who print and produce and manufacture physical objects under very tight wraps. We’re just very protective, because we know what it means to have to keep a secret because we’ve had our own games that we’ve had to announce, as well. Capcom is a great example with ‘Resident Evil.’ We knew that for a very long time, but they trusted us with information, and we were very careful about what our team actually knew what was going on.”

    And even though some of the gaming giants sat this year out, White says conversations were already happening on the Play Days campus about who is ready to return next year and what they’ll bring.

    “People get excited, they come and see. And each year we grow, so people see more potential,” White said.

    As for next year, the June show will take place just a few weeks after the planned May 26 release for “GTA 6.” While Switch 2 didn’t seem to distract too much, will the draw of playing the newly launched “GTA 6” prove to be so powerful it outshines whatever could be announced at SGF 2026?

    “My view is that all boats rise with ‘GTA’ launch,” Keighley said. “It is a singular cultural event that is the biggest thing in all of entertainment this decade. It will bring more people into gaming, sell lots of consoles and bring back lapsed gamers. There will never be a better time to feel the excitement and energy around gaming than SGF 2026.”

    See more from Variety’s Q&A with Keighley about Summer Game Fest 2025 below.

    How was this year’s show impacted by the date shift for “GTA 6”? How much was planned before and after that big announcement?

    So far as I know there wasn’t any material impact, but I think the date move did allow a number of teams to feel more confident announcing their launch dates.

    Halfway through the year, what do you see as some of the biggest trends in gaming for 2025, and how did you look to reflect that in the show?

    We continue to see some of the most interesting and successful games come from smaller teams outside of the traditional publisher system – games like “Clair Obscur,” “Blue Prince” and “REPO.” So we wanted to highlight some of those projects at the show like “Ill” and “Mortal Shell 2.”

    What game announcements and trailers do you think resonated most with audiences after this show? What assets were the most popular?

    “Resident Evil Requiem” was a massive moment. Also we saw a lot of love for “Ill” from a small team in Canada and Armenia.
    VARIETY.COM
    Inside Summer Game Fest 2025: How Geoff Keighley and Producers Pulled Off Event Amid Industry Layoffs, ‘GTA 6’ Delay and Switch 2 Release
    With the ongoing job cuts across the gaming industry, the shift of “Grand Theft Auto 6” from a release this fall to a launch next spring, and the distraction of the first new Nintendo console in eight years, there was a chance that Summer Game Fest 2025 wouldn’t have the same allure the annual video game showcase has had in years past. (There was also the factor of the actors strike against video game companies, which, as of June 11, has been called off by SAG-AFTRA.)
    But the gamers came out in full force for the Geoff Keighley-hosted event on June 6, which live-streamed out of the YouTube Theater at SoFi Stadium in Los Angeles.
    “Viewership was up significantly year over year,” Keighley told Variety. “Stream charts said it doubled its audience year over year for the peak concurrency to over 3 million peak concurrent viewers, which does not include China.”
    In person, both the Summer Game Fest live showcase and its subsequent weekend Play Days event for developers and press saw “significantly higher” media and creator attendance this year: more than 600 registered attendees vs. “somewhere in the 400s” in 2024, per SGF. The boost is an indicator that neither the current U.S. political climate nor significant changes in 2025’s game release schedule, like the delay of “Grand Theft Auto 6” until next May, dampened interest in the event.
    “Things happen in the industry all the time that are big newsworthy happenings,” said Summer Game Fest producer and iam8bit co-creator Amanda White. “Switch 2 just happened and we’re here, it’s all working out, everybody’s having a great time playing games. It’s not irrelevant — it’s just part of the way things go.”
    As big a hit as the Switch 2 was with consumers upon release — selling more than 3.5 million units during the first four days after its June 5 launch — and as often as it was noted during the Summer Game Fest live showcase on June 6, Nintendo’s new console was not the star of the three-day Play Days event for developers and media in Downtown Los Angeles, which ran June 7-9.
    “I have not seen a single attendee with a Switch 2 on campus,” SGF producer and iam8bit co-creator Jon M. Gibson said with a laugh. “There’s a few Switch 2s that Nintendo supplied. Some dev kits for Bandai and for Capcom. Of course, the launch happened on Thursday, so bandwidth from Nintendo is stretched thin with all the midnight launches and stuff. But they’re really supportive and supply some for some pre-release games, which is exciting.”
    Some big video game publishers such as EA, Take-Two and Ubisoft skipped this year’s SGF, eliminating potential splashy in-show hits for eagerly anticipated games like “Grand Theft Auto 6.” But SGF still managed a few big moments, like the announcement and trailer release for “Resident Evil Requiem.” Gibson and White attribute that reveal and other moments like it to the immense trust the festival has built up with video game publishers in just a few years.
    “We are very proud of our ability to keep the trust of all the publishers on campus,” Gibson said. “Six years into SGF as a whole, four years into Play Days, we’re very good. Because we have to print everything ahead of time, too. So there are lots of unannounced things that we’re very careful about who sees what. We have vendors who print and produce and manufacture physical objects under very tight wraps. We’re just very protective, because we know what it means to have to keep a secret, because we’ve had our own games that we’ve had to announce, as well. Capcom is a great example with ‘Resident Evil.’ We knew that for a very long time, but they trusted us with information, and we were very careful about what our team actually knew about what was going on.”
    And even though some of the gaming giants sat this year out, White says conversations were already happening on the Play Days campus about who is ready to return next year and what they’ll bring. “People get excited, they come and see. And each year we grow, so people see more potential,” White said.
    As for next year, the June show will take place just a few weeks after the planned May 26 release of “GTA 6.” While the Switch 2 didn’t seem to distract too much, will the draw of playing the newly launched “GTA 6” prove so powerful that it outshines whatever could be announced at SGF 2026?
    “My view is that all boats rise with the ‘GTA’ launch,” Keighley said. “It is a singular cultural event that is the biggest thing in all of entertainment this decade. It will bring more people into gaming, sell lots of consoles and bring back lapsed gamers. There will never be a better time to feel the excitement and energy around gaming than SGF 2026.”
    See more from Variety’s Q&A with Keighley about Summer Game Fest 2025 below.
    How was this year’s show impacted by the date shift for “GTA 6”? How much was planned before and after that big announcement?
    So far as I know there wasn’t any material impact, but I think the date move did allow a number of teams to feel more confident announcing their launch dates.
    Halfway through the year, what do you see as some of the biggest trends in gaming for 2025, and how did you look to reflect that in the show?
    We continue to see some of the most interesting and successful games come from smaller teams outside of the traditional publisher system – games like “Clair Obscur,” “Blue Prince” and “REPO.” So we wanted to highlight some of those projects at the show, like “Ill” and “Mortal Shell 2.”
    What game announcements and trailers do you think resonated most with audiences after this show? What assets were the most popular?
    “Resident Evil Requiem” was a massive moment. Also we saw a lot of love for “Ill” from a small team in Canada and Armenia.
  • Animate the Smart Way in Blender (Procedural Animation Tutorial) #b3d

    In this video, Louis du Mont (@ldm) shows how to animate objects using Geometry Nodes, unlocking quick control and variation that scales.
    ⇨ Robotic Planet: https://cgboost.link/robotic-planet-449836
    ⇨ Project Files: https://www.cgboost.com/resources

    CHAPTERS
    00:00 - Intro
    00:33 - Joining Objects
    04:01 - Ambient Ship Motion
    09:04 - Ambient Laser Motion
    11:06 - Disc Rotation
    12:26 - Using Group Inputs
    15:02 - Outro
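
    The “Ambient Ship Motion” and “Ambient Laser Motion” chapters build their movement directly inside the Geometry Nodes editor. As a rough sketch of the same idea (not the actual node setup from the video), the snippet below uses Blender’s Python API, assuming Blender 4.x: it creates a small node group that offsets the active object with a sine of the scene time. The “AmbientMotion” name and the Scene Time → Sine → Set Position chain are illustrative choices, not taken from the tutorial.

import bpy

# Hypothetical sketch (Blender 4.x assumed): give the active object a gentle
# ambient bob by wiring Scene Time -> Sine -> Set Position inside a new
# Geometry Nodes group. The "AmbientMotion" name is arbitrary.
obj = bpy.context.active_object  # assumes an object is selected

ng = bpy.data.node_groups.new("AmbientMotion", "GeometryNodeTree")
ng.interface.new_socket("Geometry", in_out='INPUT', socket_type='NodeSocketGeometry')
ng.interface.new_socket("Geometry", in_out='OUTPUT', socket_type='NodeSocketGeometry')

group_in = ng.nodes.new("NodeGroupInput")
group_out = ng.nodes.new("NodeGroupOutput")
scene_time = ng.nodes.new("GeometryNodeInputSceneTime")  # outputs "Seconds" and "Frame"
sine = ng.nodes.new("ShaderNodeMath")
sine.operation = 'SINE'
combine = ng.nodes.new("ShaderNodeCombineXYZ")            # route the sine value into Z
set_pos = ng.nodes.new("GeometryNodeSetPosition")

links = ng.links
links.new(scene_time.outputs["Seconds"], sine.inputs[0])
links.new(sine.outputs["Value"], combine.inputs["Z"])
links.new(group_in.outputs["Geometry"], set_pos.inputs["Geometry"])
links.new(combine.outputs["Vector"], set_pos.inputs["Offset"])
links.new(set_pos.outputs["Geometry"], group_out.inputs["Geometry"])

mod = obj.modifiers.new(name="AmbientMotion", type='NODES')
mod.node_group = ng

    Run from Blender’s scripting workspace with an object selected, this should add an “AmbientMotion” modifier; extra Math nodes for speed and amplitude, or group inputs as covered in the 12:26 chapter, would be the natural next tweaks.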

    MY SYSTEM
    CPU: Ryzen 5900x
    GPU: GeForce RTX 3090
    RAM: 96 GB

    FOLLOW CG BOOST
    ⇨ X: https://twitter.com/cgboost
    ⇨ Instagram: https://www.instagram.com/cg_boost/
    ⇨ Web: https://cgboost.com/
    WWW.YOUTUBE.COM
  • MillerKnoll opens new design archive showcasing over one million objects from the company’s history

    In a 12,000-square-foot warehouse in Zeeland, Michigan, hundreds of chairs, sofas, and loveseats rest on open storage racks. Their bold colors and elegant forms stand in striking contrast to the industrial setting. A plush recliner, seemingly made for sinking into, sits beside a mesh desk chair like those found in generic office cubicles. Nearby, a rare prototype of the Knoll Womb® Chair, gifted by Eero Saarinen to his mother, blooms open like a flower, inviting someone to sit. There’s also mahogany furniture designed by Gilbert Rohde for Herman Miller, originally unveiled at the 1933 World’s Fair; early office pieces by Florence Knoll; and a sculptural paper lamp by Isamu Noguchi. This is the newly unveiled MillerKnoll Archive, a space that honors the distinct legacies of its formerly rival brands. In collaboration with New York–based design firm Standard Issue, MillerKnoll has created a permanent display of its most iconic designs at the company’s Michigan Design Yard headquarters.

    In the early 1920s, Dutch-born businessman Herman Miller became the majority stakeholder in a Zeeland, Michigan, company where his son-in-law served as president. Following the acquisition, Star Furniture Co. was renamed the Herman Miller Furniture Company. Meanwhile, across the Atlantic in Stuttgart, Germany, Walter Knoll joined his family’s furniture business and formed close ties with modernist pioneers Ludwig Mies van der Rohe and Walter Gropius, immersing himself in the Bauhaus movement as Germany edged toward war. 
    Just before the outbreak of World War II, Walter Knoll relocated to the United States and established his own furniture company in New York City. Around the same time, Michigan native Florence Schust was studying at the Cranbrook Academy of Art under Eliel Saarinen. There, she met Eero Saarinen and Charles Eames. Schust, who later married Walter Knoll, and Saarinen would go on to become key designers for the company, while Eames would play a similarly pivotal role at Herman Miller—setting both firms on parallel paths in the world of modern design.
    The facility was designed in collaboration with New York–based design firm Standard Issue. The archive, located in MillerKnoll’s Design Yard Headquarters, is 12,000 square feet and holds over one million objects. Formerly seen as competitors, the two brands came together four years ago when Herman Miller acquired Knoll in a $1.8 billion merger that formed MillerKnoll. The deal united two of the most influential names in American furniture, merging their storied design legacies and the iconic pieces that helped define modern design. Now, MillerKnoll is honoring the distinct histories of each brand through this new archive. The archive is a permanent home for the brands’ archival collections and also exhibits the evolution of modern design. The facility is organized into three distinct areas: an exhibition space, open storage, and a reading room.

    The facility’s first exhibition, Manufacturing Modern, explores the intertwined histories of Knoll and Herman Miller. It showcases designs from the individuals who helped shape each company. The open storage area displays over 300 pieces of modern furniture, featuring both original works from Knoll and Herman Miller as well as contemporary designs. In addition to viewing the furniture pieces, visitors can kick back in the reading room, which offers access to a collection of archival materials, including correspondence, photography, drawings, and textiles.
    The archive will open for tours this summer. “The debut of the MillerKnoll Archives invites our communities to experience design history – and imagine its future – in one dynamic space,” said MillerKnoll’s chief creative and product officer Ben Watson. “The ability to not only understand how iconic designs came to be, but how design solutions evolved over time, is a never-ending source of inspiration.”
    Exclusive tours of the archive will be available in July and August in partnership with the Cranbrook Art Museum and in October in partnership with Docomomo.
    WWW.ARCHPAPER.COM