• Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid

    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand.
    Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation.
    At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics.
    Future use cases for AEON include:

Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS).
    Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings.
    Part inspection, which includes checking parts for defects or ensuring adherence to specifications.
    Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners.

    “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.”

    Using NVIDIA’s Three Computers to Develop AEON 
    To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models.
    Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations.
    AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning.
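To make the simulation-first loop concrete, here is a minimal sketch of how a locomotion task is typically exercised once Isaac Lab has registered its environments with the standard Gymnasium API. The task id and the registration step are illustrative assumptions, not Hexagon's actual training setup:

```python
import gymnasium as gym

# Hypothetical task id: Isaac Lab registers tasks like this with
# Gymnasium once its extensions are loaded; AEON's real task is not public.
env = gym.make("Isaac-Velocity-Rough-Humanoid-v0")

obs, info = env.reset(seed=0)
for _ in range(1000):
    # A trained RL policy would map observations to joint targets here;
    # random actions stand in for it in this sketch.
    action = env.action_space.sample()
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```

In practice, Isaac Lab runs thousands of these environments in parallel on the GPU, which is how months of physical trial and error compress into the weeks cited below.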


    This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment.
    In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation.
    “Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.”
    Data Comes to Life Through Reality Capture and Omniverse Integration 
    AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas.

    Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure.
    “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.”
    AEON’s Next Steps
    By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON.
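As a rough illustration of that OpenUSD step, the sketch below composes a scanned asset into a twin stage using the pxr reference API; the file paths and prim names are hypothetical:

```python
from pxr import Usd, UsdGeom

# Create a stage for the factory digital twin (paths are hypothetical).
stage = Usd.Stage.CreateNew("factory_twin.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

# Compose a captured scan into the twin by reference, so the source
# asset can be re-scanned and updated without rebuilding the stage.
cell = stage.DefinePrim("/Factory/Cell01")
cell.GetReferences().AddReference("./scans/cell_01.usd")

stage.GetRootLayer().Save()
```

Because scans are composed by reference, each new capture flows into the twin without rebuilding the scene, which is what keeps the flywheel continuous.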
    This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data.
    Watch the Hexagon LIVE keynote, explore presentations and read more about AEON.
    All imagery courtesy of Hexagon.
    BLOGS.NVIDIA.COM
  • Dandadan: Is There Anyone Who Can Challenge Momo as Okarun's Love Interest?

    Dandadan follows two eccentric teenagers as they explore the bizarre secrets and myths of the world they inhabit. In their misadventures, Momo and Okarun learn more about one another and become close friends in the process. Their bond is unique, as no one else truly understands or relates to the unorthodox beliefs and interests they have.
    GAMERANT.COM
  • Hideo Kojima Had To Add His Own Easter Eggs To Death Stranding 2

Hideo Kojima's latest game, Death Stranding 2: On the Beach, has arrived on PlayStation 5. While players continue to pore over the details, they may find a few Easter eggs that were placed in the game by Kojima himself. According to Kojima, he only took that hands-on approach because his staff refused to do it for him. As reported by Game Spark (via Automaton), Kojima confirmed that he personally added those nods to his past. He went on to say that when his staff was asked to include his self-referential jokes, they would pretend they didn't hear him. It's possible that Kojima was only joking about his staff's lack of enthusiasm for his ideas, but he did share the details about where one of his Easter eggs is hidden. He noted that it can be found when players visit a hot spring and look up at the sky before zooming in on the stars. Continue Reading at GameSpot
    WWW.GAMESPOT.COM
  • What a colossal disappointment! The Switch 2's first new GameCube game is… Super Mario Strikers? Seriously?! After all the anticipation for classics like Luigi’s Mansion or Super Mario Sunshine, we get a mediocre soccer game as part of the Switch Online + Expansion Pack library. This is not the nostalgia trip we signed up for! Nintendo, how low can you go? This is an insult to fans craving real innovation and quality. Instead of delivering something groundbreaking, you're recycling an old franchise that barely scratched the surface of fun. Where's the creativity? Where's the passion? It's time to wake up, Nintendo!

    KOTAKU.COM
    The Switch 2's First New GameCube Game Is A Mario Strikers That's Actually Good
    Just under a month since it launched, the Switch 2 is getting its first new GameCube game as part of its Switch Online + Expansion Pack library. Is it Luigi’s Mansion? Super Mario Sunshine?? Fire Emblem: Path of Radiance??? No, it’s Super Mario Strik
• Why Ready or Not feels so real – how Unreal Engine 5 is delivering next-level immersion


    Void's art director and lead designer reveal what it takes to improve on PC perfection.
    WWW.CREATIVEBLOQ.COM
  • Plug and Play: Build a G-Assist Plug-In Today

    Project G-Assist — available through the NVIDIA App — is an experimental AI assistant that helps tune, control and optimize NVIDIA GeForce RTX systems.
    NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites the community to explore AI and build custom G-Assist plug-ins for a chance to win prizes and be featured on NVIDIA social media channels.

    G-Assist allows users to control their RTX GPU and other system settings using natural language, thanks to a small language model that runs on device. It can be used from the NVIDIA Overlay in the NVIDIA App without needing to tab out or switch programs. Users can expand its capabilities via plug-ins and even connect it to agentic frameworks such as Langflow.
    Below, find popular G-Assist plug-ins, hackathon details and tips to get started.
    Plug-In and Win
    Join the hackathon by registering and checking out the curated technical resources.
    G-Assist plug-ins can be built in several ways, including with Python for rapid development, with C++ for performance-critical apps and with custom system interactions for hardware and operating system automation.
For those who prefer vibe coding, the G-Assist Plug-In Builder — a ChatGPT-based app that allows no-code or low-code development with natural language commands — makes it easy for enthusiasts to start creating plug-ins.
To submit an entry, participants must provide a GitHub repository including a source code file (plugin.py), requirements.txt, manifest.json, config.json (if applicable), a plug-in executable file and a README.
    Then, submit a video — between 30 seconds and two minutes — showcasing the plug-in in action.
    Finally, hackathoners must promote their plug-in using #AIonRTXHackathon on a social media channel: Instagram, TikTok or X. Submit projects via this form by Wednesday, July 16.
    Judges will assess plug-ins based on three main criteria: 1) innovation and creativity, 2) technical execution and integration, reviewing technical depth, G-Assist integration and scalability, and 3) usability and community impact, aka how easy it is to use the plug-in.
    Winners will be selected on Wednesday, Aug. 20. First place will receive a GeForce RTX 5090 laptop, second place a GeForce RTX 5080 GPU and third a GeForce RTX 5070 GPU. These top three will also be featured on NVIDIA’s social media channels, get the opportunity to meet the NVIDIA G-Assist team and earn an NVIDIA Deep Learning Institute self-paced course credit.
Project G-Assist requires a GeForce RTX 50, 40 or 30 Series Desktop GPU with at least 12GB of VRAM, Windows 11 or 10 operating system, a compatible CPU (Intel Pentium G Series, Core i3, i5, i7 or higher; AMD FX, Ryzen 3, 5, 7, 9, Threadripper or higher), specific disk space requirements and a recent GeForce Game Ready Driver or NVIDIA Studio Driver.
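For a sense of what those repository files contain, below is a hypothetical minimal plug-in skeleton in Python. The real pipe protocol and manifest schema are documented in NVIDIA's G-Assist samples on GitHub; this sketch assumes a simplified JSON-per-line command loop purely to show the shape of a plug-in:

```python
# plugin.py: hypothetical minimal G-Assist plug-in skeleton.
# The accompanying manifest.json would declare the plug-in's name and
# the functions it exposes (schema per NVIDIA's sample repository).
import json
import sys

def get_gpu_greeting(params: dict) -> dict:
    """Illustrative function a user could invoke through G-Assist."""
    name = params.get("name", "world")
    return {"success": True, "message": f"Hello from the GPU, {name}!"}

# Function names (as declared in manifest.json) mapped to handlers.
FUNCTIONS = {"get_gpu_greeting": get_gpu_greeting}

def main() -> None:
    # Simplified loop: one JSON command per line on stdin, one JSON
    # response per line on stdout. Real plug-ins exchange JSON with
    # G-Assist over a Windows pipe instead.
    for line in sys.stdin:
        command = json.loads(line)
        handler = FUNCTIONS.get(command.get("func"))
        if handler is None:
            response = {"success": False, "message": "Unknown function"}
        else:
            response = handler(command.get("params", {}))
        print(json.dumps(response), flush=True)

if __name__ == "__main__":
    main()
```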
Plug-In(spiration)
Explore open-source plug-in samples available on GitHub, which showcase the diverse ways on-device AI can enhance PC and gaming workflows.

    Popular plug-ins include:

Google Gemini: Enables search-based queries using Google Search integration and large language model-based queries using Gemini capabilities in real time, all from the NVIDIA App Overlay without needing to switch programs.
    Discord: Enables users to easily share game highlights or messages directly to Discord servers without disrupting gameplay.
    IFTTT: Lets users create automations across hundreds of compatible endpoints to trigger IoT routines — such as adjusting room lights and smart shades, or pushing the latest gaming news to a mobile device.
    Spotify: Lets users control Spotify using simple voice commands or the G-Assist interface to play favorite tracks and manage playlists.
    Twitch: Checks if any Twitch streamer is currently live and can access detailed stream information such as titles, games, view counts and more.

Get G-Assist(ance)
    Join the NVIDIA Developer Discord channel to collaborate, share creations and gain support from fellow AI enthusiasts and NVIDIA staff.
Save the date for NVIDIA’s How to Build a G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities, discover the fundamentals of building, testing and deploying Project G-Assist plug-ins, and participate in a live Q&A session.
    Explore NVIDIA’s GitHub repository, which provides everything needed to get started developing with G-Assist, including sample plug-ins, step-by-step instructions and documentation for building custom functionalities.
    Learn more about the ChatGPT Plug-In Builder to transform ideas into functional G-Assist plug-ins with minimal coding. The tool uses OpenAI’s custom GPT builder to generate plug-in code and streamline the development process.
    NVIDIA’s technical blog walks through the architecture of a G-Assist plug-in, using a Twitch integration as an example. Discover how plug-ins work, how they communicate with G-Assist and how to build them from scratch.
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
    BLOGS.NVIDIA.COM
  • How To Find And Use Minecraft Slimeballs, Defeat Slimes, And Farm Slime Blocks

The Slime is one of the Minecraft mobs that initially appears hostile, but upon killing it you will find useful items for many sought-after crafting recipes in the survival game. We've got all you need to know on how to find and kill Slimes in Minecraft, as well as the items they drop, Slimeball crafting recipes, and more.
How to find Slimes in Minecraft
Slimes spawn in the overworld only, in specific slime chunks. These are all below layer 40, and you can show your Minecraft coordinates to see how close you are. Unlike most mobs, it doesn't matter what light level the environment is at for them to spawn. They can also spawn in swamp biomes between layers 51 and 69 if the light level is seven or less. Slimes spawn regardless of weather conditions. In swamps and mangrove swamps, slimes spawn most often on a full moon, but never on a new moon. Slimes will never spawn in mushroom fields or deep dark biomes. The Slime is a green cube in Minecraft, and is a hostile mob. Slimes do not spawn within 24 blocks of any player, and they despawn over time if no player is within 32 blocks. They despawn instantly if no player is within 128 blocks in Java edition, or 44 to 128 blocks in Bedrock depending on the simulation distance setting. Continue Reading at GameSpot
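Slime chunks are deterministic per world seed, and the community has long documented the Java Edition formula for locating them without digging. Below is a sketch of that check in Python; it emulates java.util.Random's 48-bit LCG, and the formula comes from community reverse engineering rather than any official API:

```python
# Community-documented Java Edition slime-chunk check (unofficial).
MASK_48 = (1 << 48) - 1

def _int32(n: int) -> int:
    """Reproduce Java's signed 32-bit int overflow."""
    n &= 0xFFFFFFFF
    return n - (1 << 32) if n >= (1 << 31) else n

def _java_next_int_10(seed: int) -> int:
    """Emulate java.util.Random(seed).nextInt(10)."""
    s = (seed ^ 0x5DEECE66D) & MASK_48          # constructor scramble
    while True:
        s = (s * 0x5DEECE66D + 0xB) & MASK_48   # one LCG step
        bits = s >> 17                          # next(31): top 31 bits
        val = bits % 10
        if bits - val + 9 < (1 << 31):          # Java's rejection test
            return val

def is_slime_chunk(world_seed: int, cx: int, cz: int) -> bool:
    """cx, cz are chunk coordinates (block coordinates floor-divided by 16)."""
    seed = (
        world_seed
        + _int32(cx * cx * 0x4C1906)
        + _int32(cx * 0x5AC0DB)
        + _int32(cz * cz) * 0x4307A7
        + _int32(cz * 0x5F24F)
    ) ^ 0x3AD8025F
    return _java_next_int_10(seed) == 0

# Example: list slime chunks near spawn for a hypothetical seed.
for cx in range(-5, 6):
    for cz in range(-5, 6):
        if is_slime_chunk(12345, cx, cz):
            print(f"slime chunk at blocks x={cx * 16}, z={cz * 16}")
```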
    WWW.GAMESPOT.COM
  • AI visibility is pretty much about how often your brand shows up in tools like ChatGPT, Gemini, and Perplexity. If you're into tracking or growing your brand's presence, there's this guide that covers it. It's supposed to be expert-backed and all, but honestly, who knows? Just another thing to think about. If you want to dive into it, go ahead. If not, whatever.

    WWW.SEMRUSH.COM
    AI Visibility: How to Track & Grow Your Brand Presence in LLMs
    AI visibility is how often your brand is mentioned by tools like ChatGPT, Gemini, and Perplexity. Learn how to track and grow your LLM presence with our expert-backed guide.
  • Adventure Gamers, the once beloved site for point-and-click adventure gaming, has wiped its forums and relaunched as a gambling affiliate. It's just another cash-grab, really. What used to be a hub for enthusiasts now feels like a hollow echo of its former self. After years of supporting the adventure genre, it's sad to see it turn into something so... uninspiring. Not much else to say. Guess that's how it goes.

    KOTAKU.COM
    Beloved 27-Year-Old Gaming Site Wipes Forums, Relaunches As A Gambling Affiliate Cash-Grab
    In the last couple of weeks, Adventure Gamers—a beloved game site dedicated to point-and-click adventure gaming since 1998—has become a cruel pastiche of its former self. The site, primarily run by volunteers and enthusiasts, has fought for the much-
  • Exciting news, everyone! The new logo for the "NGSC2025" Global Sports Conference has been unveiled, and it’s a symbol of unity, passion, and the bright future of sports! This is not just a logo; it represents our collective journey towards innovation and excellence in the world of athletics. Let’s come together to celebrate the spirit of sportsmanship and the power of community! Join me in embracing this incredible moment and get ready for a fantastic event that will inspire us all!

    ARABHARDWARE.NET
Unveiling the New Global Sports Conference "NGSC2025" Logo