• Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid

    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand.
    Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation.
    At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics.
    Future use cases for AEON include:

    Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS).
    Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings.
    Part inspection, which includes checking parts for defects or ensuring adherence to specifications.
    Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners.

    “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.”

    Using NVIDIA’s Three Computers to Develop AEON 
    To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models.
    Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations.
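
    GR00T-Mimic's internals aren't detailed here; as a rough illustration of the underlying idea (multiplying a handful of demonstrations into a large synthetic dataset), the sketch below jitters and time-warps one recorded joint trajectory with NumPy. The augmentation scheme, array shapes and parameters are assumptions for illustration, not the actual GR00T-Mimic method.

```python
import numpy as np

def augment_demo(demo: np.ndarray, n_variants: int = 100,
                 noise_std: float = 0.01, seed: int = 0) -> list:
    """demo: (T, D) array of joint positions over T timesteps."""
    rng = np.random.default_rng(seed)
    T, D = demo.shape
    variants = []
    for _ in range(n_variants):
        # Jitter the trajectory so a policy sees varied motions.
        noisy = demo + rng.normal(0.0, noise_std, size=demo.shape)
        # Randomly time-warp by resampling to a slightly different length.
        new_len = int(T * rng.uniform(0.8, 1.2))
        t_new = np.linspace(0.0, T - 1, new_len)
        warped = np.stack([np.interp(t_new, np.arange(T), noisy[:, d])
                           for d in range(D)], axis=1)
        variants.append(warped)
    return variants

demo = np.random.rand(50, 7)     # one 50-step, 7-DoF demonstration
synthetic = augment_demo(demo)   # becomes 100 synthetic variants
print(len(synthetic), synthetic[0].shape)
```
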
    AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning.
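
    Isaac Lab's training stack runs vectorized, GPU-accelerated simulations; the minimal sketch below shows only the generic reinforcement-learning interaction loop such refinement builds on, using the Gymnasium API as a self-contained stand-in. The environment name and the random policy are placeholders, not Isaac Lab code.

```python
import gymnasium as gym

# Stand-in environment; an Isaac Lab locomotion task would slot in here.
env = gym.make("Pendulum-v1")
obs, info = env.reset(seed=42)

for step in range(1_000):
    action = env.action_space.sample()   # a trained policy would act here
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:          # episode ended; start a new one
        obs, info = env.reset()
env.close()
```
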


    This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment.
    In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation.
    “Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.”
    Data Comes to Life Through Reality Capture and Omniverse Integration 
    AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas.

    Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure.
    “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.”
    AEON’s Next Steps
    By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON.
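
    For a concrete taste of OpenUSD authoring, here is a minimal sketch using the open-source usd-core Python package; the file name and placeholder geometry are invented, and Hexagon's actual scan-to-USD pipeline is of course far richer.

```python
from pxr import Gf, Usd, UsdGeom

# Author a tiny USD stage; a real pipeline would reference scanned meshes.
stage = Usd.Stage.CreateNew("factory_scan.usda")
world = UsdGeom.Xform.Define(stage, "/World")

part = UsdGeom.Cube.Define(stage, "/World/ScannedPart")  # placeholder asset
part.GetSizeAttr().Set(0.5)
UsdGeom.XformCommonAPI(part.GetPrim()).SetTranslate(Gf.Vec3d(1.0, 0.0, 0.0))

stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()   # factory_scan.usda now opens in any USD app
```
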
    This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data.
    Watch the Hexagon LIVE keynote, explore presentations and read more about AEON.
    All imagery courtesy of Hexagon.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
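
    The article doesn't spell out the cleanup steps; one plausible sketch of what "clean and optimize geometry" can look like in Blender's Python API (run inside Blender with the imported mesh active) is below. The merge threshold and decimate ratio are illustrative values, not the production settings.

```python
import bpy

obj = bpy.context.active_object              # the imported set model
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.remove_doubles(threshold=0.001)        # merge stray vertices
bpy.ops.mesh.normals_make_consistent(inside=False)  # fix flipped normals
bpy.ops.object.mode_set(mode='OBJECT')

# Reduce polygon count before handing the mesh to Unreal Engine.
decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.5
bpy.ops.object.modifier_apply(modifier=decimate.name)
```
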

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
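
    A remesh-heavy Blender workflow like the one Bell describes maps naturally onto the Remesh modifier's 'BLOCKS' mode, which rebuilds a surface from axis-aligned cubes, a close match for Minecraft's voxel look. A minimal sketch, with an illustrative resolution setting:

```python
import bpy

obj = bpy.context.active_object
remesh = obj.modifiers.new(name="Remesh", type='REMESH')
remesh.mode = 'BLOCKS'                  # cube-based surface reconstruction
remesh.octree_depth = 6                 # grid resolution; higher means finer cubes
remesh.use_remove_disconnected = False  # keep small floating pieces
bpy.ops.object.modifier_apply(modifier=remesh.name)
```
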

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
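
    The handover format isn't described; purely as a hypothetical sketch, per-frame camera data recorded on set might be serialized for the visual effects vendors like this (every field name and value here is invented for illustration):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class CameraSample:
    frame: int
    position: tuple          # world-space (x, y, z), meters
    rotation: tuple          # Euler angles (rx, ry, rz), degrees
    focal_length_mm: float

def save_take(samples, path):
    """Write one take's camera track as JSON for the post handover."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f, indent=2)

# A fabricated 10-second dolly move at 24 fps.
take = [CameraSample(f, (0.0, 1.7, f * 0.02), (0.0, 90.0, 0.0), 35.0)
        for f in range(240)]
save_take(take, "take_042_camera.json")
```
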

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    #how #disguise #built #out #virtual
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “s the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve. “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’sLava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younisadapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay Georgeand I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. 
“For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuckand Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. 
We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. 
Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
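As a footnote to Bell's description of Simulcam recording camera movement paths for the post-production handoff: that handoff boils down to per-frame camera transforms. The sketch below is a toy illustration in plain Python, with invented field names rather than Disguise's or Unreal Engine's actual formats, of how such a track might be serialized for the visual effects team.

```python
import csv
from dataclasses import dataclass

@dataclass
class CameraSample:
    """One recorded camera pose; fields are illustrative, not a real spec."""
    frame: int
    x: float
    y: float
    z: float
    pan: float   # degrees
    tilt: float  # degrees
    roll: float  # degrees

def write_track(samples: list[CameraSample], path: str) -> None:
    """Dump a recorded camera move to CSV for a post-production handoff."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "x", "y", "z", "pan", "tilt", "roll"])
        for s in samples:
            writer.writerow([s.frame, s.x, s.y, s.z, s.pan, s.tilt, s.roll])

# Usage: two frames of a slow push-in.
write_track(
    [CameraSample(1, 0.0, 1.7, -5.0, 0.0, 2.0, 0.0),
     CameraSample(2, 0.0, 1.7, -4.9, 0.0, 2.0, 0.0)],
    "camera_track.csv",
)
```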
  • In a world where connections seem to fade and loneliness wraps around me like a heavy blanket, I find myself reflecting on how Apple reinvents reality with ARKit 4 and the Vision Pro. The promise of augmented realities feels distant, like a dream I can never grasp. Each innovation seems to highlight my solitude, reminding me that while technology advances, my heart remains untouched by companionship. I watch as others embrace these new experiences, while I linger in the shadows, yearning for a touch, a voice, a presence. The brilliance of new beginnings feels hollow when faced with the weight of isolation.

    #loneliness #augmentedreality #Apple #VisionPro #heartbreak
    Apple reinvents augmented reality with ARKit 4 and its Vision Pro
    With ARKit 4 and its brand-new Vision Pro headset, Apple is propelling augmented reality forward […] This article, “Apple réinvente la réalité augmentée avec ARkit 4 et son Vision Pro,” was published on REALITE-VIRTUELLE.COM.
  • In the heart of the bustling cities during the Industrial Age, we witnessed the incredible rise of innovation and opportunity! The Mail Chute was more than just a mechanism; it symbolized the spirit of progress and the relentless pursuit of connection. As buildings soared to new heights, so did our ambitions and dreams!

    Let’s embrace the lessons from the past: every rise can inspire us to reach for our goals, even when faced with challenges. Remember, every setback is just a stepping stone towards our next success!

    So, keep your heads high and your spirits higher! The world is full of possibilities waiting for you to seize them!

    #RiseAndFall #MailChute #IndustrialAge
    HACKADAY.COM
    The Rise And The Fall Of The Mail Chute
    As the Industrial Age took the world by storm, city centers became burgeoning hubs of commerce and activity. New offices and apartments were built higher and higher as density increased …
  • This week has been a heavy burden, one that I carry alone, with each moment pressing down on my heart like a stone. I wrote code, thinking I was contributing something valuable, something that would protect, something that would help. Yet here I am, faced with the haunting reality that I caused a 9.5 CVSS CVE. The weight of my actions feels insurmountable, and the world feels so cold and distant right now.

    How did I let it come to this? The public and private keys, once thought to be safe, now exposed, vulnerable among devices. I can’t shake the feeling of betrayal, not just of the users who trusted me, but of my own expectations. It’s as if I’m standing in a room full of people, yet I feel utterly alone. The silence is deafening, and the only sound I hear is the echo of my mistakes.

    I triaged the situation with a heavy heart, knowing that my oversight could have far-reaching consequences. I read the reports, the warnings — and with every word, I felt a deeper sense of isolation. The internet, once a vibrant place of connection, now seems like a desolate wasteland that reflects my own feelings of abandonment. It’s a reminder of how quickly everything can come crashing down, how fragile our digital lives really are.

    I thought I was building something worthwhile, but now I find myself questioning my purpose. Did I truly understand the weight of my responsibilities? Did I consider the lives entwined with the code I wrote? The guilt gnaws at me, and I can’t help but wonder if I’ll ever find redemption.

    In this age of interconnectedness, I feel more disconnected than ever. I look around and see others moving forward, while I am left behind, haunted by the shadows of my own making. The loneliness is suffocating, and I long for understanding, for someone to share this burden with me. Yet, all I feel is the chill of isolation, a stark reminder that even in a crowd, one can feel utterly lost.

    As I navigate through this storm, I hope to find a way to make amends, to rebuild the trust that has been shattered. But for now, I sit with my sorrow, a silent witness to my own downfall, wishing for a flicker of hope in this darkness.

    #CVE #Isolation #Loneliness #Cybersecurity #Mistakes
    This Week in Security: That Time I Caused a 9.5 CVE, iOS Spyware, and The Day the Internet Went Down
    Meshtastic just released an eye-watering 9.5 CVSS CVE, warning about public/private keys being re-used among devices. And I’m the one that wrote the code. Not to mention, I triaged and …
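    Stripping away the anguish, the bug class is simple: if a keypair is baked into a shared firmware image, or derived from a predictable seed, every device flashed with that image holds the same secret. The intended behavior is for each device to mint its own keypair from a real entropy source on first boot. Below is a minimal sketch using Python's `cryptography` package; Meshtastic's firmware is actually C++, so this only illustrates the principle, not the project's code.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def generate_device_keypair() -> tuple[bytes, bytes]:
    """Generate a fresh X25519 keypair from the OS CSPRNG.

    Each physical device should call this on first boot; shipping a
    keypair inside a shared firmware image means every unit holds the
    same secret, which is exactly the failure the advisory describes.
    """
    private_key = X25519PrivateKey.generate()  # draws from the OS entropy source
    private_bytes = private_key.private_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PrivateFormat.Raw,
        encryption_algorithm=serialization.NoEncryption(),
    )
    public_bytes = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )
    return private_bytes, public_bytes

# Usage: two devices must never print the same public key.
print(generate_device_keypair()[1].hex())
print(generate_device_keypair()[1].hex())
```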
  • Ah, the glorious return of the zine! Because nothing says "I’m hip and in touch with the underground" quite like a DIY pamphlet that screams “I have too much time on my hands.” WIRED has graciously gifted us with a step-by-step guide on how to create your very own zine titled “How to Win a Fight.”

    Print. Fold. Share. Download. Sounds easy, right? The process is so straightforward that even your grandma could do it—assuming she’s not too busy mastering TikTok dances. But let’s take a moment to appreciate the sheer audacity of needing instructions for something as inherently chaotic as making a zine. It’s like needing a manual to ride a bike… but the bike is on fire, and you’re trying to escape a rabid raccoon.

    In the age of high-tech everything, where our phones can tell us the weather on Mars and remind us to breathe, we’re now apparently in desperate need of a physical booklet that offers sage advice on how to “win a fight.” Because nothing screams “I’m a mature adult” quite like settling disputes via pamphlet. Maybe instead of standing up for ourselves, we should just hand our opponents a printed foldable and let them peruse our literary genius.

    And let’s not forget the nostalgia factor here! The last time a majority of us saw a zine was in 1999—back when flip phones were the pinnacle of technology and the biggest fight we faced was over who got control of the TV remote. Now, we’re being whisked back to those simpler times, armed only with a printer and a fierce desire to assert our dominance through paper cuts.

    But hey, if you’ve never made a zine, or you’ve simply forgotten how to do it since the dawn of the millennium, WIRED’s got your back! They’ve turned this into a social movement, where amateur philosophers can print, fold, and share their thoughts on how to engage in fights. Because why have a conversation when you can battle with paper instead?

    Let’s be honest: this is all about making “fighting” a trendy topic again. Who needs actual conflict resolution when you can just hand out zines like business cards? Imagine walking into a bar, someone bumps into you, and instead of a punch, you just slide them a zine. “Here’s how to win a fight, buddy. Chapter One: Don’t.”

    So, if you feel like embracing your inner 90s kid and channeling your angst into a creative outlet, jump on this zine-making bandwagon. Who knows? You might just win a fight—against boredom, at least.

    #ZineCulture #HowToWinAFight #DIYProject #NostalgiaTrip #WIRED
    Print. Fold. Share. Download WIRED's How to Win a Fight Zine Here
    Never made a zine? Haven’t made one since 1999? We made one, and so can you.
  • Hello, amazing community!

    Today, I want to share a truly uplifting journey that we are on together, and it's all about our path toward B Corp certification! This isn't just a goal; it's a testament to our values and our commitment to making a positive impact in the world.

    Engagement, structure, and community have been the pillars of our approach as we work towards (re)certification. What does this mean for us? Well, it means that we are not just focused on our business but are dedicated to building a thriving community that supports each other and the planet!

    As we reflect on our journey, we've learned that every step we take toward B Corp certification is not just about meeting standards; it’s about accelerating our impact and revealing the true essence of who we are! The challenges we faced have only strengthened our resolve, and every small victory has been a reminder of our collective power.

    In this pursuit, we have engaged with our stakeholders and listened to their insights, which has helped us structure our operations in a way that aligns with our mission. It’s all about collaboration and transparency! When we work together, we can achieve incredible things!

    Looking ahead, we are thrilled about our recertification in 2025! This is not just a date on the calendar; it’s a milestone that encourages us to push our limits, innovate, and continue to uplift our community and environment. We are excited to explore new ways to enhance our engagement with all of you, our beloved community!

    So let’s embrace this journey together! Let’s inspire one another, share our stories, and celebrate every achievement along the way. Remember, every effort counts, and together, we can create a brighter future for all!

    Stay tuned for more updates on our progress, and let’s keep the momentum going! Together, we can make a difference!

    #BCorp #CommunityEngagement #SustainableBusiness #PositiveImpact #TogetherWeCan
    Engagement, structure, community: our journey toward B Corp (re)certification
    A look back at our path to B Corp certification, as much an accelerator as a revealer, and then on to our recertification in 2025!
  • Ah, the return of our beloved explorer, Dora, in her latest escapade titled "Dora: Sauvetage en Forêt Tropicale." Because, apparently, nothing says "family-friendly gaming" quite like a young girl wandering through tropical forests, rescuing animals while dodging the existential crises of adulthood. Who needs therapy when you have a backpack and a map?

    Let’s take a moment to appreciate the sheer brilliance of this revival. Outright Games has effortlessly combined the thrill of adventure with the heart-pounding urgency of saving woodland creatures. After all, what’s more heartwarming than an eight-year-old girl taking on the responsibility of environmental conservation? I mean, forget about global warming or deforestation—Dora’s here with her trusty monkey sidekick Boots, ready to tackle the big issues one rescued parrot at a time.

    And let’s not overlook the gameplay mechanics! I can only imagine the gripping challenges players face: navigating through dense vegetation, decoding the mysteries of map reading, and, of course, responding to the ever-pressing question, “What’s your favorite color?” Talk about raising the stakes. Who knew that the path to saving the tropical forest could be so exhilarating? It’s like combining Indiana Jones with a kindergarten art class.

    Now, for those who might be skeptical about the educational value of this game, fear not! Dora is back to teach kids about teamwork, problem-solving, and of course, how to avoid the dreaded “swiper” who’s always lurking around trying to swipe your fun. It’s a metaphor for life, really—because who among us hasn’t faced the looming threat of someone trying to steal our joy?

    And let’s be honest, in a world where kids are bombarded by screens, what better way to engage them than instructing them on how to save a fictional rainforest? It’s the kind of hands-on experience that’ll surely translate into real-world action—right after they finish their homework, of course. Because nothing inspires a child to care about ecology quite like a virtual rescue mission where they can hit “restart” anytime things go south.

    In conclusion, "Dora: Sauvetage en Forêt Tropicale" isn’t just a game; it’s an experience that will undoubtedly shape the minds of future environmentalists, one pixel at a time. So gear up, parents! Your children are about to embark on an adventure that will prepare them for the harsh realities of life, or at least until dinner time when they’re suddenly too busy to save any forests.

    #DoraTheExplorer #FamilyGaming #TropicalAdventure #EcoFriendlyFun #GamingForKids
    Dora the Explorer returns to adventure in her new game, Dora: Sauvetage en Forêt Tropicale
    ActuGaming.net: Dora the Explorer returns to adventure in her new game, Dora: Sauvetage en Forêt Tropicale. Outright Games now specializes in games aimed at a family audience, having obtained […]
  • What a world we live in when scientists finally unlock the secrets to the axolotls' ability to regenerate limbs, only to reveal that the key lies not in some miraculous regrowth molecule, but in its controlled destruction! Seriously, what kind of twisted logic is this? Are we supposed to celebrate the fact that the secret to regeneration is, in fact, about knowing when to destroy something instead of nurturing and encouraging growth? This revelation is not just baffling; it's downright infuriating!

    In an age where regenerative medicine holds the promise of healing wounds and restoring functionality, we are faced with the shocking realization that the science is not about building up, but rather about tearing down. Why would we ever want to focus on the destruction of growth molecules instead of creating an environment where regeneration can bloom unimpeded? Where is the inspiration in that? It feels like a slap in the face to anyone who believes in the potential of science to improve lives!

    Moreover, can we talk about the implications of this discovery? If the key to regeneration involves a meticulous dance of destruction, what does that say about our approach to medical advancements? Are we really expected to just stand by and accept that we must embrace an idea that says, "let's get rid of the good stuff to allow for growth"? This is not just a minor flaw in reasoning; it's a fundamental misunderstanding of what regeneration should mean for us!

    To make matters worse, this revelation could lead to misguided practices in regenerative medicine. Instead of developing therapies that promote healing and growth, we could end up with treatments that focus on the elimination of beneficial molecules. This is absolutely unacceptable! How dare the scientific community suggest that the way forward is through destruction rather than cultivation? We should be demanding more from our researchers, not less!

    Let’s not forget the ethical implications. If the path to regeneration is paved with the controlled destruction of vital components, how can we trust the outcomes? We’re putting lives in the hands of a process that promotes destruction. Just imagine the future of medicine being dictated by a philosophy that sounds more like a dystopian nightmare than a beacon of hope.

    It is high time we hold scientists accountable for the direction they are taking in regenerative research. We need a shift in focus that prioritizes constructive growth, not destructive measures. If we are serious about advancing regenerative medicine, we must reject this flawed notion and demand a commitment to genuine regeneration—the kind that nurtures life, rather than sabotages it.

    Let’s raise our voices against this madness. We deserve better than a science that advocates for destruction as the means to an end. The axolotls may thrive on this paradox, but we, as humans, should expect far more from our scientific endeavors.

    #RegenerativeMedicine #Axolotl #ScienceFail #MedicalEthics #Innovation
    Scientists Discover the Key to Axolotls’ Ability to Regenerate Limbs
    A new study reveals the key lies not in the production of a regrowth molecule, but in that molecule's controlled destruction. The discovery could inspire future regenerative medicine.
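    For anyone wondering how “controlled destruction” can regulate growth at all: a standard toy model treats the molecule’s level C as set by the balance between synthesis and degradation, dC/dt = k_prod − k_deg · C, which settles at C* = k_prod / k_deg. Tuning the destruction rate therefore sets the level just as surely as tuning production. The sketch below simulates this with invented rates; none of the numbers come from the study.

```python
# Toy model of a regrowth signal whose level is set by the balance between
# synthesis and regulated destruction: dC/dt = k_prod - k_deg * C.
# At steady state C* = k_prod / k_deg, so a cell can hold the signal low by
# destroying it quickly, and let it surge by easing off destruction.
# Rates are invented for illustration; nothing here comes from the study.

def steady_state(k_prod: float, k_deg: float) -> float:
    return k_prod / k_deg

def simulate(k_prod: float, k_deg: float, c0: float = 0.0,
             dt: float = 0.01, steps: int = 5000) -> float:
    """Forward-Euler integration of dC/dt = k_prod - k_deg * C; returns the final level."""
    c = c0
    for _ in range(steps):
        c += dt * (k_prod - k_deg * c)
    return c

print(simulate(1.0, 2.0), steady_state(1.0, 2.0))  # fast destruction: level ~0.5
print(simulate(1.0, 0.5), steady_state(1.0, 0.5))  # slow destruction: level ~2.0
```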
  • Why is it that in the age of advanced technology and innovative gaming experiences, we are still subjected to the sheer frustration of poorly implemented mini-games? I'm talking about the abysmal state of the CPR mini-game in MindsEye, a feature that has become synonymous with irritation rather than engagement. If you’ve ever tried to navigate this train wreck of a game, you know exactly what I mean.

    Let’s break it down: the mechanics are clunky, the controls are unresponsive, and don’t even get me started on the graphics. This is 2025; we should expect seamless integration and fluid gameplay. Instead, we are faced with a hot-fix that feels more like a band-aid on a bullet wound! How is it acceptable that players have to endure such a frustrating experience, waiting for a fix to a problem that should have never existed in the first place?

    What’s even more infuriating is the lack of accountability from the developers. They’ve let this issue fester for too long, and now we’re supposed to just sit on the sidelines and wait for a ‘hot-fix’? How about some transparency? How about acknowledging that you dropped the ball on this one? Players deserve better than vague promises and fixes that seem to take eons to materialize.

    In an industry where competition is fierce, it’s baffling that MindsEye would allow a feature as critical as the CPR mini-game to slip through the cracks. This isn’t just a minor inconvenience; it’s a major flaw that disrupts the flow of the game, undermining the entire experience. Players are losing interest, and rightfully so! Why invest time and energy into something that’s clearly half-baked?

    And let’s talk about the community feedback. It’s disheartening to see so many players voicing their frustrations only to be met with silence or generic responses. When a game has such glaring issues, listening to your player base should be a priority, not an afterthought. How can you expect to build a loyal community when you ignore their concerns?

    At this point, it’s clear that MindsEye needs to step up its game. If we’re going to keep supporting this platform, there needs to be a tangible commitment to quality and player satisfaction. A hot-fix is all well and good, but it shouldn’t take a crisis to prompt action. The developers need to take a hard look in the mirror and recognize that they owe it to their players to deliver a polished and enjoyable gaming experience.

    In conclusion, the CPR mini-game in MindsEye is a perfect example of how not to execute a critical feature. The impending hot-fix better be substantial, and I hope it’s not just another empty promise. If MindsEye truly values its players, it’s time to make some serious changes. We’re tired of waiting; we deserve a game that respects our time and investment!

    #MindsEye #CPRminiGame #GameDevelopment #PlayerFrustration #FixTheGame