• It's infuriating to see how the retro gaming community is falling for the same old trap when it comes to the Commodore 64 on new FPGA systems. Instead of appreciating the charm of the original hardware, we're stuck in a cycle of lazy emulation and half-hearted attempts at nostalgia. The ease of using modern tech to replicate the past is a cop-out! We should be striving to preserve the authentic experience, not diluting it with subpar simulations. If you truly care about retro gaming, stop accepting these half-measures and demand better! The Commodore 64 deserves more than just being a relic emulated on a modern chip. Let's bring back the real deal or nothing at all!

    HACKADAY.COM
    Commodore 64 on New FPGA
When it comes to getting retro hardware running again, there are many approaches. On one hand, the easiest path could be to emulate the hardware on something modern, using nothing …
  • Fusion and AI: How private sector tech is powering progress at ITER

    In April 2025, at the ITER Private Sector Fusion Workshop in Cadarache, something remarkable unfolded. In a room filled with scientists, engineers and software visionaries, the line between big science and commercial innovation began to blur.  
    Three organisations – Microsoft Research, Arena and Brigantium Engineering – shared how artificial intelligence, already transforming everything from language models to logistics, is now stepping into a new role: helping humanity to unlock the power of nuclear fusion. 
    Each presenter addressed a different part of the puzzle, but the message was the same: AI isn’t just a buzzword anymore. It’s becoming a real tool – practical, powerful and indispensable – for big science and engineering projects, including fusion. 
    “If we think of the agricultural revolution and the industrial revolution, the AI revolution is next – and it’s coming at a pace which is unprecedented,” said Kenji Takeda, director of research incubations at Microsoft Research. 
Microsoft’s collaboration with ITER is already in motion. Just a month before the workshop, the two teams signed a Memorandum of Understanding (MoU) to explore how AI can accelerate research and development. This follows ITER’s initial use of Microsoft technology to empower its teams.
A chatbot built on the Azure OpenAI service was developed to help staff navigate technical knowledge across more than a million ITER documents using natural conversation. GitHub Copilot assists with coding, while AI helps to resolve IT support tickets – those everyday but essential tasks that keep the lights on.
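The article doesn’t detail how the chatbot is built; a document-grounded chat call with the Azure OpenAI Python SDK might be sketched as follows. The endpoint, deployment name and retrieval step are assumptions for illustration, not ITER’s implementation.

```python
# Minimal sketch of a document-grounded chatbot on Azure OpenAI.
# The endpoint, deployment name, and retrieval function are hypothetical;
# the article only says a chatbot was built over ~1M ITER documents.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # hypothetical
    api_key="YOUR_KEY",
    api_version="2024-02-01",
)

def answer(question: str, search_docs) -> str:
    # search_docs() stands in for whatever document index (e.g. a search
    # service) returns the most relevant passages for the question.
    passages = "\n\n".join(search_docs(question, top=5))
    response = client.chat.completions.create(
        model="gpt-4o",  # deployment name is an assumption
        messages=[
            {"role": "system",
             "content": "Answer using only the provided ITER document excerpts."},
            {"role": "user",
             "content": f"Excerpts:\n{passages}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```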
    But Microsoft’s vision goes deeper. Fusion demands materials that can survive extreme conditions – heat, radiation, pressure – and that’s where AI shows a different kind of potential. MatterGen, a Microsoft Research generative AI model for materials, designs entirely new materials based on specific properties.
    “It’s like ChatGPT,” said Takeda, “but instead of ‘Write me a poem’, we ask it to design a material that can survive as the first wall of a fusion reactor.” 
    The next step? MatterSim – a simulation tool that predicts how these imagined materials will behave in the real world. By combining generation and simulation, Microsoft hopes to uncover materials that don’t yet exist in any catalogue. 
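Neither tool’s programming interface appears in the article; the generate-then-simulate loop it describes can still be sketched abstractly, with stand-in functions where MatterGen and MatterSim would plug in. Every name below is hypothetical.

```python
import random

# Hypothetical stand-ins for MatterGen (generation) and MatterSim
# (simulation); the article names the tools but not their APIs.
def generate_candidates(desired_properties, n):
    return [{"id": i, "target": desired_properties} for i in range(n)]

def simulate_properties(material):
    # Pretend simulation: score how well a candidate survives first-wall
    # conditions (heat, radiation, pressure). Real work would be far heavier.
    return {"survival": random.random()}

def discover_materials(target, n_candidates=100, n_keep=5):
    # Generate many candidates with the desired properties, then screen
    # them in simulation and keep the most promising few.
    candidates = generate_candidates(target, n_candidates)
    scored = [(simulate_properties(m)["survival"], m) for m in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [m for _, m in scored[:n_keep]]

best = discover_materials("first-wall material: high melting point, low activation")
```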
    While Microsoft tackles the atomic scale, Arena is focused on a different challenge: speeding up hardware development. As general manager Michael Frei put it: “Software innovation happens in seconds. In hardware, that loop can take months – or years.” 
Arena’s answer is Atlas, a multimodal AI platform that acts as an extra set of hands – and eyes – for engineers. It can read data sheets, interpret lab results, analyse circuit diagrams and even interact with lab equipment through software interfaces. “Instead of adjusting an oscilloscope manually,” said Frei, “you can just say, ‘Verify the I2C [inter-integrated circuit] protocol’, and Atlas gets it done.”
    It doesn’t stop there. Atlas can write and adapt firmware on the fly, responding to real-time conditions. That means tighter feedback loops, faster prototyping and fewer late nights in the lab. Arena aims to make building hardware feel a little more like writing software – fluid, fast and assisted by smart tools. 
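The article doesn’t explain how Atlas performs that verification; at its simplest, checking an I2C trace means finding START/STOP conditions and ACK bits in the sampled SDA/SCL lines. A toy sketch of the first step, assuming the capture arrives as lists of 0/1 samples:

```python
# Toy I2C sanity check over digitized SDA/SCL traces (lists of 0/1 samples).
# How Atlas actually talks to the oscilloscope is not described; this only
# illustrates what "verify the I2C protocol" minimally involves.

def find_start_conditions(sda, scl):
    """A START is SDA falling while SCL stays high."""
    return [i for i in range(1, len(sda))
            if scl[i] and scl[i - 1] and sda[i - 1] and not sda[i]]

def find_stop_conditions(sda, scl):
    """A STOP is SDA rising while SCL stays high."""
    return [i for i in range(1, len(sda))
            if scl[i] and scl[i - 1] and not sda[i - 1] and sda[i]]

# Example: one START followed by one STOP on an otherwise idle bus.
scl = [1, 1, 1, 1, 1, 1]
sda = [1, 0, 0, 0, 1, 1]
assert find_start_conditions(sda, scl) == [1]
assert find_stop_conditions(sda, scl) == [4]
```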

    Fusion, of course, isn’t just about atoms and code – it’s also about construction. Gigantic, one-of-a-kind machines don’t build themselves. That’s where Brigantium Engineering comes in.
    Founder Lynton Sutton explained how his team uses “4D planning” – a marriage of 3D CAD models and detailed construction schedules – to visualise how everything comes together over time. “Gantt charts are hard to interpret. 3D models are static. Our job is to bring those together,” he said. 
    The result is a time-lapse-style animation that shows the construction process step by step. It’s proven invaluable for safety reviews and stakeholder meetings. Rather than poring over spreadsheets, teams can simply watch the plan come to life. 
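Under the hood, 4D planning is a join between a schedule and model geometry: each task carries a time window plus the IDs of the 3D elements it erects, and scrubbing to a date shows everything whose task has finished. A minimal sketch; the task names and fields are invented for illustration, not Brigantium’s actual format:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative 4D-planning link: schedule tasks mapped to 3D model elements.
@dataclass
class Task:
    name: str
    start: date
    finish: date
    element_ids: list  # IDs of CAD objects this task installs

def visible_elements(tasks, t: date):
    """Everything whose installing task has completed by time t."""
    return {eid for task in tasks if task.finish <= t
            for eid in task.element_ids}

plan = [
    Task("Pour bioshield base", date(2025, 1, 10), date(2025, 3, 1), ["slab-01"]),
    Task("Set cryostat ring", date(2025, 3, 5), date(2025, 6, 20), ["ring-07"]),
]
print(visible_elements(plan, date(2025, 4, 1)))  # {'slab-01'}
```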
    And there’s more. Brigantium is bringing these models into virtual reality using Unreal Engine – the same one behind many video games. One recent model recreated ITER’s tokamak pit using drone footage and photogrammetry. The experience is fully interactive and can even run in a web browser.
    “We’ve really improved the quality of the visualisation,” said Sutton. “It’s a lot smoother; the textures look a lot better. Eventually, we’ll have this running through a web browser, so anybody on the team can just click on a web link to navigate this 4D model.” 
    Looking forward, Sutton believes AI could help automate the painstaking work of syncing schedules with 3D models. One day, these simulations could reach all the way down to individual bolts and fasteners – not just with impressive visuals, but with critical tools for preventing delays. 
    Despite the different approaches, one theme ran through all three presentations: AI isn’t just a tool for office productivity. It’s becoming a partner in creativity, problem-solving and even scientific discovery. 
    Takeda mentioned that Microsoft is experimenting with “world models” inspired by how video games simulate physics. These models learn about the physical world by watching pixels in the form of videos of real phenomena such as plasma behaviour. “Our thesis is that if you showed this AI videos of plasma, it might learn the physics of plasmas,” he said. 
    It sounds futuristic, but the logic holds. The more AI can learn from the world, the more it can help us understand it – and perhaps even master it. At its heart, the message from the workshop was simple: AI isn’t here to replace the scientist, the engineer or the planner; it’s here to help, and to make their work faster, more flexible and maybe a little more fun.
    As Takeda put it: “Those are just a few examples of how AI is starting to be used at ITER. And it’s just the start of that journey.” 
    If these early steps are any indication, that journey won’t just be faster – it might also be more inspired. 
    WWW.COMPUTERWEEKLY.COM
    Fusion and AI: How private sector tech is powering progress at ITER
  • Unity Technical VFX Artist at No Brakes Games

Unity Technical VFX Artist
No Brakes Games – Vilnius, Lithuania or Remote

We are No Brakes Games, the creators of Human Fall Flat. We are looking for a Unity Technical VFX Artist to join our team to work on Human Fall Flat 2.

FULL-TIME POSITION – Vilnius, Lithuania or Remote

Role Overview:
As a Unity Technical Artist specializing in VFX and rendering, you will develop and optimize real-time visual effects, create advanced shaders and materials, and ensure a balance between visual fidelity and performance. You will solve complex technical challenges, optimizing effects for real-time execution while collaborating with artists, designers, and engineers to push Unity’s rendering capabilities to the next level.

Responsibilities:
Develop and optimize real-time VFX solutions that are both visually striking and performant.
Create scalable shaders and materials for water, fog, atmospheric effects, and dynamic lighting.
Debug and resolve VFX performance bottlenecks using Unity Profiler, RenderDoc, and other tools.
Optimize particle systems, volumetric effects, and GPU simulations for multi-platform performance.
Document best practices and educate the team on efficient asset and shader workflows.
Collaborate with engineers to develop and implement custom rendering solutions.
Stay updated on Unity’s latest advancements in rendering, HDRP, and the Visual Effect Graph.

Requirements:
2+ years of experience as a Technical Artist in game development, with a focus on Unity.
Strong understanding of Unity’s rendering pipeline (HDRP, URP, Built-in) and shader development (HLSL, Shader Graph).
Experience developing performance-conscious visual effects, including particle systems, volumetric lighting, and dynamic environmental effects.
Proficiency in GPU/CPU optimization techniques and LODs for VFX.
Hands-on experience with real-time lighting and atmospheric effects.
Ability to debug and profile complex rendering issues effectively.
Excellent communication skills and ability to work collaboratively within a multi-disciplinary team.
A flexible, R&D-driven mindset, able to iterate quickly in both prototyping and production environments.

Nice-to-Have:
Experience working on at least one released game project.
Experience with Unity HDRP and SRP.
Experience with multi-platform development (PC, console, mobile, VR/AR).
Knowledge of C#, Python, or C++ for extending Unity’s capabilities.
Experience developing custom node-based tools or extending Unity’s Visual Effect Graph.
Background in procedural animation, physics-based effects, or fluid simulations.

Apply today by sending your Portfolio & CV to jobs@nobrakesgames.com
  • Komires: Matali Physics 6.9 Released

We are pleased to announce the release of Matali Physics 6.9, the next significant step on the way to the seventh major version of the environment. Matali Physics 6.9 introduces a number of improvements and fixes to the Matali Physics Core, Matali Render and Matali Games modules, presents physics-driven, completely dynamic light sources, real-time object scaling with destruction, a lighting model simulating global illumination (GI) in some aspects, comprehensive support for Wayland on Linux, and more.

    Posted by komires on Jun 3rd, 2025
    What is Matali Physics?
    Matali Physics is an advanced, modern, multi-platform, high-performance 3d physics environment intended for games, VR, AR, physics-based simulations and robotics. Matali Physics consists of the advanced 3d physics engine Matali Physics Core and other physics-driven modules that all together provide comprehensive simulation of physical phenomena and physics-based modeling of both real and imaginary objects.
    What's new in version 6.9?

Physics-driven, completely dynamic light sources. The new solution can process hundreds of movable, long-range, shadow-casting light sources, and each source can be assigned logic that controls its behavior and changes light parameters, volumetric-effect parameters and more (a conceptual sketch follows this list);
    Real-time object scaling with destruction. All groups of physics objects and groups of physics objects with constraints may be subject to destruction process during real-time scaling, allowing group members to break off at different sizes;
Lighting model simulating global illumination (GI) in some aspects. Based on our own research and development work, processed in real time, ready for dynamic scenes, fast on mobile devices, not based on lightmaps, light probes, baked lights, etc.;
    Comprehensive support for Wayland on Linux. The latest version allows Matali Physics SDK users to create advanced, high-performance, physics-based, Vulkan-based games for modern Linux distributions where Wayland is the main display server protocol;
Other improvements and fixes; the complete list is available on the History webpage.
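Matali’s real interface isn’t shown in the announcement; conceptually, the per-source logic described in the first item above is a callback attached to each light. A hypothetical sketch of that pattern (none of these names come from the Matali Physics API):

```python
import math

# Hypothetical sketch of "logic assigned to each light source"; the names
# and structure are invented for illustration, not Matali Physics code.
class DynamicLight:
    def __init__(self, position, range_, casts_shadows=True, logic=None):
        self.position = position
        self.range = range_
        self.casts_shadows = casts_shadows
        self.intensity = 1.0
        self.logic = logic  # callable(light, t) that mutates parameters

    def update(self, t):
        if self.logic:
            self.logic(self, t)

def flicker(light, t):
    # Example behavior: a torch-like flicker driving intensity over time.
    light.intensity = 0.8 + 0.2 * math.sin(40.0 * t)

# Hundreds of movable, shadow-casting sources, each with its own logic.
lights = [DynamicLight((0, 2, 0), range_=15.0, logic=flicker) for _ in range(100)]
for light in lights:
    light.update(t=0.016)
```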

    What platforms does Matali Physics support?

    Android
    Android TV
    *BSD
    iOS
    iPadOS
Linux
macOS
    Steam Deck
    tvOS
UWP (Desktop, Xbox Series X/S)
Windows (Classic, GDK, Handheld consoles)

What are the benefits of using Matali Physics?

    Physics simulation, graphics, sound and music integrated into one total multimedia solution where creating complex interactions and behaviors is common and relatively easy
    Composed of dedicated modules that do not require additional licences and fees
    Supports fully dynamic and destructible scenes
    Supports physics-based behavioral animations
    Supports physical AI, object motion and state change control
    Supports physics-based GUI
    Supports physics-based particle effects
    Supports multi-scene physics simulation and scene combining
    Supports physics-based photo mode
    Supports physics-driven sound
    Supports physics-driven music
    Supports debug visualization
    Fully serializable and deserializable
    Available for all major mobile, desktop and TV platforms
    New features on request
    Dedicated technical support
    Regular updates and fixes

    If you have questions related to the latest version and the use of Matali Physics environment as a game creation solution, please do not hesitate to contact us.
    WWW.INDIEDB.COM
    Komires: Matali Physics 6.9 Released
  • VFX Artist at No Brakes Games

VFX Artist
No Brakes Games – Vilnius, Lithuania or Remote

We are No Brakes Games, the creators of Human Fall Flat. We are now looking for a VFX Artist to join our team to work on Human Fall Flat 2.

FULL-TIME POSITION – Vilnius, Lithuania or Remote

Role Overview:
As a VFX Artist, you will develop and optimize real-time visual effects and ensure a balance between visual fidelity and performance.

Responsibilities:
Develop and optimize real-time VFX solutions that are both visually striking and performant.
Debug and resolve VFX performance bottlenecks using Unity Profiler, RenderDoc, and other tools.
Optimize particle systems, volumetric effects, and GPU simulations for multi-platform performance.
Stay updated on Unity’s latest advancements in rendering, HDRP, and the Visual Effect Graph.

Requirements:
2+ years of experience as a VFX Artist in game development, with a focus on Unity.
Experience developing performance-conscious visual effects, including particle systems, volumetric lighting, and dynamic environmental effects.
Proficiency in GPU/CPU optimization techniques and LODs for VFX.
Excellent communication skills and ability to work collaboratively within a multi-disciplinary team.
A flexible, R&D-driven mindset, able to iterate quickly in both prototyping and production environments.

Nice-to-Have:
Experience working on at least one released game project.
Experience with Unity HDRP and SRP.
Experience with multi-platform development (PC, console, mobile, VR/AR).
Experience developing custom node-based tools or extending Unity’s Visual Effect Graph.
Background in procedural animation, physics-based effects, or fluid simulations.

Apply today by sending your Portfolio & CV to jobs@nobrakesgames.com
  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and stretches about one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.
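Those distances are easy to sanity-check. A quick back-of-envelope calculation, where Voyager’s speed and the cloud’s boundaries are approximate figures rather than values from the article:

```python
# Back-of-envelope check of the distances and travel times quoted above.
# Voyager 1's speed and the boundary radii are approximations.
AU_PER_LIGHT_YEAR = 63_241        # astronomical units per light year
VOYAGER_AU_PER_YEAR = 3.6         # roughly Voyager 1's current speed

inner_edge_au = VOYAGER_AU_PER_YEAR * 250            # ~900 AU in 250 years
far_side_au = 1.5 * AU_PER_LIGHT_YEAR                # ~95,000 AU
years_to_far_side = far_side_au / VOYAGER_AU_PER_YEAR

print(f"inner boundary reached: ~{inner_edge_au:,.0f} AU")
print(f"far side at ~{far_side_au:,.0f} AU: ~{years_to_far_side:,.0f} years")
# -> ~26,000 years at constant speed, the same order as the ~30,000-year
#    figure in the article.
```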

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets, shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.
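The production runs used Nesvorný’s full model; the underlying physics, though, can be caricatured in a few lines: each comet feels the Sun’s gravity plus a small vertical galactic-tide acceleration, integrated over very long times. A deliberately simplified sketch, where the constants, time step and single test comet are illustrative only, not the paper’s setup:

```python
import math

# Toy version of the physics in the production runs: solar gravity plus a
# simplified vertical galactic-tide term, integrated with leapfrog.
G = 4 * math.pi**2   # AU^3 / (Msun * yr^2), so a 1-AU orbit takes 1 year
RHO = 1.1e-17        # local galactic density, ~0.1 Msun/pc^3 in Msun/AU^3

def acceleration(pos):
    x, y, z = pos
    r = math.sqrt(x * x + y * y + z * z)
    a = [-G * c / r**3 for c in pos]       # Sun's pull (solar mass = 1)
    a[2] += -4 * math.pi * G * RHO * z     # simplified vertical tide term
    return a

def leapfrog(pos, vel, dt, steps):
    # Kick-drift-kick integrator: stable for long orbital integrations.
    acc = acceleration(pos)
    for _ in range(steps):
        vel = [v + 0.5 * dt * a for v, a in zip(vel, acc)]
        pos = [p + dt * v for p, v in zip(pos, vel)]
        acc = acceleration(pos)
        vel = [v + 0.5 * dt * a for v, a in zip(vel, acc)]
    return pos, vel

# One comet started at 5,000 AU with a small in-plane speed.
pos, vel = [5000.0, 0.0, 0.0], [0.0, 0.02, 0.0]
pos, vel = leapfrog(pos, vel, dt=100.0, steps=10_000)  # one million years
print(pos)
```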

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.

As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
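The quoted conversion checks out: at roughly 93 million miles per astronomical unit, 15,000 AU is about 1.4 trillion miles.

```python
# Verifying the length quoted in the paper: 15,000 AU in miles.
MILES_PER_AU = 92_955_807           # one astronomical unit in miles
length_miles = 15_000 * MILES_PER_AU
print(f"{length_miles:.3e} miles")  # ~1.394e12, i.e. ~1.4 trillion miles
```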

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths.

Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

“We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’”

“Then Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought unprecedented precision to our astronomical map, charting the positions of 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.
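Gaia’s catalogue is public, so a small-scale version of such a star map can be reproduced by anyone. A minimal sketch using the astroquery package; the column selection, row cap and parallax cut are our own choices, not the museum’s pipeline:

```python
# Minimal Gaia DR3 query: fetch nearby stars and turn parallaxes into
# distances. The query details are illustrative, not the museum's pipeline.
from astroquery.gaia import Gaia

job = Gaia.launch_job(
    "SELECT TOP 1000 source_id, ra, dec, parallax "
    "FROM gaiadr3.gaia_source "
    "WHERE parallax > 10"   # parallax in mas; > 10 mas means within ~100 pc
)
table = job.get_results()

# Distance in parsecs is the reciprocal of parallax in arcseconds:
# d[pc] = 1000 / parallax[mas]
distances_pc = [1000.0 / p for p in table["parallax"] if p > 0]
print(f"{len(distances_pc)} stars, nearest at ~{min(distances_pc):.1f} pc")
```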

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, together with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

“Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical statistical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud.

“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
    #how #planetarium #show #discovered #spiral
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system. “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.  The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”  To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.  The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”  While fine-tuning a simulation of the Oort cloud, a vast expanse of ice material leftover from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space ShowMore simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. 
An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA]

As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize, the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
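That conversion is a quick sanity check: one astronomical unit is about 93 million miles, so 15,000 AU × 9.3 × 10^7 miles/AU ≈ 1.4 × 10^12 miles, or 1.4 trillion miles end to end.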
“The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths. [Image: © AMNH]

Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, what are we gonna do when we get outside the Milky Way?”

Then the planetarium’s director, Neil deGrasse Tyson, “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

“And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey to “an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars.

To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

Our Oort cloud (center), a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud. [Image: © AMNH]

“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.
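Kent’s point about looking from a different angle is easy to demonstrate. The sketch below is a toy, not Nesvorný’s model: it scatters synthetic particles along a thin, tilted spiral (numpy and matplotlib assumed) and plots the same cloud from two viewpoints. Edge-on, the cloud reads as a featureless smear; face-on, the spiral pops out, which is essentially what happened under the dome.

```python
# Toy demonstration: the same particle cloud, viewed from two angles.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 20_000
theta = rng.uniform(0.0, 4.0 * np.pi, n)           # two turns of a spiral
radius = 1_000.0 + 3_000.0 * theta / (4 * np.pi)   # AU-ish scale, for flavor
radius *= 1.0 + 0.05 * rng.standard_normal(n)      # scatter within the arm
x = radius * np.cos(theta)
y = radius * np.sin(theta)
z = 300.0 * rng.standard_normal(n)                 # thin vertical extent

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))
ax1.scatter(x, z, s=0.2)                           # edge-on: a flat smear
ax1.set_title("edge-on")
ax2.scatter(x, y, s=0.2)                           # face-on: spiral pops out
ax2.set_title("face-on")
for ax in (ax1, ax2):
    ax.set_aspect("equal")
plt.show()
```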
More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. Instead, the paper notes, the structure will have to be measured through the “detection of a large number of objects” in the inner Oort cloud or through “thermal emission from small particles in the Oort spiral.” The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.

For us, here and now, the 1.4-trillion-mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.