• Acronis has appointed a new Country Manager for Iberia, Eduardo García Sancho, to oversee operations in the region. The plan is to grow the business, strengthen relationships with partners and clients, and enhance the company's presence in the area. Sounds like a typical corporate move, right? Not much excitement here.

    It's just another day in the world of cybersecurity. Eduardo will lead the team, but honestly, these changes rarely shake things up in a way that’s noticeable. Companies keep trying to expand and improve their market standing, which seems to be the standard practice these days. One more manager in the mix, same old story.

    While growth and relationships are important, it feels like we’ve heard this script before. You bring in someone new, they talk about plans and visions, and then... well, we wait to see if anything actually changes. It’s a bit like watching paint dry, really.

    So, Acronis now has Eduardo at the helm for Iberia. Let's see how that goes. If you're interested in cybersecurity or just happen to be following corporate management moves, this might be mildly worth noting. But, if you're like me, it probably won't spark much enthusiasm. Just another appointment in the long line of appointments.

    #Acronis #CountryManager #Iberia #Cybersecurity #CorporateMoves
    Acronis appoints new Country Manager for Iberia
    Cybersecurity company Acronis is strengthening its team in Iberia with the appointment of a new Country Manager for the region: Eduardo García Sancho, who will lead the company's local team with the goal of fostering grow…
  • Summer Game Fest 2025 Saw 89 Percent Growth in Live Concurrent Viewership Since Last Year

    The recent Summer Game Fest 2025 was quite successful. According to Variety, the live stream saw massive year-over-year growth in live viewership, an increase of 89 percent over the 2024 edition.
    The event saw a number of new games announced, as well as trailers for previously announced games coming soon. Headliners included Capcom’s Resident Evil Requiem and gameplay from IO Interactive’s 007: First Light.
    “In total, the peak concurrent audience for SGF reached more than 3 million live simultaneous viewers across Twitch and YouTube, with significant year over year growth on both platforms in terms of average viewership, watch time and co-streams,” announced Summer Game Fest in a press release over the weekend.
    At the time of publishing, Summer Game Fest 2025 had drawn 8.5 million views on just one of the places where it was hosted – the official The Game Awards channel. That figure counts both the live audience and those who watched the event afterwards, and it would likely be higher still once other hosts, and platforms like Twitch, are taken into account.
    Reports have indicated that of the 8.5 million viewers, 1.5 million watched during the live stream globally. Twitch, meanwhile, saw 38 percent growth in live viewership across the more than 8,900 channels that were co-streaming the event, which came to around 1.4 million concurrent live viewers worldwide.
    Summer Game Fest 2025 was accompanied by a host of other events over the same weekend, including shows focused on PC gaming, Microsoft’s own Xbox Games Showcase, and the indie game-focused Future Games Show 2025.
    Coinciding with the events, Valve kicked off its latest edition of Steam Next Fest, which featured a host of game demos that players could try out. Many of these games were unveiled or received new trailers during last weekend’s live events. However, it is worth noting that today is the final day of this iteration of Steam Next Fest, which means many of the demos will be going away.
    GAMINGBOLT.COM
  • Microsoft accidentally replaced Windows 11 startup sound with one from Vista


    Taras Buria · Neowin (@TarasBuria) · Jun 14, 2025 14:46 EDT

    The recently released Windows 11 Dev and Beta builds introduced some welcome changes and improvements. However, those preview builds are not flawless and have a pretty long list of known issues. One of those issues, though, is a rather delightful one: Windows 11's default startup jingle has been accidentally replaced with one from 2006.
    After Windows Insiders discovered that Windows 11 now plays the Windows Vista startup sound and reported it to Microsoft, the company acknowledged it and added it to the list of known bugs in the latest Windows 11 Dev and Beta builds:

    This week’s flight comes with a delightful blast from the past and will play the Windows Vista boot sound instead of the Windows 11 boot sound. We’re working on a fix.

    Although Windows Vista is nearly two decades old, it was brought to everyone's attention this week after Apple introduced macOS 26 Tahoe with its controversial "Liquid Glass" redesign, which many consider a rather miserable remix of Windows Aero from Windows Vista and Windows 7.
    While it is definitely an interesting coincidence, Microsoft did not intentionally replace the startup sound in the Windows 11 preview builds. Brandon LeBlanc from the Windows Insider team confirmed in a post on X that it is a bug, after joking about everyone talking about Windows Vista once again in light of Apple's latest announcements:

    It is worth noting that if you miss the startup sound of Windows Vista, you can still use it in modern Windows versions. All it takes is the original WAV file and a few clicks in the Windows Registry and Sound settings. And for those who want a shot of nostalgia, here is the Windows Vista startup sound:
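    For the curious, the registry-and-Sound-settings tweak the paragraph above alludes to is commonly described in community guides roughly as follows. This is a sketch of Windows-only configuration commands, not a verified procedure: the key path and value name (`ExcludeFromCPL`) are as reported by those guides, so double-check them before applying anything.

    ```shell
    # Minimal sketch (Windows only, elevated PowerShell or Command Prompt).
    # Re-exposes the hidden "Windows Logon" sound event in the classic Sound
    # control panel so a custom WAV can be assigned to it. Key and value names
    # follow community guides and are an assumption, not verified here.
    reg add "HKCU\AppEvents\EventLabels\WindowsLogon" /v ExcludeFromCPL /t REG_DWORD /d 0 /f

    # Open the classic Sound settings to pick the WAV file on the Sounds tab.
    start mmsys.cpl
    ```

    After that, "Windows Logon" should appear in the Sounds tab's program-events list, where the Vista WAV file can be browsed to and applied.
    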

    What startup jingle do you like more: Windows Vista or Windows 11? Share your thoughts in the comments.

    WWW.NEOWIN.NET
  • NVIDIA helps Germany lead Europe’s AI manufacturing race

    Germany and NVIDIA are building possibly the most ambitious European tech project of the decade: the continent’s first industrial AI cloud. NVIDIA has been on a European tour over the past month, with CEO Jensen Huang charming audiences at London Tech Week before dazzling the crowds at Paris’s VivaTech. But it was his meeting with German Chancellor Friedrich Merz that might prove the most consequential stop. The resulting partnership between NVIDIA and Deutsche Telekom isn’t just another corporate handshake; it’s potentially a turning point for European technological sovereignty.
    An “AI factory” will be created with a focus on manufacturing, which is hardly surprising given Germany’s renowned industrial heritage. The facility aims to give European industrial players the computational firepower to revolutionise everything from design to robotics.
    “In the era of AI, every manufacturer needs two factories: one for making things, and one for creating the intelligence that powers them,” said Huang. “By building Europe’s first industrial AI infrastructure, we’re enabling the region’s leading industrial companies to advance simulation-first, AI-driven manufacturing.”
    It’s rare to hear such urgency from a telecoms CEO, but Deutsche Telekom’s Timotheus Höttges added: “Europe’s technological future needs a sprint, not a stroll. We must seize the opportunities of artificial intelligence now, revolutionise our industry, and secure a leading position in the global technology competition. Our economic success depends on quick decisions and collaborative innovations.”
    The first phase alone will deploy 10,000 NVIDIA Blackwell GPUs spread across various high-performance systems. That makes this Germany’s largest AI deployment ever, and a statement that the country isn’t content to watch from the sidelines as AI transforms global industry. A Deloitte study recently highlighted the critical importance of AI technology development to Germany’s future competitiveness, particularly noting the need for expanded data centre capacity. When you consider that demand is expected to triple within just five years, this investment seems less like ambition and more like necessity.
    Robots teaching robots
    One of the early adopters is NEURA Robotics, a German firm that specialises in cognitive robotics. They’re using this computational muscle to power something called the Neuraverse, which is essentially a connected network where robots can learn from each other. Think of it as a robotic hive mind for skills ranging from precision welding to household ironing, with each machine contributing its learnings to a collective intelligence.
    “Physical AI is the electricity of the future—it will power every machine on the planet,” said David Reger, Founder and CEO of NEURA Robotics. “Through this initiative, we’re helping build the sovereign infrastructure Europe needs to lead in intelligent robotics and stay in control of its future.”
    The implications of this AI project for manufacturing in Germany could be profound. This isn’t just about making existing factories slightly more efficient; it’s about reimagining what manufacturing can be in an age of intelligent machines.
    AI for more than just Germany’s industrial titans
    What’s particularly promising about this project is its potential reach beyond Germany’s industrial titans. The famed Mittelstand – the network of specialised small and medium-sized businesses that forms the backbone of the German economy – stands to benefit. These companies often lack the resources to build their own AI infrastructure but possess the specialised knowledge that makes them perfect candidates for AI-enhanced innovation. Democratising access to cutting-edge AI could help preserve their competitive edge in a challenging global market.
    Academic and research institutions will also gain access, potentially accelerating innovation across numerous fields. The approximately 900 Germany-based startups in NVIDIA’s Inception program will be eligible to use these resources, potentially unleashing a wave of entrepreneurial AI applications.
    However impressive this massive project is, it’s viewed merely as a stepping stone towards something even more ambitious: Europe’s AI gigafactory. This planned initiative, backed by the EU and Germany and powered by 100,000 GPUs, won’t come online until 2027, but it represents Europe’s determination to carve out its own technological future. As other European telecom providers follow suit with their own AI infrastructure projects, we may be witnessing the beginning of a concerted effort to establish technological sovereignty across the continent.
    For a region that has often found itself caught between American tech dominance and Chinese ambitions, building indigenous AI capability represents more than economic opportunity. Whether this bold project in Germany will succeed remains to be seen, but one thing is clear: Europe is no longer content to be a passive consumer of AI technology developed elsewhere.
    Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo. Explore other upcoming enterprise technology events and webinars powered by TechForge here.
    WWW.ARTIFICIALINTELLIGENCE-NEWS.COM
    NVIDIA helps Germany lead Europe’s AI manufacturing race
    Germany and NVIDIA are building possibly the most ambitious European tech project of the decade: the continent’s first industrial AI cloud.NVIDIA has been on a European tour over the past month with CEO Jensen Huang charming audiences at London Tech Week before dazzling the crowds at Paris’s VivaTech. But it was his meeting with German Chancellor Friedrich Merz that might prove the most consequential stop.The resulting partnership between NVIDIA and Deutsche Telekom isn’t just another corporate handshake; it’s potentially a turning point for European technological sovereignty.An “AI factory” (as they’re calling it) will be created with a focus on manufacturing, which is hardly surprising given Germany’s renowned industrial heritage. The facility aims to give European industrial players the computational firepower to revolutionise everything from design to robotics.“In the era of AI, every manufacturer needs two factories: one for making things, and one for creating the intelligence that powers them,” said Huang. “By building Europe’s first industrial AI infrastructure, we’re enabling the region’s leading industrial companies to advance simulation-first, AI-driven manufacturing.”It’s rare to hear such urgency from a telecoms CEO, but Deutsche Telekom’s Timotheus Höttges added: “Europe’s technological future needs a sprint, not a stroll. We must seize the opportunities of artificial intelligence now, revolutionise our industry, and secure a leading position in the global technology competition. Our economic success depends on quick decisions and collaborative innovations.”The first phase alone will deploy 10,000 NVIDIA Blackwell GPUs spread across various high-performance systems. 
That makes this Germany’s largest AI deployment ever; a statement the country isn’t content to watch from the sidelines as AI transforms global industry.A Deloitte study recently highlighted the critical importance of AI technology development to Germany’s future competitiveness, particularly noting the need for expanded data centre capacity. When you consider that demand is expected to triple within just five years, this investment seems less like ambition and more like necessity.Robots teaching robotsOne of the early adopters is NEURA Robotics, a German firm that specialises in cognitive robotics. They’re using this computational muscle to power something called the Neuraverse which is essentially a connected network where robots can learn from each other.Think of it as a robotic hive mind for skills ranging from precision welding to household ironing, with each machine contributing its learnings to a collective intelligence.“Physical AI is the electricity of the future—it will power every machine on the planet,” said David Reger, Founder and CEO of NEURA Robotics. “Through this initiative, we’re helping build the sovereign infrastructure Europe needs to lead in intelligent robotics and stay in control of its future.”The implications of this AI project for manufacturing in Germany could be profound. This isn’t just about making existing factories slightly more efficient; it’s about reimagining what manufacturing can be in an age of intelligent machines.AI for more than just Germany’s industrial titansWhat’s particularly promising about this project is its potential reach beyond Germany’s industrial titans. The famed Mittelstand – the network of specialised small and medium-sized businesses that forms the backbone of the German economy – stands to benefit.These companies often lack the resources to build their own AI infrastructure but possess the specialised knowledge that makes them perfect candidates for AI-enhanced innovation. 
Democratising access to cutting-edge AI could help preserve their competitive edge in a challenging global market.Academic and research institutions will also gain access, potentially accelerating innovation across numerous fields. The approximately 900 Germany-based startups in NVIDIA’s Inception program will be eligible to use these resources, potentially unleashing a wave of entrepreneurial AI applications.However impressive this massive project is, it’s viewed merely as a stepping stone towards something even more ambitious: Europe’s AI gigafactory. This planned 100,000 GPU-powered initiative backed by the EU and Germany won’t come online until 2027, but it represents Europe’s determination to carve out its own technological future.As other European telecom providers follow suit with their own AI infrastructure projects, we may be witnessing the beginning of a concerted effort to establish technological sovereignty across the continent.For a region that has often found itself caught between American tech dominance and Chinese ambitions, building indigenous AI capability represents more than economic opportunity. Whether this bold project in Germany will succeed remains to be seen, but one thing is clear: Europe is no longer content to be a passive consumer of AI technology developed elsewhere.(Photo by Maheshkumar Painam)Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo.Explore other upcoming enterprise technology events and webinars powered by TechForge here.
  • One of the most versatile action cameras I've tested isn't from GoPro - and it's on sale

    WWW.ZDNET.COM
    DJI Osmo Action 4. Adrian Kingsley-Hughes/ZDNET

    Multiple DJI Osmo Action 4 packages are on sale at Amazon. Both the Essential and Standard Combos have been discounted to $249, while the Adventure Combo has dropped to $349.

    DJI might not be the first name on people's lips when it comes to action cameras, but the company that's better known for its drones also has a really solid line of action cameras. And its latest device, the Osmo Action 4 camera, has some very impressive tricks up its sleeve.

    Also: One of the most versatile cameras I've used is not from Sony or Canon and it's on sale

    So, what sets this action camera apart from the competition? Let's take a look.

    First off, this is not just an action camera -- it's a pro-grade action camera.

    From a hardware point of view, the Osmo Action 4 features a 1/1.3-inch image sensor that can record 4K at up to 120 frames per second (fps). This sensor is combined with a wide-angle f/2.8 aperture lens that provides an ultra-wide field of view of up to 155°. And that's wide.

    Build quality and fit and finish are second to none. Adrian Kingsley-Hughes/ZDNET

    For when the going gets rough, the Osmo Action 4 offers 360° HorizonSteady stabilization modes, including RockSteady 3.0/3.0+ for first-person video footage and HorizonBalancing/HorizonSteady modes for horizontal shots. That's pro-grade hardware right there.

    Also: This new AI video editor is an all-in-one production service for filmmakers - how to try it

    The Osmo Action 4 also features a 10-bit D-Log M color mode. This mode allows the sensor to record over one billion colors and offers a wider dynamic range, giving you video that is more vivid and offers greater detail in the highlights and shadows. This mode, combined with an advanced color temperature sensor, means that the colors have a true-to-life feel regardless of whether you're shooting outdoors, indoors, or even underwater.

    The DJI Osmo Action 4 ready for action. Adrian Kingsley-Hughes/ZDNET

    I've added some video output from the Osmo Action 4 below. There are examples in both 1080p and 4K. To test the stabilization, I attached the camera to the truck and took it on some roads, some of which are pretty rough. The Osmo Action 4 had no problem with that terrain. I also popped the camera into the sea, just because. And again, no problem.

    I've also captured a few time-lapses with the camera -- not because I like clouds (well, actually, I do like clouds), but pointing a camera at a sky can be a good test of how it handles changing light.

    Also: I recommend this action camera to beginners and professional creators. Here's why

    Timelapses with action cameras can suffer from unsightly exposure changes that cause the image to pulse, a condition known as exposure pumping. This issue can also cause the white balance to change noticeably in a video, but the Osmo Action 4 handled this test well.

    All the footage I've shot is what I've come to expect from a DJI camera, whether it's from an action camera or drone -- crisp, clear, vivid, and also nice and stable.

    The Osmo Action 4 is packed with electronic image-stabilization (EIS) tech to ensure that your footage is smooth and on the horizon. It's worth noting the limitations of EIS -- it's not supported in slow-motion and timelapse modes, and the HorizonSteady and HorizonBalancing features are only available for video recorded at 1080p (16:9) or 2.7K (16:9) with a frame rate of 60fps or below.

    On the durability front, I've no concerns. I've subjected the Osmo Action 4 to a hard few days of testing, and it's not let me down or complained once. It takes impacts like a champ, and being underwater or in dirt and sand is no problem at all.

    Also: I'm a full-time Canon photographer, but this Nikon camera made me wonder if I'm missing out

    You might think that this heavy-duty testing would be hard on the camera's tiny batteries, but you'd be wrong. Remember I said the Osmo Action 4 offered hours of battery life? Well, I wasn't kidding.

    The Osmo Action 4's ultra-long life batteries are incredible. Adrian Kingsley-Hughes/ZDNET

    DJI says that a single battery can deliver up to 160 minutes of 1080p/24fps video recording (at room temperature, with RockSteady on, Wi-Fi off, and screen off). That's over two and a half hours of recording time. In the real world, I was blown away by how much a single battery can deliver. I shot video and timelapse, messed around with a load of camera settings, and then transferred that footage to my iPhone, and still had 16% battery left. No action camera has delivered so much for me on one battery.

    The two extra batteries and the multifunction case that come as part of the Adventure Combo are worth the extra $100. Adrian Kingsley-Hughes/ZDNET

    And when you're ready to recharge, a 30W USB-C charger can take a battery from zero to 80% in 18 minutes. That's also impressive. What's more, the batteries are resistant to cold, offering up to 150 minutes of 1080p/24fps recording in temperatures as low as -20°C (-4°F). This resistance also blows the competition away.

    Even taking into account all these strong points, the Osmo Action 4 offers even more. The camera has 2x digital zoom for better composition, Voice Prompts that let you know what the camera is doing without looking, and Voice Control that lets you operate the device without touching the screen or using the app. The Osmo Action 4 also digitally hides the selfie stick in a variety of different shots, and you can even connect the DJI Mic to the camera via the USB-C port for better audio capture.

    Also: Yes, an Android tablet finally made me reconsider my iPad Pro loyalty

    As for price, the Osmo Action 4 Standard Combo bundle comes in at $399, while the Osmo Action 4 Adventure Combo, which comes with two extra Osmo Action Extreme batteries, an additional mini Osmo Action quick-release adapter mount, a battery case that acts as a power bank, and a 1.5-meter selfie stick, is $499.

    I'm in love with the Osmo Action 4. It's hands down the best, most versatile, most powerful action camera on the market today, offering pro-grade features at a price that definitely isn't pro-grade.

    Everything included in the Action Combo bundle. DJI

    DJI Osmo Action 4 tech specs

    Dimensions: 70.5×44.2×32.8mm
    Weight: 145g
    Waterproof: 18m, up to 60m with the optional waterproof case
    Microphones: 3
    Sensor: 1/1.3-inch CMOS
    Lens: FOV 155°, aperture f/2.8, focus distance 0.4m to ∞
    Max Photo Resolution: 3648×2736
    Max Video Resolution: 4K (4:3): 3840×2880@24/25/30/48/50/60fps and 4K (16:9): 3840×2160@24/25/30/48/50/60/100/120fps
    ISO Range: 100-12800
    Front Screen: 1.4-inch, 323ppi, 320×320
    Rear Screen: 2.25-inch, 326ppi, 360×640
    Front/Rear Screen Brightness: 750±50 cd/m²
    Storage: microSD (up to 512GB)
    Battery: 1770mAh, lab tested to offer up to 160 minutes of runtime (at room temperature, 25°C/77°F, 1080p/24fps, RockSteady on, Wi-Fi off, screen off)
    Operating Temperature: -20° to 45°C (-4° to 113°F)

    This article was originally published in August 2023 and updated in March 2025.
  • Nike Introduces the Air Max 1000, Its First Fully 3D Printed Sneaker

    Global sportswear leader Nike is reportedly preparing to release the Air Max 1000 Oatmeal, its first fully 3D printed sneaker, with a launch tentatively scheduled for Summer 2025. While Nike has yet to confirm an official release date, industry sources suggest the debut may occur sometime between June and August. The retail price is expected to be approximately $210. This model marks a step in Nike’s exploration of additive manufacturing (AM), enabled through a collaboration with Zellerfeld, a German startup known for its work in fully 3D printed footwear.
    Building Buzz Online
    The “Oatmeal” colorway—a neutral blend of soft beige tones—has already attracted attention on social platforms like TikTok, Instagram, and X. In April, content creator Janelle C. Shuttlesworth described the shoes as “light as air” in a video preview. Sneaker-focused accounts such as JustFreshKicks and TikTok user @shoehefner5 have also offered early walkthroughs. Among fans, the nickname “Foamy Oat” has started to catch on.
    Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth.
    Before generating buzz online, the sneaker made a public appearance at ComplexCon Las Vegas in November 2024. There, its laceless, sculptural silhouette and smooth, seamless texture stood out—merging futuristic design with signature Air Max elements, such as the visible heel air unit.
    Reimagining the Air Max Legacy
    Drawing inspiration from the original Air Max 1 (1987), the Air Max 1000 retains the iconic air cushion in the heel while reinventing the rest of the structure using 3D printing. The shoe’s upper and outsole are formed as a single, continuous piece, produced from ZellerFoam, a proprietary flexible material developed by Zellerfeld.
    Zellerfeld’s fused filament fabrication (FFF) process enables varied material densities throughout the shoe—resulting in a firm, supportive sole paired with a lightweight, breathable upper. The laceless, slip-on design prioritizes ease of wear while reinforcing a sleek, minimalist aesthetic.
    Nike’s Chief Innovation Officer, John Hoke, emphasized the broader impact of the design, noting that the Air Max 1000 “opens up new creative possibilities” and achieves levels of precision and contouring not possible with traditional footwear manufacturing. He also pointed to the sustainability benefits of AM, which produces minimal waste by fabricating only the necessary components.
    Expansion of 3D Printed Footwear Technology
    The Air Max 1000 joins a growing lineup of 3D printed footwear innovations from major brands. Gucci, the Italian luxury brand known for blending traditional craftsmanship with modern techniques, unveiled several Cub3d sneakers as part of its Spring Summer 2025 (SS25) collection. The brand developed Demetra, a material made from at least 70% plant-based ingredients, including viscose, wood pulp, and bio-based polyurethane. The bi-material sole combines an EVA-filled interior for cushioning and a TPU exterior, featuring an Interlocking G pattern that creates a 3D effect.
    Elsewhere, Syntilay, a footwear company combining artificial intelligence with 3D printing, launched a range of custom-fit slides. These slides are designed using AI-generated 3D models, starting with sketch-based concepts that are refined through AI platforms and then transformed into digital 3D designs. The company offers sizing adjustments based on smartphone foot scans, which are integrated into the manufacturing process.
    Join our Additive Manufacturing Advantage (AMAA) event on July 10th, where AM leaders from Aerospace, Space, and Defense come together to share mission-critical insights. Online and free to attend. Secure your spot now.
    Who won the 2024 3D Printing Industry Awards?
    Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news.
    You can also follow us on LinkedIn, and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content.
    Featured image shows Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth.

    Paloma Duran
    Paloma Duran holds a BA in International Relations and an MA in Journalism. Specializing in writing, podcasting, and content and event creation, she works across politics, energy, mining, and technology. With a passion for global trends, Paloma is particularly interested in the impact of technology like 3D printing on shaping our future.
    3DPRINTINGINDUSTRY.COM
    Nike Introduces the Air Max 1000 its First Fully 3D Printed Sneaker
    Global sportswear leader Nike is reportedly preparing to release the Air Max 1000 Oatmeal, its first fully 3D printed sneaker, with a launch tentatively scheduled for Summer 2025. While Nike has yet to confirm an official release date, industry sources suggest the debut may occur sometime between June and August. The retail price is expected to be approximately $210. This model marks a step in Nike’s exploration of additive manufacturing (AM), enabled through a collaboration with Zellerfeld, a German startup known for its work in fully 3D printed footwear. Building Buzz Online The “Oatmeal” colorway—a neutral blend of soft beige tones—has already attracted attention on social platforms like TikTok, Instagram, and X. In April, content creator Janelle C. Shuttlesworth described the shoes as “light as air” in a video preview. Sneaker-focused accounts such as JustFreshKicks and TikTok user @shoehefner5 have also offered early walkthroughs. Among fans, the nickname “Foamy Oat” has started to catch on. Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth. Before generating buzz online, the sneaker made a public appearance at ComplexCon Las Vegas in November 2024. There, its laceless, sculptural silhouette and smooth, seamless texture stood out—merging futuristic design with signature Air Max elements, such as the visible heel air unit. Reimagining the Air Max Legacy Drawing inspiration from the original Air Max 1 (1987), the Air Max 1000 retains the iconic air cushion in the heel while reinventing the rest of the structure using 3D printing. The shoe’s upper and outsole are formed as a single, continuous piece, produced from ZellerFoam, a proprietary flexible material developed by Zellerfeld. Zellerfeld’s fused filament fabrication (FFF) process enables varied material densities throughout the shoe—resulting in a firm, supportive sole paired with a lightweight, breathable upper. 
    The laceless, slip-on design prioritizes ease of wear while reinforcing a sleek, minimalist aesthetic. Nike’s Chief Innovation Officer, John Hoke, emphasized the broader impact of the design, noting that the Air Max 1000 “opens up new creative possibilities” and achieves levels of precision and contouring not possible with traditional footwear manufacturing. He also pointed to the sustainability benefits of AM, which produces minimal waste by fabricating only the necessary components.

    Expansion of 3D Printed Footwear Technology

    The Air Max 1000 joins a growing lineup of 3D printed footwear innovations from major brands. Gucci, the Italian luxury brand known for blending traditional craftsmanship with modern techniques, unveiled several Cub3d sneakers as part of its Spring Summer 2025 (SS25) collection. The brand developed Demetra, a material made from at least 70% plant-based ingredients, including viscose, wood pulp, and bio-based polyurethane. The bi-material sole combines an EVA-filled interior for cushioning and a TPU exterior, featuring an Interlocking G pattern that creates a 3D effect. Elsewhere, Syntilay, a footwear company combining artificial intelligence with 3D printing, launched a range of custom-fit slides. These slides are designed using AI-generated 3D models, starting with sketch-based concepts that are refined through AI platforms and then transformed into digital 3D designs. The company offers sizing adjustments based on smartphone foot scans, which are integrated into the manufacturing process.
    Featured image shows Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth.

    Paloma Duran holds a BA in International Relations and an MA in Journalism. Specializing in writing, podcasting, and content and event creation, she works across politics, energy, mining, and technology. With a passion for global trends, Paloma is particularly interested in the impact of technology like 3D printing on shaping our future.
  • Discord Invite Link Hijacking Delivers AsyncRAT and Skuld Stealer Targeting Crypto Wallets

    Jun 14, 2025Ravie LakshmananMalware / Threat Intelligence

    A new malware campaign is exploiting a weakness in Discord's invitation system to deliver an information stealer called Skuld and the AsyncRAT remote access trojan.
    "Attackers hijacked the links through vanity link registration, allowing them to silently redirect users from trusted sources to malicious servers," Check Point said in a technical report. "The attackers combined the ClickFix phishing technique, multi-stage loaders, and time-based evasions to stealthily deliver AsyncRAT, and a customized Skuld Stealer targeting crypto wallets."
    The issue with Discord's invite mechanism is that it allows attackers to hijack expired or deleted invite links and secretly redirect unsuspecting users to malicious servers under their control. This also means that a Discord invite link that was once trusted and shared on forums or social media platforms could now silently lead users to malicious sites.

    Details of the campaign come a little over a month after the cybersecurity company revealed another sophisticated phishing campaign that hijacked expired vanity invite links to entice users into joining a Discord server, instructing them to visit a phishing site to verify ownership, only to have their digital assets drained upon connecting their wallets.
    While users can create temporary, permanent, or custom (vanity) invite links on Discord, the platform prevents other legitimate servers from reclaiming a previously expired or deleted invite. However, Check Point found that creating custom invite links allows the reuse of expired invite codes and even deleted permanent invite codes in some cases.

    This ability to reuse expired or deleted Discord codes when creating custom vanity invite links opens the door to abuse, allowing attackers to claim them for their malicious servers.
    "This creates a serious risk: Users who follow previously trusted invite links (e.g., on websites, blogs, or forums) can unknowingly be redirected to fake Discord servers created by threat actors," Check Point said.
    The Discord invite-link hijacking, in a nutshell, involves taking control of invite links originally shared by legitimate communities and then using them to redirect users to the malicious server. Users who fall prey to the scheme and join the server are asked to complete a verification step in order to gain full server access by authorizing a bot, which then leads them to a fake website with a prominent "Verify" button.
    This is where the attackers take the attack to the next level by incorporating the infamous ClickFix social engineering tactic to trick users into infecting their systems under the pretext of verification.

    Specifically, clicking the "Verify" button surreptitiously executes JavaScript that copies a PowerShell command to the machine's clipboard, after which the users are urged to launch the Windows Run dialog, paste the already copied "verification string" (i.e., the PowerShell command), and press Enter to authenticate their accounts.
    But in reality, performing these steps triggers the download of a PowerShell script hosted on Pastebin that subsequently retrieves and executes a first-stage downloader, which is ultimately used to drop AsyncRAT and Skuld Stealer from a remote server and execute them.
    At the heart of this attack lies a meticulously engineered, multi-stage infection process designed for both precision and stealth, while also taking steps to subvert security protections through sandbox security checks.
    AsyncRAT, which offers comprehensive remote control capabilities over infected systems, has been found to employ a technique called dead drop resolver to access the actual command-and-control (C2) server by reading a Pastebin file.
    The other payload is a Golang information stealer that's downloaded from Bitbucket. It's equipped to steal sensitive user data from Discord, various browsers, crypto wallets, and gaming platforms.
    Skuld is also capable of harvesting crypto wallet seed phrases and passwords from the Exodus and Atomic crypto wallets. It accomplishes this using an approach called wallet injection that replaces legitimate application files with trojanized versions downloaded from GitHub. It's worth noting that a similar technique was recently put to use by a rogue npm package named pdf-to-office.
    The attack also employs a custom version of an open-source tool known as ChromeKatz to bypass Chrome's app-bound encryption protections. The collected data is exfiltrated to the miscreants via a Discord webhook.
    The fact that payload delivery and data exfiltration occur via trusted cloud services such as GitHub, Bitbucket, Pastebin, and Discord allows the threat actors to blend in with normal traffic and fly under the radar. Discord has since disabled the malicious bot, effectively breaking the attack chain.
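    Given that the attack hinges on once-trusted invite links silently changing destinations, one low-cost defense for community operators is to periodically re-check that a published invite still resolves to the expected server. The sketch below illustrates this idea in Python; the helper names are illustrative (not from Check Point's report), and it assumes Discord's public, unauthenticated invite-lookup endpoint (`GET /api/v10/invites/{code}`), which returns the guild an invite currently points at.

```python
import json
import urllib.request


def extract_invite_code(url: str) -> str:
    """Pull the invite code out of a discord.gg or discord.com/invite URL."""
    path = url.split("://", 1)[-1]  # drop the scheme if present
    parts = [p for p in path.split("/") if p]
    # Expected shapes: discord.gg/CODE or discord.com/invite/CODE
    if parts and parts[-1] != "invite":
        return parts[-1]
    raise ValueError(f"no invite code found in {url!r}")


def invite_matches_guild(payload: dict, expected_guild_id: str) -> bool:
    """True if an invite-lookup response still points at the guild we expect."""
    return payload.get("guild", {}).get("id") == expected_guild_id


def lookup_invite(code: str) -> dict:
    """Query Discord's public invite endpoint; a 404 means the code is dead."""
    url = f"https://discord.com/api/v10/invites/{code}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

    A monitoring job could run `lookup_invite(extract_invite_code(link))` for every link the community has published and alert if the code no longer resolves (ripe for reclaiming) or resolves to a different guild ID than the one on record.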

    Check Point said it also identified another campaign mounted by the same threat actor that distributes the loader as a modified version of a hacktool for unlocking pirated games. The malicious program, also hosted on Bitbucket, has been downloaded 350 times.
    It has been assessed that the victims of these campaigns are primarily located in the United States, Vietnam, France, Germany, Slovakia, Austria, the Netherlands, and the United Kingdom.
    The findings represent the latest example of how cybercriminals are targeting the popular social platform, which has had its content delivery network (CDN) abused to host malware in the past.
    "This campaign illustrates how a subtle feature of Discord's invite system, the ability to reuse expired or deleted invite codes in vanity invite links, can be exploited as a powerful attack vector," the researchers said. "By hijacking legitimate invite links, threat actors silently redirect unsuspecting users to malicious Discord servers."
    "The choice of payloads, including a powerful stealer specifically targeting cryptocurrency wallets, suggests that the attackers are primarily focused on crypto users and motivated by financial gain."

  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends about one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years. 
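    Those travel times can be sanity-checked with rough arithmetic. In this Python sketch, the probe speed and current distance are assumed round figures (roughly Voyager 1's ~17 km/s, or about 3.57 AU per year, from roughly 166 AU out today), not numbers taken from the article:

```python
VOYAGER_AU_PER_YEAR = 3.57   # assumed: Voyager 1's current speed, ~17 km/s
CURRENT_DISTANCE_AU = 166    # assumed: Voyager 1's rough distance today
AU_PER_LIGHT_YEAR = 63_241   # astronomical units in one light year

# Distance reached after the travel times quoted in the article
inner_edge_au = CURRENT_DISTANCE_AU + VOYAGER_AU_PER_YEAR * 250
far_side_ly = (CURRENT_DISTANCE_AU + VOYAGER_AU_PER_YEAR * 30_000) / AU_PER_LIGHT_YEAR

print(f"inner boundary: ~{inner_edge_au:,.0f} AU")
print(f"far side: ~{far_side_ly:.1f} light years")
```

    Under these assumptions, 250 years of travel puts the inner boundary at roughly a thousand AU, and 30,000 years lands near the cloud's outer extent of about one and a half light years, consistent with the figures in the article.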

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. 

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
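    The unit conversion behind that figure is easy to verify; a quick Python sketch (the miles-per-AU constant is the standard IAU value, not taken from the paper):

```python
AU_MILES = 92_955_807.3   # one astronomical unit in miles (IAU value, rounded)
spiral_au = 15_000        # spiral length reported in the paper, in AU

spiral_miles = spiral_au * AU_MILES
print(f"{spiral_miles:.3e} miles")  # ~1.394e12, i.e. roughly 1.4 trillion miles
```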

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, the two systems will swap icy comets, flinging some out on new paths.

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’”

    “Then Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just the Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey to an amazing 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distances of 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical statistical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud.

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent. 

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4-trillion-mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
    How a planetarium show discovered a spiral at the edge of our solar system
    WWW.FASTCOMPANY.COM