• Meta and Yandex Spying on Android Users Through Localhost Ports: The Dying State of Online Privacy


    Published: June 4, 2025

    Key Takeaways

Meta and Yandex have been caught secretly listening on localhost ports and using them to pull sensitive data off Android devices.
    The corporations use Meta Pixel and Yandex Metrica scripts to transfer cookies from browsers to local apps. Using incognito mode or a VPN can’t fully protect users against it.
    A Meta spokesperson has called this a ‘miscommunication,’ which seems to be an attempt to underplay the situation.

Wake up, Android folks! A new privacy scandal has hit your part of town. According to new research led by Radboud University, Meta and Yandex have been listening on localhost ports to link your web browsing data with your identity and collect personal information without your consent.
    The companies use Meta Pixel and the Yandex Metrica scripts, which are embedded on 5.8 million and 3 million websites, respectively, to connect with their native apps on Android devices through localhost sockets.
This creates a communication path between the cookies in your browser and the locally installed apps, establishing a channel for transferring personal information off your device.
Also, you are mistaken if you think your browser’s incognito mode or a VPN can protect you. Zuckerberg’s latest data-harvesting method can’t be defeated by tweaking any privacy or cookie settings either.
    How Does It Work?
    Here’s the method used by Meta to spy on Android devices:

    As many as 22% of the top 1 million websites contain Meta Pixel – a tracking code that helps website owners measure ad performance and track user behaviour.
    When Meta Pixel loads, it creates a special cookie called _fbp, which is supposed to be a first-party cookie. This means no other third party, including Meta apps themselves, should have access to this cookie. The _fbp cookie identifies your browser whenever you visit a website, meaning it can identify which person is accessing which websites.
However, Meta, being Meta, went and found a loophole around this. Now, whenever you run Facebook or Instagram on your Android device, the apps can quietly open listening ports in the background, specifically a TCP port (12387 or 12388) and a UDP port (the first unoccupied port in the 12580–12585 range).
Whenever you load a website in your browser, Meta Pixel uses WebRTC with SDP munging, which essentially hides the _fbp cookie value inside the SDP message before it is transmitted to your phone’s localhost.
Since Facebook and Instagram are already listening on these ports, they receive the _fbp cookie value and can easily tie your identity to the website you’re visiting. Remember, Facebook and Instagram already have your identification details since you’re always logged in on these platforms.

The report also says that Meta can link all the _fbp cookies received from various websites to your ID. Simply put, Meta knows which person is viewing which websites.
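To make this mechanism a little more concrete, here’s a minimal Kotlin sketch of the general localhost-bridge pattern the researchers describe. It is an illustration only; the port number and the plain-text payload are assumptions for this example, not Meta’s actual implementation.

    import java.net.InetAddress
    import java.net.ServerSocket

    // Minimal, hypothetical sketch of the localhost-bridge pattern described above.
    // The port number and plain-text payload are illustrative assumptions only.
    fun main() {
        // A native app binds a listener to the loopback interface in the background.
        val server = ServerSocket(12387, 50, InetAddress.getByName("127.0.0.1"))
        println("Listening on 127.0.0.1:${server.localPort} ...")

        while (true) {
            server.accept().use { client ->
                // Any page loaded in a browser on the same device can reach this socket,
                // so a tracking script could hand over a browser identifier here
                // (for example, an _fbp-style cookie value).
                val browserId = client.getInputStream().bufferedReader().readText()
                println("Received browser identifier: $browserId")
                // A real app would now join this value with its logged-in user ID.
            }
        }
    }

Because traffic to 127.0.0.1 never leaves the device, a VPN never sees it, which is a big part of why the usual privacy tools are powerless here.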
    Yandex also uses a similar method to harvest your personal data.

    Whenever you open a Yandex app, such as Yandex Maps, Yandex Browser, Yandex Search, or Navigator, it opens up ports like 29009, 30102, 29010, and 30103 on your phone. 
    When you visit a website that contains the Yandex Metrica Script, Yandex’s version of Meta Pixel, the script sends requests to Yandex servers containing obfuscated parameters. 
These parameters are then relayed to localhost over HTTP and HTTPS, addressed either to the IP address 127.0.0.1 directly or to the yandexmetrica.com domain, which quietly resolves to 127.0.0.1.
Now, the Yandex Metrica SDK inside the Yandex apps receives these parameters and replies with device identifiers, such as the Android Advertising ID, UUIDs, or device fingerprints. This entire message is encrypted to hide what it contains.
    The Yandex Metrica Script receives this info and sends it back to the Yandex servers. Just like Meta, Yandex can also tie your website activity to the device information shared by the SDK.
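Curious whether anything on your own device is listening on these ports? A simple connection probe is enough to find out. The sketch below is a hypothetical check written for this article, not tooling from the report, and it only covers the TCP ports named above, since UDP ports can’t be probed this way:

    import java.net.InetSocketAddress
    import java.net.Socket

    // Hypothetical check: probe the TCP ports mentioned above to see whether
    // any app on this device is currently listening on localhost.
    fun main() {
        val portsToCheck = listOf(12387, 12388, 29009, 29010, 30102, 30103)
        for (port in portsToCheck) {
            val listening = runCatching {
                Socket().use { it.connect(InetSocketAddress("127.0.0.1", port), 200) }
            }.isSuccess
            println("127.0.0.1:$port -> ${if (listening) "open (something is listening)" else "closed"}")
        }
    }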

    Meta’s Infamous History with Privacy Norms
This is nothing new or unthinkable for Meta. The Mark Zuckerberg-led social media giant has a long history of such privacy violations.
For instance, in 2024, the company was accused of collecting biometric data from Texas users without their express consent. The company settled the lawsuit by paying $1.4B.
Another of the most famous cases was the Cambridge Analytica scandal in 2018, in which a political consulting firm accessed the private data of 87 million Facebook users without consent. The FTC fined Meta $5B for privacy violations, alongside a $100M settlement with the US Securities and Exchange Commission.
Meta Pixel has also come under scrutiny before, when it was accused of collecting sensitive health information from hospital websites. In another case dating back to 2012, Meta was accused of tracking users even after they logged out of their Facebook accounts. In that case, Meta paid $90M and promised to delete the collected data.
In 2024, South Korea also fined Meta $15M for inappropriately collecting personal data, such as sexual orientation and political beliefs, of 980K users.
In September 2024, Meta was fined $101.6M by the Irish Data Protection Commission for inadvertently storing user passwords in plain text in such a way that employees could search for them. The passwords were not encrypted and were essentially exposed internally.
    So, the latest scandal isn’t entirely out of character for Meta. It has been finding ways to collect your data ever since its incorporation, and it seems like it will continue to do so, regardless of the regulations and safeguards in place.
    That said, Meta’s recent tracking method is insanely dangerous because there’s no safeguard around it. Even if you visit websites in incognito mode or use a VPN, Meta Pixel can still track your activities. 
    The past lawsuits also show a very identifiable pattern: Meta doesn’t fight a lawsuit until the end to try to win it. It either accepts the fine or settles the lawsuit with monetary compensation. This essentially goes to show that it passively accepts and even ‘owns’ the illegitimate tracking methods it has been using for decades. It’s quite possible that the top management views these fines and penalties as a cost of collecting data.
    Meta’s Timid Response
    Meta’s response claims that there’s some ‘miscommunication’ regarding Google policies. However, the method used in the aforementioned tracking scandal isn’t something that can simply happen due to ‘faulty design’ or miscommunication. 

    We are in discussions with Google to address a potential miscommunication regarding the application of their policies – Meta Spokesperson

This kind of unethical tracking method has to be deliberately designed by engineers for it to work reliably at such a large scale. While Meta is still trying to underplay the situation, it has paused the ‘feature’ (yep, that’s what they’re calling it) for now. The report also notes that, as of June 3, Facebook and Instagram are no longer actively listening on the ports in question.
    Here’s what will possibly happen next:

    A lawsuit may be filed based on the report.
    An investigating committee might be formed to question the matter.
    The company will come up with lame excuses, such as misinterpretation or miscommunication of policy guidelines.
    Meta will eventually settle the lawsuit or bear the fine with pride, like it has always done. 

The regulatory authorities are effectively chasing a rat that finds new holes to hide in every day. Companies like Meta and Yandex seem to be one step ahead of these regulations and have mastered the art of finding loopholes.
More than legislative technicalities, it’s the companies’ ethics that incidents like this make plain. The intent of these regulations is to protect personal information, and the fact that Meta and Yandex blatantly circumvent their spirit shows the absolutely horrific state of capitalism these corporations operate in.

    Krishi is a seasoned tech journalist with over four years of experience writing about PC hardware, consumer technology, and artificial intelligence.  Clarity and accessibility are at the core of Krishi’s writing style.
    He believes technology writing should empower readers—not confuse them—and he’s committed to ensuring his content is always easy to understand without sacrificing accuracy or depth.
    Over the years, Krishi has contributed to some of the most reputable names in the industry, including Techopedia, TechRadar, and Tom’s Guide. A man of many talents, Krishi has also proven his mettle as a crypto writer, tackling complex topics with both ease and zeal. His work spans various formats—from in-depth explainers and news coverage to feature pieces and buying guides. 
Behind the scenes, Krishi operates from a dual-monitor setup (including a 29-inch LG UltraWide) that’s always buzzing with news feeds, technical documentation, and research notes, as well as the occasional gaming session that keeps him fresh.
    Krishi thrives on staying current, always ready to dive into the latest announcements, industry shifts, and their far-reaching impacts.  When he's not deep into research on the latest PC hardware news, Krishi would love to chat with you about day trading and the financial markets—oh! And cricket, as well.

  • The Netherlands is building a leading neuromorphic computing industry

    Our latest and most advanced technologies — from AI to Industrial IoT, advanced robotics, and self-driving cars — share serious problems: massive energy consumption, limited on-edge capabilities, system hallucinations, and serious accuracy gaps. 
    One possible solution is emerging in the Netherlands. The country is developing a promising ecosystem for neuromorphic computing, which draws on neuroscience to boost IT efficiencies and performance. Billions of euros are being invested in this new form of computing worldwide. The Netherlands aims to become a leader in the market by bringing together startups, established companies, government organisations, and academics in a neuromorphic computing ecosystem.
    A Dutch mission to the UK
    In March, a Dutch delegation landed in the UK to host an “Innovation Mission” with local tech and government representatives. Top Sector ICT, a Dutch government–supported organisation, led the mission, which sought to strengthen and discuss the future of neuromorphic computing in Europe and the Netherlands. 
    We contacted Top Sector ICT, who connected us with one of their collaborators: Dr Johan H. Mentink, an expert in computational physics at Radboud University in the Netherlands. Dr Mentink spoke about how neuromorphic computing can solve the energy, accuracy, and efficiency challenges of our current computing architectures. 

    “Current digital computers use power-hungry processes to handle data,” Dr Mentink said. 
    “The result is that some modern data centres use so much energy that they even need their own power plant.” 
Computing today stores data in one place (memory) and processes it in another (processors). This means that a lot of energy is spent on transporting data, Dr Mentink explained.
    In contrast, neuromorphic computing architectures are different at the hardware and software levels. For example, instead of using processors and memories, neuromorphic systems leverage new hardware components such as memristors. These act as both memory and processors. 
    By processing and saving data on the same hardware component, neuromorphic computing removes the energy-intensive and error-prone task of transporting data. Additionally, because data is stored on these components, it can be processed more immediately, resulting in faster decision-making, reduced hallucinations, improved accuracy, and better performance. This concept is being applied to edge computing, Industrial IoT, and robotics to drive faster real-time decision-making. 
    “Just like our brains process and store information in the same place, we can make computers that would combine data storage and processing in one place,” Dr Mentink explained.  
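For a rough feel of what “storage and processing in one place” means, here is a conceptual sketch, not code for any real memristor product. In a physical crossbar, the weight matrix lives in the device conductances, and applying input voltages yields output currents equal to the matrix-vector product; the loop below simply mimics that behaviour in software:

    // Conceptual sketch only: in a real crossbar, Ohm's and Kirchhoff's laws perform this
    // multiply-accumulate in analogue hardware, with no memory-to-processor transfer.
    fun crossbarMultiply(conductances: Array<DoubleArray>, voltages: DoubleArray): DoubleArray {
        require(conductances.all { it.size == voltages.size })
        return DoubleArray(conductances.size) { row ->
            conductances[row].indices.sumOf { col -> conductances[row][col] * voltages[col] }
        }
    }

    fun main() {
        val weights = arrayOf(doubleArrayOf(0.2, 0.8), doubleArrayOf(0.5, 0.1))  // stored as conductances
        val inputs = doubleArrayOf(1.0, 0.5)                                     // applied as voltages
        println(crossbarMultiply(weights, inputs).toList())                      // [0.6, 0.55]
    }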
    Early use cases for neuromorphic computing
    Neuromorphic computing is far from just experimental. A great number of new and established technology companies are heavily invested in developing new hardware, edge devices, software, and neuromorphic computing applications.
    Big tech brands such as IBM, NVIDIA, and Intel, with its Loihi chips, are all involved in neuromorphic computing, while companies in the Netherlands, aligned with a 2024 national white paper, are taking a leading regional role. 
    For example, the Dutch company Innatera — a leader in ultra-low power neuromorphic processors — recently secured €15 million in Series-A funding from Invest-NL Deep Tech Fund, the EIC Fund, MIG Capital, Matterwave Ventures, and Delft Enterprises. 
    Innatera is just the tip of the iceberg, as the Netherlands continues to support the new industry through funds, grants, and other incentives.
Immediate use cases for neuromorphic computing include event-based sensing technologies integrated into smart sensors, such as cameras or audio sensors. These neuromorphic devices only process change, which can dramatically reduce power and data load, said Sylvester Kaczmarek, the CEO of OrbiSky Systems, a company providing AI integration for space technology.
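To get an intuition for the “only process change” idea, here is a toy sketch written for this article (it is not OrbiSky’s or any vendor’s code): a stream of raw samples is reduced to events only when the signal moves by more than a threshold, which is why event-based sensors emit far less data than ones that report every sample.

    import kotlin.math.abs

    // Toy event-based sensing: forward an event only when the signal changes enough.
    fun toEvents(samples: DoubleArray, threshold: Double): List<Pair<Int, Double>> {
        val events = mutableListOf<Pair<Int, Double>>()
        var last = samples.first()
        for (i in 1 until samples.size) {
            val delta = samples[i] - last
            if (abs(delta) >= threshold) {
                events += i to delta  // index and size of the change, not the raw sample
                last = samples[i]
            }
        }
        return events
    }

    fun main() {
        val signal = doubleArrayOf(0.0, 0.01, 0.02, 0.9, 0.91, 0.1)
        println(toEvents(signal, threshold = 0.2))  // only the two big jumps become events
    }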
Neuromorphic hardware and software have the potential to transform AI running at the edge, especially for low-power devices such as mobiles, wearables, and IoT.
    Pattern recognition, keyword spotting, and simple diagnostics — such as real-time signal processing of complex sensor data streams for biomedical uses, robotics, or industrial monitoring — are some of the leading use cases, Dr Kaczmarek explained. 
When applied to pattern recognition, classification, or anomaly detection, neuromorphic computing can make decisions very quickly and efficiently.
    Professor Dr Hans Hilgenkamp, Scientific Director of the MESA+ Institute at the University of Twente, agreed that pattern recognition is one of the fields where neuromorphic computing excels. 
“One may also think about [for example] failure prediction in industrial or automotive applications,” he said.
    The gaps creating neuromorphic opportunities
    Despite the recent progress, the road to establishing robust neuromorphic computing ecosystems in the Netherlands is challenging. Globalised tech supply chains and the standardisation of new technologies leave little room for hardware-level innovation. 
    For example, optical networks and optical chips have proven to outperform traditional systems in use today, but the tech has not been deployed globally. Deploying new hardware involves strategic coordination between the public and private sectors. The global rollout of 5G technology provides a good example of the challenges. It required telcos and governments around the world to deploy not only new antennas, but also smartphones, laptops, and a lot of hardware that could support the new standard. 
On the software side, meanwhile, 5G systems had a pressing need for global standards to ensure integration, interoperability, and smooth deployment. Additionally, established telcos had to move from pure competition to strategic collaboration — an unfamiliar shift for an industry long built on siloed operations.
    Neuromorphic computing ecosystems face similar obstacles. The Netherlands recognises that the entire industry’s success depends on innovation in materials, devices, circuit designs, hardware architecture, algorithms, and applications. 
    These challenges and gaps are driving new opportunities for tech companies, startups, vendors, and partners. 
    Dr Kaczmarek told us that neuromorphic computing requires full-stack integration. This involves expertise that can connect novel materials and devices through circuit design and architectures to algorithms and applications. “Bringing these layers together is crucial but challenging,” he said. 
On the algorithms and software side, developing new programming paradigms, learning rules (beyond standard deep-learning backpropagation), and software tools native to neuromorphic hardware is also a priority.
    “It is crucial to make the hardware usable and efficient — co-designing hardware and algorithms because they are intimately coupled in neuromorphic systems,” said Dr Kaczmarek. 
Other industries that have developed or are considering research on neuromorphic computing include healthcare (brain-computer interfaces and prosthetics), agri-food, and sustainable energy.
    Neuromorphic computing modules or components can also be integrated with conventional CMOS, photonics, AI, and even quantum technologies. 
    Long-term opportunities in the Netherlands
    We asked Dr Hilgenkamp what expertise or innovations are most needed and offer the greatest opportunities for contribution and growth within this emerging ecosystem.
    “The long-term developments involve new materials and a lot of research, which is already taking place on an academic level,” Dr Hilgenkamp said. 
    He added that the idea of “materials that can learn” brings up completely new concepts in materials science that are exciting for researchers. 
    On the other hand, Dr Mentink pointed to the opportunity to transform our economies, which rely on processing massive amounts of data. 
    “Even replacing a small part of that with neuromorphic computing will lead to massive energy savings,” he said. 
    “Moreover, with neuromorphic computing, much more processing can be done close to where the data is produced. This is good news for situations in which data contains privacy-sensitive information.” 
    Concrete examples, according to Dr Mentink, also include fraud detection for credit card transactions, image analysis by robots and drones, anomaly detection of heartbeats, and processing of telecom data.
    “The most promising use cases are those involving huge data flows, strong demands for very fast response times, and small energy budgets,” said Dr Mentink. 
As the use cases for neuromorphic computing multiply, Dr Mentink expects growth in the software toolchains that let developers adopt new neuromorphic platforms quickly. This new sector would also include services to streamline deployment.
    “Longer-term sustainable growth requires a concerted interdisciplinary effort across the whole computing stack to enable seamless integration of foundational discoveries to applications in new neuromorphic computing systems,” Dr Mentink said. 
    The bottom line
    The potential of neuromorphic computing has translated into billions of dollars in investment in the Netherlands and Europe, as well as in Asia and the rest of the world. 
    Businesses that can innovate, develop, and integrate hardware and software-level neuromorphic technologies stand to gain the most.  
    The potential of neuromorphic computing for greater energy efficiency and performance could ripple across industries. Energy, healthcare, robotics, AI, industrial IoT, and quantum tech all stand to benefit if they integrate the technology. And if the Dutch ecosystem takes off, the Netherlands will be in a position to lead the way.
    Supporting Dutch tech is a key mission of TNW Conference, which takes place on June 19-20 in Amsterdam. Tickets are now on sale — use the code TNWXMEDIA2025 at the checkout to get 30% off.

    Story by

    Ray Fernandez

Ray Fernandez is a journalist with over a decade of experience reporting on technology, finance, science, and natural resources. His work has been published in Bloomberg, TechRepublic, The Sunday Mail, eSecurityPlanet, and many others. He is a contributing writer for Espacio Media Incubator, which has reporters across the US, Europe, Asia, and Latin America.

    Get the TNW newsletter
    Get the most important tech news in your inbox each week.

    Also tagged with
    #netherlands #building #leading #neuromorphic #computing
    The Netherlands is building a leading neuromorphic computing industry
    Our latest and most advanced technologies — from AI to Industrial IoT, advanced robotics, and self-driving cars — share serious problems: massive energy consumption, limited on-edge capabilities, system hallucinations, and serious accuracy gaps.  One possible solution is emerging in the Netherlands. The country is developing a promising ecosystem for neuromorphic computing, which draws on neuroscience to boost IT efficiencies and performance. Billions of euros are being invested in this new form of computing worldwide. The Netherlands aims to become a leader in the market by bringing together startups, established companies, government organisations, and academics in a neuromorphic computing ecosystem. A Dutch mission to the UK In March, a Dutch delegation landed in the UK to host an “Innovation Mission” with local tech and government representatives. Top Sector ICT, a Dutch government–supported organisation, led the mission, which sought to strengthen and discuss the future of neuromorphic computing in Europe and the Netherlands.  We contacted Top Sector ICT, who connected us with one of their collaborators: Dr Johan H. Mentink, an expert in computational physics at Radboud University in the Netherlands. Dr Mentink spoke about how neuromorphic computing can solve the energy, accuracy, and efficiency challenges of our current computing architectures.  Grab that deal “Current digital computers use power-hungry processes to handle data,” Dr Mentink said.  “The result is that some modern data centres use so much energy that they even need their own power plant.”  Computing today stores data in one placeand processes it in another place. This means that a lot of energy is spent on transporting data, Dr Mentink explained.  In contrast, neuromorphic computing architectures are different at the hardware and software levels. For example, instead of using processors and memories, neuromorphic systems leverage new hardware components such as memristors. These act as both memory and processors.  By processing and saving data on the same hardware component, neuromorphic computing removes the energy-intensive and error-prone task of transporting data. Additionally, because data is stored on these components, it can be processed more immediately, resulting in faster decision-making, reduced hallucinations, improved accuracy, and better performance. This concept is being applied to edge computing, Industrial IoT, and robotics to drive faster real-time decision-making.  “Just like our brains process and store information in the same place, we can make computers that would combine data storage and processing in one place,” Dr Mentink explained.   Early use cases for neuromorphic computing Neuromorphic computing is far from just experimental. A great number of new and established technology companies are heavily invested in developing new hardware, edge devices, software, and neuromorphic computing applications. Big tech brands such as IBM, NVIDIA, and Intel, with its Loihi chips, are all involved in neuromorphic computing, while companies in the Netherlands, aligned with a 2024 national white paper, are taking a leading regional role.  For example, the Dutch company Innatera — a leader in ultra-low power neuromorphic processors — recently secured €15 million in Series-A funding from Invest-NL Deep Tech Fund, the EIC Fund, MIG Capital, Matterwave Ventures, and Delft Enterprises.  
Innatera is just the tip of the iceberg, as the Netherlands continues to support the new industry through funds, grants, and other incentives. Immediate use cases for neuromorphic computing include event-based sensing technologies integrated into smart sensors such as cameras or audio. These neuromorphic devices only process change, which can dramatically reduce power and data load, said Sylvester Kaczmarek, the CEO of OrbiSky Systems, a company providing AI integration for space technology.   Neuromorphic hardware and software have the potential to transfer AI running on the edge, especially for low-power devices such as mobile, wearables, or IoT.  Pattern recognition, keyword spotting, and simple diagnostics — such as real-time signal processing of complex sensor data streams for biomedical uses, robotics, or industrial monitoring — are some of the leading use cases, Dr Kaczmarek explained.  When applied to pattern recognition and classification or anomaly detection, neuromorphic computing can make decisions very quickly and efficiently,  Professor Dr Hans Hilgenkamp, Scientific Director of the MESA+ Institute at the University of Twente, agreed that pattern recognition is one of the fields where neuromorphic computing excels.  “One may also think aboutfailure prediction in industrial or automotive applications,” he said.    The gaps creating neuromorphic opportunities Despite the recent progress, the road to establishing robust neuromorphic computing ecosystems in the Netherlands is challenging. Globalised tech supply chains and the standardisation of new technologies leave little room for hardware-level innovation.  For example, optical networks and optical chips have proven to outperform traditional systems in use today, but the tech has not been deployed globally. Deploying new hardware involves strategic coordination between the public and private sectors. The global rollout of 5G technology provides a good example of the challenges. It required telcos and governments around the world to deploy not only new antennas, but also smartphones, laptops, and a lot of hardware that could support the new standard.  On the software side, meanwhile, 5G systems had a pressing need for global standards to ensure integration, interoperability, and smooth deployment. Additionally, established telcos had to move from pure competition to strategic collaboration— an unfamiliar shift for an industry long built on siloed operations. Neuromorphic computing ecosystems face similar obstacles. The Netherlands recognises that the entire industry’s success depends on innovation in materials, devices, circuit designs, hardware architecture, algorithms, and applications.  These challenges and gaps are driving new opportunities for tech companies, startups, vendors, and partners.  Dr Kaczmarek told us that neuromorphic computing requires full-stack integration. This involves expertise that can connect novel materials and devices through circuit design and architectures to algorithms and applications. “Bringing these layers together is crucial but challenging,” he said.  On the algorithms and software side of things, developing new paradigms of programming, learning rules, and software tools native to neuromorphic hardware are also priorities.  “It is crucial to make the hardware usable and efficient — co-designing hardware and algorithms because they are intimately coupled in neuromorphic systems,” said Dr Kaczmarek.  
Other industries which have developed or are considering research on neuromorphic computing include healthcare, agri-food, and sustainable energy.  Neuromorphic computing modules or components can also be integrated with conventional CMOS, photonics, AI, and even quantum technologies.  Long-term opportunities in the Netherlands We asked Dr Hilgenkamp what expertise or innovations are most needed and offer the greatest opportunities for contribution and growth within this emerging ecosystem. “The long-term developments involve new materials and a lot of research, which is already taking place on an academic level,” Dr Hilgenkamp said.  He added that the idea of “materials that can learn” brings up completely new concepts in materials science that are exciting for researchers.  On the other hand, Dr Mentink pointed to the opportunity to transform our economies, which rely on processing massive amounts of data.  “Even replacing a small part of that with neuromorphic computing will lead to massive energy savings,” he said.  “Moreover, with neuromorphic computing, much more processing can be done close to where the data is produced. This is good news for situations in which data contains privacy-sensitive information.”  Concrete examples, according to Dr Mentink, also include fraud detection for credit card transactions, image analysis by robots and drones, anomaly detection of heartbeats, and processing of telecom data. “The most promising use cases are those involving huge data flows, strong demands for very fast response times, and small energy budgets,” said Dr Mentink.  As the use cases for neuromorphic computing increase, Dr Mentink expects the development of software toolchains that enable quick adoption of new neuromorphic platforms to see growth. This new sector would include services to streamline deployment. “Longer-term sustainable growth requires a concerted interdisciplinary effort across the whole computing stack to enable seamless integration of foundational discoveries to applications in new neuromorphic computing systems,” Dr Mentink said.  The bottom line The potential of neuromorphic computing has translated into billions of dollars in investment in the Netherlands and Europe, as well as in Asia and the rest of the world.  Businesses that can innovate, develop, and integrate hardware and software-level neuromorphic technologies stand to gain the most.   The potential of neuromorphic computing for greater energy efficiency and performance could ripple across industries. Energy, healthcare, robotics, AI, industrial IoT, and quantum tech all stand to benefit if they integrate the technology. And if the Dutch ecosystem takes off, the Netherlands will be in a position to lead the way. Supporting Dutch tech is a key mission of TNW Conference, which takes place on June 19-20 in Amsterdam. Tickets are now on sale — use the code TNWXMEDIA2025 at the checkout to get 30% off. Story by Ray Fernandez Ray Fernandez is a journalist with over a decade of experience reporting on technology, finance, science, and natural resources. His work haRay Fernandez is a journalist with over a decade of experience reporting on technology, finance, science, and natural resources. His work has been published in Bloomberg, TechRepublic, The Sunday Mail, eSecurityPlanet, and many others. He is a contributing writer for Espacio Media Incubator, which has reporters across the US, Europe, Asia, and Latin America. Get the TNW newsletter Get the most important tech news in your inbox each week. 
Also tagged with #netherlands #building #leading #neuromorphic #computing
    THENEXTWEB.COM
    The Netherlands is building a leading neuromorphic computing industry
    Our latest and most advanced technologies — from AI to Industrial IoT, advanced robotics, and self-driving cars — share serious problems: massive energy consumption, limited on-edge capabilities, system hallucinations, and serious accuracy gaps.  One possible solution is emerging in the Netherlands. The country is developing a promising ecosystem for neuromorphic computing, which draws on neuroscience to boost IT efficiencies and performance. Billions of euros are being invested in this new form of computing worldwide. The Netherlands aims to become a leader in the market by bringing together startups, established companies, government organisations, and academics in a neuromorphic computing ecosystem. A Dutch mission to the UK In March, a Dutch delegation landed in the UK to host an “Innovation Mission” with local tech and government representatives. Top Sector ICT, a Dutch government–supported organisation, led the mission, which sought to strengthen and discuss the future of neuromorphic computing in Europe and the Netherlands.  We contacted Top Sector ICT, who connected us with one of their collaborators: Dr Johan H. Mentink, an expert in computational physics at Radboud University in the Netherlands. Dr Mentink spoke about how neuromorphic computing can solve the energy, accuracy, and efficiency challenges of our current computing architectures.  Grab that deal “Current digital computers use power-hungry processes to handle data,” Dr Mentink said.  “The result is that some modern data centres use so much energy that they even need their own power plant.”  Computing today stores data in one place (memory) and processes it in another place (processors). This means that a lot of energy is spent on transporting data, Dr Mentink explained.  In contrast, neuromorphic computing architectures are different at the hardware and software levels. For example, instead of using processors and memories, neuromorphic systems leverage new hardware components such as memristors. These act as both memory and processors.  By processing and saving data on the same hardware component, neuromorphic computing removes the energy-intensive and error-prone task of transporting data. Additionally, because data is stored on these components, it can be processed more immediately, resulting in faster decision-making, reduced hallucinations, improved accuracy, and better performance. This concept is being applied to edge computing, Industrial IoT, and robotics to drive faster real-time decision-making.  “Just like our brains process and store information in the same place, we can make computers that would combine data storage and processing in one place,” Dr Mentink explained.   Early use cases for neuromorphic computing Neuromorphic computing is far from just experimental. A great number of new and established technology companies are heavily invested in developing new hardware, edge devices, software, and neuromorphic computing applications. Big tech brands such as IBM, NVIDIA, and Intel, with its Loihi chips, are all involved in neuromorphic computing, while companies in the Netherlands, aligned with a 2024 national white paper, are taking a leading regional role.  For example, the Dutch company Innatera — a leader in ultra-low power neuromorphic processors — recently secured €15 million in Series-A funding from Invest-NL Deep Tech Fund, the EIC Fund, MIG Capital, Matterwave Ventures, and Delft Enterprises.  
Innatera is just the tip of the iceberg, as the Netherlands continues to support the new industry through funds, grants, and other incentives. Immediate use cases for neuromorphic computing include event-based sensing technologies integrated into smart sensors such as cameras or audio. These neuromorphic devices only process change, which can dramatically reduce power and data load, said Sylvester Kaczmarek, the CEO of OrbiSky Systems, a company providing AI integration for space technology.   Neuromorphic hardware and software have the potential to transfer AI running on the edge, especially for low-power devices such as mobile, wearables, or IoT.  Pattern recognition, keyword spotting, and simple diagnostics — such as real-time signal processing of complex sensor data streams for biomedical uses, robotics, or industrial monitoring — are some of the leading use cases, Dr Kaczmarek explained.  When applied to pattern recognition and classification or anomaly detection, neuromorphic computing can make decisions very quickly and efficiently,  Professor Dr Hans Hilgenkamp, Scientific Director of the MESA+ Institute at the University of Twente, agreed that pattern recognition is one of the fields where neuromorphic computing excels.  “One may also think about [for example] failure prediction in industrial or automotive applications,” he said.    The gaps creating neuromorphic opportunities Despite the recent progress, the road to establishing robust neuromorphic computing ecosystems in the Netherlands is challenging. Globalised tech supply chains and the standardisation of new technologies leave little room for hardware-level innovation.  For example, optical networks and optical chips have proven to outperform traditional systems in use today, but the tech has not been deployed globally. Deploying new hardware involves strategic coordination between the public and private sectors. The global rollout of 5G technology provides a good example of the challenges. It required telcos and governments around the world to deploy not only new antennas, but also smartphones, laptops, and a lot of hardware that could support the new standard.  On the software side, meanwhile, 5G systems had a pressing need for global standards to ensure integration, interoperability, and smooth deployment. Additionally, established telcos had to move from pure competition to strategic collaboration— an unfamiliar shift for an industry long built on siloed operations. Neuromorphic computing ecosystems face similar obstacles. The Netherlands recognises that the entire industry’s success depends on innovation in materials, devices, circuit designs, hardware architecture, algorithms, and applications.  These challenges and gaps are driving new opportunities for tech companies, startups, vendors, and partners.  Dr Kaczmarek told us that neuromorphic computing requires full-stack integration. This involves expertise that can connect novel materials and devices through circuit design and architectures to algorithms and applications. “Bringing these layers together is crucial but challenging,” he said.  On the algorithms and software side of things, developing new paradigms of programming, learning rules (beyond standard deep learning backpropagation), and software tools native to neuromorphic hardware are also priorities.  “It is crucial to make the hardware usable and efficient — co-designing hardware and algorithms because they are intimately coupled in neuromorphic systems,” said Dr Kaczmarek.  
Other industries that have developed or are considering research on neuromorphic computing include healthcare (brain-computer interfaces and prosthetics), agri-food, and sustainable energy. Neuromorphic modules or components can also be integrated with conventional CMOS, photonics, AI, and even quantum technologies.

Long-term opportunities in the Netherlands

We asked Dr Hilgenkamp what expertise or innovations are most needed and offer the greatest opportunities for contribution and growth within this emerging ecosystem. "The long-term developments involve new materials and a lot of research, which is already taking place on an academic level," Dr Hilgenkamp said. He added that the idea of "materials that can learn" brings up completely new concepts in materials science that are exciting for researchers.

Dr Mentink, for his part, pointed to the opportunity to transform our economies, which rely on processing massive amounts of data. "Even replacing a small part of that with neuromorphic computing will lead to massive energy savings," he said. "Moreover, with neuromorphic computing, much more processing can be done close to where the data is produced. This is good news for situations in which data contains privacy-sensitive information."

Concrete examples, according to Dr Mentink, include fraud detection for credit card transactions, image analysis by robots and drones, anomaly detection in heartbeats, and processing of telecom data. "The most promising use cases are those involving huge data flows, strong demands for very fast response times, and small energy budgets," he said.

As the use cases for neuromorphic computing multiply, Dr Mentink expects growth in software toolchains that enable quick adoption of new neuromorphic platforms, along with services that streamline deployment. "Longer-term sustainable growth requires a concerted interdisciplinary effort across the whole computing stack to enable seamless integration of foundational discoveries to applications in new neuromorphic computing systems," he said.

The bottom line

The potential of neuromorphic computing has translated into billions of dollars in investment in the Netherlands and Europe, as well as in Asia and the rest of the world. Businesses that can innovate, develop, and integrate neuromorphic technologies at both the hardware and software levels stand to gain the most.

The promise of greater energy efficiency and performance could ripple across industries: energy, healthcare, robotics, AI, industrial IoT, and quantum tech all stand to benefit if they integrate the technology. And if the Dutch ecosystem takes off, the Netherlands will be in a position to lead the way.

Supporting Dutch tech is a key mission of TNW Conference, which takes place on June 19-20 in Amsterdam. Tickets are now on sale; use the code TNWXMEDIA2025 at the checkout to get 30% off.

Story by Ray Fernandez. Ray Fernandez is a journalist with over a decade of experience reporting on technology, finance, science, and natural resources. His work has been published in Bloomberg, TechRepublic, The Sunday Mail, eSecurityPlanet, and many others. He is a contributing writer for Espacio Media Incubator, which has reporters across the US, Europe, Asia, and Latin America.
  • The End of the Universe May Arrive Surprisingly Soon

May 16, 2025 | 3 min read

The Universe May End Sooner Than Scientists Had Expected

A new study suggests the universe's end could occur much sooner than previously thought. But don't worry, that ultimate cosmic conclusion would still be in the unimaginably distant future.

By Sharmila Kuthunur & SPACE.com

An illustration of the remnants of an ancient, dead planetary system orbiting a white dwarf star. New calculations suggest that white dwarfs and other long-lived celestial objects are decaying faster than previously realized. Credit: NASA/ZUMA Press Wire Service/ZUMAPRESS.com/Alamy Live News

As the story of our cosmos moves forward, stars will slowly burn out, planets will freeze over, and black holes will devour light itself. Eventually, on timescales so long humanity will never witness them, the universe will fade into darkness. But if you've ever wondered exactly when it all might end, you may find it oddly comforting, or perhaps a bit unsettling, to know that someone has actually done the math. As it turns out, this cosmic finale might arrive sooner than scientists previously thought.

Don't worry, though: "sooner" still means a mind-bending 10 to the power of 78 years from now. That is a 1 followed by 78 zeros, which is unimaginably far into the future. In cosmic terms, however, this estimate is a dramatic revision of the previous prediction of 10 to the power of 1,100 years, made by Falcke and his team in 2023.

"The ultimate end of the universe comes much sooner than expected, but fortunately it still takes a very long time," Heino Falcke, a theoretical astrophysicist at Radboud University in the Netherlands, who led the new study, said in a statement.

The team's new calculations focus on predicting when the universe's most enduring celestial objects, the glowing remnants of dead stars such as white dwarfs and neutron stars, will ultimately fade away. This gradual decay is driven by Hawking radiation, a concept proposed by physicist Stephen Hawking in the 1970s. The theory suggests a peculiar process occurs near the event horizon, the point of no return around a black hole. Normally, virtual pairs of particles are constantly created by what are known as quantum fluctuations. These particle pairs pop in and out of existence, rapidly annihilating each other. Near a black hole's event horizon, however, the intense gravitational field prevents such annihilation. Instead, the pair is separated: one particle, carrying negative energy, falls into the black hole, reducing its mass, while the other escapes into space. Over incredibly long timescales, Hawking's theory suggests this process causes the black hole to slowly evaporate, eventually vanishing.

Falcke and his team extended this idea beyond black holes to other compact objects with strong gravitational fields. They found that the "evaporation time" of other objects emitting Hawking radiation depends solely on their densities.
This is because, unlike black hole evaporation, which is driven by the presence of an event horizon, this more general form of decay is driven by the curvature of spacetime itself.

The team's new findings, described in a paper published Monday (May 12) in the Journal of Cosmology and Astroparticle Physics, offer a new estimate for how long it takes white dwarf stars to dissolve into nothingness. Surprisingly, the team found that neutron stars and stellar-mass black holes decay over the same timescale: about 10 to the power of 67 years. This was unexpected, as black holes have stronger gravitational fields and were thought to evaporate faster. "But black holes have no surface," Michael Wondrak, a postdoctoral researcher of astrophysics at Radboud University and a co-author of the study, said in the statement. "They reabsorb some of their own radiation, which inhibits the process."

If even white dwarf stars and black holes eventually dissolve into nothing, what does that say about us? Perhaps it suggests meaning isn't found in permanence, but in the fleeting brilliance of asking questions like these, while the stars are still shining.

Copyright 2025 Space.com, a Future company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Source: scientificamerican.com
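An aside for readers curious where timescales like 10 to the power of 67 years come from: the block below is a back-of-the-envelope check using the textbook formulas for Hawking radiation from a Schwarzschild black hole. It is not the generalized, density-based calculation in the new paper (which also covers objects without event horizons), but for a stellar-mass black hole the standard formula reproduces the same order of magnitude.

```latex
% Standard Hawking temperature and evaporation time for a Schwarzschild black
% hole (an order-of-magnitude check only, not the paper's generalized result).
\[
  T_H = \frac{\hbar c^3}{8 \pi G M k_B}, \qquad
  t_{\mathrm{evap}} = \frac{5120\,\pi\, G^2 M^3}{\hbar c^4}.
\]
% Plugging in one solar mass, M \approx 2 \times 10^{30}\,\mathrm{kg}:
\[
  t_{\mathrm{evap}} \approx 6.6 \times 10^{74}\ \mathrm{s}
  \approx 2 \times 10^{67}\ \mathrm{years},
\]
% consistent with the roughly 10^{67}-year timescale quoted for stellar-mass
% black holes (and, per the new study, for neutron stars of similar density).
```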
• The Universe Will Fizzle Out Way Sooner Than Expected, Scientists Say

By Passant Rabie | Published May 13, 2025

An illustration of a decaying neutron star. Credit: Daniëlle Futselaar/artsource.nl

    Around 13.8 billion years ago, a tiny but dense fireball gave birth to the vast cosmos that holds trillions of galaxies, including the Milky Way.
    But our universe is dying, and it’s happening at a much faster rate than scientists previously estimated, according to new research.
    The last stellar remnants of the universe will cease to exist in 10 to the power of 78 years (that’s a one with 78 zeros), according to a new estimate from a group of scientists at Radboud University in the Netherlands.
    That’s still a long way off from when the universe powers down for good—but it’s a far earlier fade-to-black moment than the previous 10 to the power of 1,100 years estimate.
    The new paper, published Monday in the Journal of Cosmology and Astroparticle Physics, is a follow-up to a previous study by the same group of researchers.
    In their 2023 study, black hole expert Heino Falcke, quantum physicist Michael Wondrak, and mathematician Walter van Suijlekom suggested that other objects, like neutron stars, could evaporate in much the same way as black holes.
    The original theory, developed by Stephen Hawking in 1974, proposed that radiation escaping near a black hole’s event horizon would gradually erode its mass over time.
    The phenomenon, known as Hawking radiation, remains one of the most surprising ideas about black holes to this day.
    Building on the theory of Hawking radiation, the researchers behind the new paper suggest that the process of erosion depends on the density of the object.
    They found that neutron stars and stellar black holes take roughly the same amount of time to decay, an estimated 10 to the power of 67 years.
Although black holes have a stronger gravitational field that should cause them to evaporate faster, they also have no surface, so they end up reabsorbing some of their own radiation, "which inhibits the process," Wondrak said in a statement.
    The researchers then calculated how long various celestial bodies would take to evaporate via Hawking-like radiation, leading them to the abbreviated cosmic expiration date. “So the ultimate end of the universe comes much sooner than expected, but fortunately it still takes a very long time,” Falcke said.
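As a quick numerical sanity check of that 10 to the power of 67 figure, the snippet below evaluates the textbook Hawking evaporation time for a one-solar-mass black hole. It uses the standard Schwarzschild formula rather than the paper's generalized density-based treatment, and the physical constants are rounded, so it is only an order-of-magnitude illustration.

```python
# Order-of-magnitude check of the ~10^67-year decay time quoted above, using
# the standard Hawking evaporation formula for a Schwarzschild black hole:
# t = 5120 * pi * G^2 * M^3 / (hbar * c^4). This is NOT the paper's
# generalized, density-based calculation; it is only an illustration.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34       # reduced Planck constant, J s
C = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
SECONDS_PER_YEAR = 3.156e7

def hawking_evaporation_years(mass_kg: float) -> float:
    """Evaporation time of a Schwarzschild black hole of the given mass, in years."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / SECONDS_PER_YEAR

if __name__ == "__main__":
    print(f"{hawking_evaporation_years(M_SUN):.1e} years")  # ~2e67 years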
    The study also estimates that it would take the Moon around 10 to the power of 90 years to evaporate based on Hawking radiation.
    “By asking these kinds of questions and looking at extreme cases, we want to better understand the theory, and perhaps one day, we unravel the mystery of Hawking radiation,” van Suijlekom said.
Source: https://gizmodo.com/the-universe-will-fizzle-out-way-sooner-than-expected-scientists-say-2000601411