• Paper Architecture: From Soviet Subversion to Zaha’s Suprematism

    Architizer’s Vision Awards are back! The global awards program honors the world’s best architectural concepts, ideas and imagery. Submit your work ahead of the Final Entry Deadline on July 11th!
    Behind the term “paper architecture” hides a strange paradox: the radical act of building without, well, building. Paper architecture is usually associated with speculative design projects, presented in the form of drawings that can also stand alone as artworks. Yet, though often dismissed as a mere utopian or academic exercise, paper architecture has historically served as a powerful form of protest against political regimes, architectural orthodoxy and cultural stagnation.
    Unbound by real-world limitations such as materials, regulations and budgets, paper architects are free to focus on the messages behind their designs rather than constantly striving for their implementation. At the same time, its subtlety has made paper architecture a platform for radical commentary through a relatively “safe” medium. Instead of relying on more traditional forms of protest, such as strikes or marches, this powerful visual language, combined with scrupulous aesthetics and imagination, can mount a more formidable “behind-the-scenes rebellion”.
    Unearthing Nostalgia by Bruno Xavier & Michelle Ashley Ovanessians, A+ Vision Awards, 2023
    Perhaps the best-known paper architects, Archigram were a radical British collective formed in London in the 1960s. Works such as Walking City and Plug-In City showcased visions of a playful, technologically driven architecture that contrasted with, and by extension protested against, the rigid regime of post-war modernism and its extensive bureaucracy. This pop-art-style architecture served as a powerful critique of the era’s saturated functional monotony.
    Additionally, the Russian architect, artist and curator Yuri Avvakumov introduced the term “paper architecture” in 1984, within the restrictive cultural and political climate of the late Soviet Union. Facing heavy censorship, Avvakumov turned to competitions and speculative drawings in an attempt to resist the dominance of totalitarian architecture. His poetic, deeply allegorical and often ironic renderings critiqued the bureaucratic sterility of Soviet planning and the state-mandated principles architects were obliged to follow. This profound demonstration of unbuilt architecture grew into a collective cultural wave advocating artistic autonomy and expression in the built environment.
    Klothos’ Loom of Memories by Ioana Alexandra Enache, A+ Vision Awards, 2023
    The American architect Lebbeus Woods was also among the most intellectually intense practitioners of paper architecture; his work confronted global issues of war zones and urban trauma. His imaginative, post-apocalyptic cities opened up discussions about rebuilding after destruction. Works such as War and Architecture and Underground Berlin, albeit “dystopic”, acted as moral propositions, exploring reconstructions that might “heal” these cities. Through his drawings, he rigorously examined scenarios of ethical rebuilding, refusing to comply with the principles of popular commerce and instead forging an architectural practice of political resistance.
    Finally, operating in a very male-dominated field, Zaha Hadid’s early work — particularly her studies of Malevich — served as a tool of protest on multiple levels. Influenced by Suprematist aesthetics, her bold, dynamic compositions stood against the formal conservatism of an architecture in which design must always yield to gravity and function. At the same time, her considerable influence in the field challenged long-standing norms and offered a powerful counter-narrative to the gender biases that sidelined women in design. Ultimately, her images – part blueprints, part paintings – proved not only that architecture could be unapologetically visionary and abstract, but also that materializing it is not as impossible as one might think.
    My Bedroom by Daniel Wing-Hou Ho, A+ Vision Awards, 2023
    Even though paper architecture began as a medium of rebellion against architectural convention in the mid-20th century, it remains a vital tool for activism and social justice today. In the digital age, social media and online platforms have amplified its reach and given it new visual forms: digital collages, speculative renders, GIFs, reels and interactive visual narratives. What was once a flyer, a journal or a newspaper extract can now be found in open-source repositories, standing against authoritarianism, climate inaction, political violence and systemic inequality.
    Groups such as Forensic Architecture, based at Goldsmiths, University of London, carry out multidisciplinary research, investigating cases of state violence and human rights violations through rigorous mapping and speculative visualization. Additionally, competitions such as the eVolo Skyscraper Competition and platforms like ArchOutLoud and Design Earth give architects space to tackle environmental concerns and dramatize the consequences of inaction. Imaginative floating habitats, food cities and biodegradable megastructures instigate debate through environmental storytelling.
    The Stamper Battery by William du Toit, A+ Vision Awards, 2023
    Despite often being condemned as “unbuildable,” “impractical” or even “escapist,” paper architecture acts as a counterweight to the discipline’s increasing instrumentalization as a merely functional or commercial enterprise. In architecture schools it serves as a prompt for “thinking differently” and a tool for “critiquing without compromise”. Above all, however, paper architecture matters because it keeps architecture ethically alive. It reminds architects to ask uncomfortable questions: how should we design for environmental sustainability, migration or social equality, rather than for profit, convenience and spectacle? Like a moral compass or a speculative mirror, unbuilt visions can trigger political, social and environmental turns that reshape not just how we build, but why we build at all.
    Featured Image: Into the Void: Fragmented Time, Space, Memory, and Decay in Hiroshima by Victoria Wong, A+ Vision Awards 2023
    The post Paper Architecture: From Soviet Subversion to Zaha’s Suprematism appeared first on Journal.
  • Dr. Ella Hawkins Reimagines Ancient Artifacts and Prized Objects as Edible Replicas

    William Morris Biscuit Set. All images courtesy of Ella Hawkins, shared with permission
    Dr. Ella Hawkins Reimagines Ancient Artifacts and Prized Objects as Edible Replicas
    May 31, 2025
    Grace Ebert

    Academic research is notoriously niche and often opaque, but Dr. Ella Hawkins has found a crowd-pleasing way to share her studies. The Birmingham-based artist and design historian translates her interests in Shakespeare performance, costume, and material culture into edible replicas.
    Hawkins bakes batches of cookies that she tops with royal icing. Decorating takes a scholarly turn, as she uses tiny paintbrushes and a mini projector to help trace imagery of William Morris’ ornate floral motifs or coastal scenes from English delftware. Rendering a design on a single cookie can take anywhere between two and four hours, depending on the complexity. Unsurprisingly, minuscule calligraphy and portraits are most demanding.
    Ancient Greek Pottery Sherds
    Hawkins first merged baking and her research about a decade ago while studying undergraduate costume design at the University of Warwick. She decided to bake cupcakes based on Shakespeare productions that her class examined. “It felt like a fun way to look back at all the different design styles we’d covered through the year,” she tells Colossal, adding:

    I carried on decorating cakes and cookies based on costume design through my PhD (mainly as goodies to give out during talks, or as gifts for designers that I interviewed), then branched out and spent lots of time doing cookie versions of other artefacts to keep busy during the pandemic.

    She has since published an academic book on the topic and is a senior lecturer at Royal Welsh College of Music and Drama. But she also continues to translate artifacts and prized objects held within museum collections into delicious canvases.
    There’s a set made in collaboration with Milton’s Cottage, a museum in the country house where John Milton finished his epic Paradise Lost. Anchored by a delicately crosshatched portrait evoking that of the frontispiece, the collection contains typographic titles and signs that appear straight from a 17th-century book.
    Delftware Tiles
    Hawkins ventures farther back in history to ancient Greece with a collection of pottery sherds inspired by objects within the Ashmolean Museum. With a bowed surface to mimic a vessel’s curvature, the irregular shapes feature fragments of various motifs and figures to which she applied a sgraffito technique, a Renaissance method of scratching a surface to reveal the layer below.
    The weathered appearance is the result of blotting a base of pale brown-grey before using a scribe tool to scratch and crack the royal icing coating the surface. She then lined these etchings with a mix of vodka and black food coloring to mimic dirt and wear.
    Other than a select few preserved for talks and events, Hawkins assures us that the rest of her cookies are eaten. Find more of her work on her website and Instagram.
    Medieval Tiles, inspired by The Tristram Tiles, Chertsey, Surrey, England (c. 1260s–70s)
    Milton’s Cottage Biscuit Set, developed in collaboration with Milton’s Cottage
    Outlander Biscuit Set
    Elizabethan Gauntlet Biscuit Set
  • Google and DOJ tussle over how AI will remake the web in antitrust closing arguments

    Google's reckoning


    Google and the DOJ get one last chance to make their cases.

    Ryan Whitwam



    May 30, 2025 5:40 pm

    From its humble beginnings in the late 20th century, Google has come to dominate online searches, putting it squarely in the US government's antitrust crosshairs. The ongoing search antitrust case threatens to upend Google's dominance, giving smaller players a chance to thrive and possibly wiping others out. After wrapping up testimony in the case earlier this month, lawyers for Google and the Department of Justice have now made their closing arguments.
    The DOJ won the initial trial, securing a ruling that Google used anticompetitive practices to maintain its monopoly in general search. During the time this case has taken to meander its way through the legal system, the online landscape has been radically altered, making it harder than ever to envision a post-Google Internet.
    To address Google's monopoly, the DOJ is asking United States District Judge Amit Mehta to impose limits on Google's business dealings and order a divestment of the Chrome browser. Forcing the sale of Chrome would be a major penalty and a coup for the DOJ lawyers, but this issue has been overshadowed somewhat as the case drags on. During closing arguments, the two sides dueled over how Google's search deals and the rise of AI could change the Internet as we know it.
    Collateral damage
    This case has examined the myriad ways Google used its influence and money to suppress competition. One of the DOJ's main targets is the placement deals Google signs with companies like Apple and Mozilla to be the default search provider. Google has contended that people can change the defaults anytime they wish, but the DOJ produced evidence at trial that almost no one does, and Google knows that.
    During closing arguments, Mehta asked both sides about testimony from a Mozilla executive alleging that losing the Google search deal could destroy the company. Similarly, Apple's Eddy Cue said he loses sleep over the possibility of losing the Google revenue—unsurprising, as the arrangement is believed to net the company $20 billion per year.

    Should Firefox die to teach Google a lesson? Credit: Santiago Mejia/San Francisco Chronicle

    The DOJ's David Dahlquist admitted that there could be some "private impact" but contended Apple and Mozilla are overestimating the risk. Mehta didn't seem totally satisfied with the government's position, noting that he didn't want to damage other markets in an effort to fix search.
    Google's counsel also went after the government on the privacy front. One of the DOJ's proposed remedies would require Google to license its search index and algorithm, which CEO Sundar Pichai claimed was no better than a spinoff of Google's core product. Google also claims that forcing it to license search would put everyone's privacy at risk because it has a vast amount of user data that fuels search. Google attorney John Schmidtlein said the DOJ's treatment of user privacy in the remedies was a "complete failure."
    Mehta questioned the government lawyers pointedly on the issue of privacy, which he noted was barely addressed in the remedy filings. The DOJ's Adam Severt suggested an independent committee would have to be empaneled to decide how to handle Google's user data, but he was vague on how long such a process could take. Google's team didn't like this idea at all.

    Case may hinge on AI
    During testimony in early May, Mehta commented that the role AI plays in the trial had evolved very quickly. In 2023, everyone in his courtroom agreed that the impact of AI on search was still years away, and that's definitely not the case now. That same thread is present in closing arguments.
    Mehta asked the DOJ's Dahlquist if someone new was just going to "come off the sidelines" and build a new link-based search product, given the developments with AI. Dahlquist didn't answer directly, noting that although generative AI products didn't exist at the time covered by the antitrust action, they would be key to search going forward. Google certainly believes the AI future is already here—it has gone all-in with AI search over the past year.

    At the same time, Google is seeking to set itself apart from AI upstarts. "Generative AI companies are not trying to out-Google Google," said Schmidtlein. Google's team contends that its actions have not harmed any AI products like ChatGPT or Perplexity, and at any rate, they are not in the search market as defined by the court.
    Mehta mused about the future of search, suggesting we may have to rethink what a general search engine is in 2025. "Maybe people don’t want 10 blue links anymore," he said.
    The Chromium problem and an elegant solution
    At times during the case, Mehta has expressed skepticism about the divestment of Chrome. During closing arguments, Dahlquist reiterated the close relationship between search and browsers, reminding the court that 35 percent of Google's search volume comes from Chrome.
    Mehta now seems more receptive to a Chrome split than before, perhaps in part because the effects of the other remedies are becoming so murky. He called the Chrome divestment "less speculative" and "more elegant" than the data and placement remedies. Google again claimed, as it has throughout the remedy phase, that forcing it to give up Chrome is unsupported in the law and that Chrome's dominance is a result of innovation.
    Even if Mehta leans toward ordering this remedy, Chromium may be a sticking point. The judge seems unconvinced that the supposed buyers—a group which apparently includes almost every major tech firm—have the scale and expertise needed to maintain Chromium. This open source project forms the foundation of many other browsers, making its continued smooth operation critical to the web.
    If Google gives up Chrome, Chromium goes with it, but what about the people who maintain it? The DOJ contends that it's common for employees to come along with an acquisition, but that's far from certain. There was some discussion of ensuring a buyer could commit to hiring staff to maintain Chromium. The DOJ suggests Google could be ordered to provide financial incentives to ensure critical roles are filled, but that sounds potentially messy.
    A Chrome sale seems more likely now than it did earlier, but nothing is assured yet. Following the final arguments from each side, it's up to Mehta to mull over the facts before deciding Google's fate. That's expected to happen in August, but nothing will change for Google right away. The company has already confirmed it will appeal the case, hoping to have the original ruling overturned. It could still be years before this case reaches its ultimate conclusion.

    Ryan Whitwam
    Senior Technology Reporter

    Ryan Whitwam is a senior technology reporter at Ars Technica, covering the ways Google, AI, and mobile technology continue to change the world. Over his 20-year career, he's written for Android Police, ExtremeTech, Wirecutter, NY Times, and more. He has reviewed more phones than most people will ever own. You can follow him on Bluesky, where you will see photos of his dozens of mechanical keyboards.

  • Where hyperscale hardware goes to retire: Ars visits a very big ITAD site

    You are the data center

    Watching memory DIMMs get sorted like Wonka children inside SK TES' facility.

    Kevin Purdy – May 26, 2025 7:30 am

    A worker at SK TES' Fredericksburg, Va. facility, processing incoming gear. Credit: SK TES

    "The biggest risk is data escape."
    Eric Ingebretsen, chief commercial officer at SK TES, an IT asset disposition provider, tells me this early on during a tour of a 128,000-square-foot facility in Fredericksburg, Virginia. He will restate this a few times.
    A big part of this site's pitch to its clients, including the "hyperscale" customers with gigantic data centers nearby, is that each device is labeled, tracked, and inventoried for its drives—both obvious and hidden—and is either securely wiped or destroyed. The process, commonly called ITAD, is used by larger businesses, especially when they upgrade fleets of servers or workers' devices. ITAD providers ensure all the old gear is wiped clean, then resold, repurposed, recycled, or destroyed.
    In keeping with the spirit of client confidentiality, I could not take photos or videos during my visit, record our talks, or capture anything beyond what I could scribble in my notepad. I did, however, see some intriguing things and learn about what happens to all the drives and rack-mounted gear we call "the cloud" once anything gets more than a few years old.
    Undocumented drives: The tiny terror
    The loading docks at SK's facility are essentially divided into two: one section for the hyperscalers and one for everything else. SK is discreet about its clients, but given its northern Virginia location, you can make some guesses about some of the online-selling, search-result-providing, software-providing firms this site is servicing.
    Pallets arrive in big, shrink-wrapped squares, as tall as my shoulders, with break-once security seals. Each device has its serial number assigned to an asset tag, one that will follow that unit through the whole facility. Laptops and desktops head to a retail station on a long roller line. At that spot, workers—the kind exceedingly familiar with all the BIOS startup keys—run an automated Blancco system to reset them at the firmware level. Workers sometimes have to dig deeper, like getting into a switch or router with SSH or undoing a RAID setup to enable programmed wiping.

    Inside the laptop/desktop examination bay at SK TES's Fredericksburg, Va. site.

    Credit:
    SK tes

    Inside the laptop/desktop examination bay at SK TES's Fredericksburg, Va. site.

    Credit:

    SK tes

    The details of each unit—CPU, memory, HDD size—are taken down and added to the asset tag, and the device is sent on to be physically examined. This step is important because "many a concealed drive finds its way into this line," Kent Green, manager of this site, told me. Inside the machines coming from big firms, there are sometimes little USB, SD, SATA, or M.2 drives hiding out. Some were make-do solutions installed by IT and not documented, and others were put there by employees tired of waiting for more storage. "Some managers have been pretty surprised when they learn what we found," Green said.
    With everything wiped and with some sense of what they're made of, each device gets a rating. It's a three-character system, like "A-3-6," based on function, cosmetic condition, and component value. Based on needs, trends, and other data, devices that are cleared for resale go to either wholesale, retail, component harvesting, or scrap.
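    The three-character grade reads like a compact data format. As a purely illustrative sketch (the field names and value ranges here are assumptions on my part, not SK TES's actual spec), a grade like "A-3-6" could be modeled like this:

```python
from dataclasses import dataclass

@dataclass
class DeviceGrade:
    function: str         # letter grade for whether the unit works, e.g. "A"
    cosmetic: int         # numeric grade for cosmetic condition
    component_value: int  # numeric grade for harvestable component value

def parse_grade(code: str) -> DeviceGrade:
    """Split a three-part grade string such as 'A-3-6' into its fields."""
    function, cosmetic, value = code.split("-")
    return DeviceGrade(function, int(cosmetic), int(value))
```

    A triage system could then route a `DeviceGrade` toward wholesale, retail, component harvesting, or scrap based on thresholds for each field.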
    Full-body laptop skins

    Wiping down and prepping a laptop, potentially for a full-cover adhesive skin.

    Credit:
    SK TES

    Wiping down and prepping a laptop, potentially for a full-cover adhesive skin.

    Credit:

    SK TES

    If a device has retail value, it heads into a section of this giant facility where workers do further checks. Automated software plays sounds on the speakers, checks that every keyboard key is sending signals, and checks that laptop batteries are at 80 percent capacity or better. At the end of the line is my favorite discovery: full-body laptop skins.
    Some laptops—certain Lenovo, Dell, and HP models—are so ubiquitous in corporate fleets that it's worth buying an adhesive laminating sticker in their exact shape. They're an uncanny match for the matte black, silver, and slightly less silver finishes of the laptops, covering up any blemishes and scratches. Watching one of the workers apply this made me jealous of their ability to essentially reset a laptop's condition. Once rated, tested, and stickered, laptops go into a clever "cradle" box, get the UN 3481 "battery inside" sticker, and can be sold through retail.

    5,632 HDDs at once

    Beyond these folks are some of the more than 5,000 HDD wiping bays at the SK TES facility. Credit: SK TES

    That includes buyers of reconditioned hard drives, and boy, are there a lot of drives moving through this site. Once a drive is verified through its SMART data to be worth grading and refurbishing, it's put into one of more than two dozen wiping bays, each holding about 192 drives. If the bays were completely full, 5,632 drives could be wiped concurrently. The month before I visited, the site had processed about 58,000 drives, according to Ingebretsen.
    There are also stacks and stacks of memory and CPUs in this non-retail corner of the site. I walked by one box labeled "SR1Y5", and he confirmed there were 3,600 units inside.

    The RoboFlex II. This baby weighs 35 pounds, has Good and Bad bins, and whips sticks around at remarkable speed. Credit: SimmTester

    Nearby, in the memory-testing section, I find the memory machine that will stick in my own memory the rest of my life: the RoboFlex-II Handler. You drop RAM DIMMs or SODIMMs into one of its two bays, and it tests the pins on each stick. Each stick is rated "Good" or "Bad" and kicked in the appropriate direction by a 90-PSI air blast. I asked the workers at this station if they think about the entirely relevant scene from Willy Wonka & the Chocolate Factory. They do, and quite often.
    Where does all this stuff go? SK TES sells retail devices like laptops, desktops, and mobile devices through its "Stock Must Go" brand on eBay and elsewhere. Chips and memory are typically bought up by laboratories, crypto miners, data centers, and a lot of high-volume overseas customers. There are steady enterprise customers for the drives, usually putting them back into data centers. Sales run to millions of dollars each month, an SK TES representative told me.
    Big data, and only getting bigger
    The other business—the thing that makes ITAD "disposition" instead of just "refurbishing"—is dismantling and handing off devices for shredding. The Financial Times has reported that Amazon and Microsoft have 100 percent data shredding policies, with Google also shredding much of its drive turnover. The US National Renewable Energy Laboratory estimated in 2022 that by 2025, roughly 50 million end-of-life data center drives would be shredded every year.

    ITAD businesses like SK TES make the pitch that companies can create revenue to reinvest in operations through offering gear for refurbishment. SK TES representatives told me that most of the Virginia site's customers are "focused on reuse," while "a small portion" of equipment is shredded and sent off-site to be recycled.
    The site, built on the guts of a mattress factory, was put up specifically to handle the high volumes of server racks and HDDs coming in from data centers. It has a staff of 165, though it fluctuates a bit between big server hauls and downtime. The full-fledged site had been open one year when I visited. The biggest challenge, Ingebretsen told me, was getting power everywhere it needed to go inside the facility as volume fluctuated and needs expanded.
    Data centers are massive and growing, to the point of creating entire sub-industries that employ dozens of people to handle their tech turnover. The Northern Virginia Technology Council industry group puts this region's data center growth at 500 percent between 2015 and 2023, and it continues, though some pushback is happening. Many data centers were accessed to allow me to navigate to SK TES's site via Apple Maps and write this post, and for you to read it. It reminds me of the adage—made popular by the CEO of GPS maker TomTom—that you are not stuck in traffic, you are the traffic.
    After my tour, I got my phone back from security, talked a bit with Ingebretsen, then headed out to my car. I spent a few minutes jotting down the most notable things I'd seen inside, then looked up and out the windshield. There was a black tarp wrapped around a chain-link fence of the lot next door, with logos announcing the construction of a new data center. Data centers are everywhere—and nowhere in particular.

    Kevin Purdy
    Senior Technology Reporter

    Kevin is a senior technology reporter at Ars Technica, covering open-source software, PC gaming, home automation, repairability, e-bikes, and tech history. He has previously worked at Lifehacker, Wirecutter, iFixit, and Carbon Switch.

    9 Comments
    #where #hyperscale #hardware #goes #retire
    Where hyperscale hardware goes to retire: Ars visits a very big ITAD site
    You are the data center Where hyperscale hardware goes to retire: Ars visits a very big ITAD site Watching memory DIMMs get sorted like Wonka children inside SK TES' facility. Kevin Purdy – May 26, 2025 7:30 am | 9 A worker at SK TES' Fredericksburg, Va. facility, processing incoming gear. Credit: SK TES A worker at SK TES' Fredericksburg, Va. facility, processing incoming gear. Credit: SK TES Story text Size Small Standard Large Width * Standard Wide Links Standard Orange * Subscribers only   Learn more "The biggest risk is data escape." Eric Ingebretsen, chief commercial officer at SK TES, an IT asset disposition provider, tells me this early on during a tour of a 128,000-square-foot facility in Fredericksburg, Virginia. He will restate this a few times. A big part of this site's pitch to its clients, including the "hyperscale" customers with gigantic data centers nearby, is that each device is labeled, tracked, and inventoried for its drives—both obvious and hidden—and is either securely wiped or destroyed. The process, commonly called ITAD, is used by larger businesses, especially when they upgrade fleets of servers or workers' devices. ITAD providers ensure all the old gear is wiped clean, then resold, repurposed, recycled, or destroyed. In keeping with the spirit of client confidentiality, I could not take photos or videos during my visit, record our talks, or capture anything beyond what I could scribble in my notepad.. I did, however, see some intriguing things and learn about what happens to all the drives and rack-mounted gear we call "the cloud" once anything gets more than a few years old. Undocumented drives: The tiny terror The loading docks at SK's facility are essentially divided into two: one section for the hyperscalers and one for everything else. 
    SK is discreet about its clients, but given its northern Virginia location, you can make some guesses about some of the online-selling, search-result-providing, software-providing firms this site is servicing. Pallets arrive in big, shrink-wrapped squares, as tall as my shoulders, with break-once security seals. Each device has its serial number assigned to an asset tag, one that will follow that unit through the whole facility.

    Laptops and desktops head to a retail station on a long roller line. At that spot, workers—the kind exceedingly familiar with all the BIOS startup keys—run an automated Blancco system to reset them at the firmware level. Workers sometimes have to dig deeper, like getting into a switch or router with SSH or undoing a RAID setup to enable programmed wiping.

    Inside the laptop/desktop examination bay at SK TES's Fredericksburg, Va. site. Credit: SK TES

    The details of each unit—CPU, memory, HDD size—are taken down and added to the asset tag, and the device is sent on to be physically examined. This step is important because "many a concealed drive finds its way into this line," Kent Green, manager of this site, told me. Inside the machines coming from big firms, there are sometimes little USB, SD, SATA, or M.2 drives hiding out. Some were make-do solutions installed by IT and not documented, and others were put there by employees tired of waiting for more storage. "Some managers have been pretty surprised when they learn what we found," Green said.

    With everything wiped and with some sense of what they're made of, each device gets a rating. It's a three-character system, like "A-3-6," based on function, cosmetic condition, and component value. Based on needs, trends, and other data, devices that are cleared for resale go to either wholesale, retail, component harvesting, or scrap.
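One hypothetical way to model the three-character grade and the routing it drives. The field meanings (function, cosmetic condition, component value) come from the article; the class name, scale ranges, and routing thresholds below are illustrative guesses, not SK TES's actual rules:

```python
from dataclasses import dataclass

@dataclass
class DeviceGrade:
    """Hypothetical model of a three-character grade like 'A-3-6'."""
    function: str         # e.g. "A" (fully working) down to "F" (dead)
    cosmetic: int         # e.g. 1 (like new) up to 9 (heavily worn)
    component_value: int  # e.g. 1 (low) up to 9 (high)

    @classmethod
    def parse(cls, code: str) -> "DeviceGrade":
        # Split "A-3-6" into its three fields.
        f, c, v = code.split("-")
        return cls(f, int(c), int(v))

    def channel(self) -> str:
        # Illustrative routing only: working, clean units go to retail;
        # working but worn units to wholesale; otherwise harvest or scrap.
        if self.function == "A" and self.cosmetic <= 3:
            return "retail"
        if self.function in ("A", "B"):
            return "wholesale"
        if self.component_value >= 5:
            return "component harvesting"
        return "scrap"
```

A grade such as "A-3-6" would parse into its three fields and, under these made-up thresholds, route to retail.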
    Full-body laptop skins

    Wiping down and prepping a laptop, potentially for a full-cover adhesive skin. Credit: SK TES

    If a device has retail value, it heads into a section of this giant facility where workers do further checks. Automated software plays sounds on the speakers, checks that every keyboard key is sending signals, and checks that laptop batteries are at 80 percent capacity or better. At the end of the line is my favorite discovery: full-body laptop skins. Some laptops—certain Lenovo, Dell, and HP models—are so ubiquitous in corporate fleets that it's worth buying an adhesive laminating sticker in their exact shape. They're an uncanny match for the matte black, silver, and slightly less silver finishes of the laptops, covering up any blemishes and scratches. Watching one of the workers apply this made me jealous of their ability to essentially reset a laptop's condition (so one could apply whole new layers of swag stickers, of course). Once rated, tested, and stickered, laptops go into a clever "cradle" box, get the UN 3481 "battery inside" sticker, and can be sold through retail.

    5,632 HDDs at once

    Beyond these folks are some of the more than 5,000 HDD wiping bays (black, with all the wires running to them) at the SK TES facility. Credit: SK TES

    That includes buyers of reconditioned hard drives, and boy, are there a lot of drives moving through this site. Once a drive is verified through its SMART data to be worth grading and refurbishing, it's put into one of more than two dozen wiping bays, each holding about 192 drives (with a special bay handling some M.2 and other non-HDD sizes). If the bays were completely full, 5,632 drives could be wiped concurrently. The month before I visited, the site had processed about 58,000 drives, according to Ingebretsen. There are also stacks and stacks of memory and CPUs in this non-retail corner of the site. I walked by one box labeled "SR1Y5" (i.e., Intel Xeon E5-2676 v3 chips), and he confirmed there were 3,600 units inside.
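As a quick arithmetic check, the stated facility-wide maximum is consistent with the per-bay figure and "more than two dozen" bays:

```python
# Sanity check on the wiping-capacity figures quoted above.
drives_per_bay = 192        # "each holding about 192 drives"
max_concurrent = 5632       # stated maximum if every bay were full

bays_implied = max_concurrent / drives_per_bay
print(f"{bays_implied:.1f} bays implied")  # about 29.3, i.e. "more than two dozen"
```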
    The RoboFlex II. This baby weighs 35 pounds, has Good and Bad bins, and whips sticks around at remarkable speed. Credit: SimmTester

    Nearby, in the memory-testing section, I find the memory machine that will stick in my own memory the rest of my life: the RoboFlex-II Handler. You drop RAM DIMMs or SODIMMs into one of its two bays, and it tests the pins on each stick. Each stick is rated "Good" or "Bad" and kicked in the appropriate direction by a 90-PSI air blast. I asked the workers at this station if they think about the entirely relevant scene from Willy Wonka & the Chocolate Factory. They do, and quite often.

    Where does all this stuff go?

    SK TES sells retail devices like laptops, desktops, and mobile devices through its "Stock Must Go" brand on eBay and elsewhere. Chips and memory are typically bought up by laboratories, crypto miners, data centers, and a lot of high-volume overseas customers. There are steady enterprise customers for the drives, usually putting them back into data centers. It's something like $2.5 million in sales each month, an SK TES representative told me.

    Big data, and only getting bigger

    The other business—the thing that makes ITAD "disposition" instead of just "refurbishing"—is dismantling and handing off devices for shredding. The Financial Times has reported that Amazon and Microsoft have 100 percent data shredding policies, with Google also shredding much of its drive turnover. The US National Renewable Energy Laboratory estimated in 2022 that by 2025, roughly 50 million end-of-life data center drives would be shredded every year. ITAD businesses like SK TES make the pitch that companies can create revenue to reinvest in operations by offering gear for refurbishment.
    SK TES representatives told me that most of the Virginia site's customers are "focused on reuse," while "a small portion" of equipment is shredded and sent off-site to be recycled. The site, built on the guts of a mattress factory, was put up specifically to handle the high volumes of server racks and HDDs coming in from data centers. It has a staff of 165, though that fluctuates a bit between big server hauls and downtime. The full-fledged site had been open one year when I visited. The biggest challenge, Ingebretsen told me, was getting power everywhere it needed to go inside the facility as volume fluctuated and needs expanded.

    Data centers are massive and growing, to the point of creating entire sub-industries that employ dozens of people to handle their tech turnover. The Northern Virginia Technology Council industry group puts this region's data center growth at 500 percent between 2015 and 2023, and it continues, though some pushback is happening. Many data centers were accessed to allow me to navigate to SK TES's site via Apple Maps and write this post, and for you to read it. It reminds me of the adage—made popular by the CEO of GPS maker TomTom—that you are not stuck in traffic, you are the traffic.

    After my tour, I got my phone back from security, talked a bit with Ingebretsen, then headed out to my car. I spent a few minutes jotting down the most notable things I'd seen inside, then looked up and out the windshield. There was a black tarp wrapped around a chain-link fence of the lot next door, with logos announcing the construction of a new data center. Data centers are everywhere—and nowhere in particular.

    Kevin Purdy, Senior Technology Reporter, covers open-source software, PC gaming, home automation, repairability, e-bikes, and tech history at Ars Technica. He has previously worked at Lifehacker, Wirecutter, iFixit, and Carbon Switch.
    ARSTECHNICA.COM
  • This AI Paper Introduces Group Think: A Token-Level Multi-Agent Reasoning Paradigm for Faster and Collaborative LLM Inference

    A prominent area of exploration involves enabling large language models (LLMs) to function collaboratively. Multi-agent systems powered by LLMs are now being examined for their potential to coordinate on challenging problems by splitting tasks and working simultaneously. This direction has gained attention due to its potential to increase efficiency and reduce latency in real-time applications.
    A common issue in collaborative LLM systems is agents’ sequential, turn-based communication. In such systems, each agent must wait for others to complete their reasoning steps before proceeding. This slows down processing, especially in situations demanding rapid responses. Moreover, agents often duplicate efforts or generate inconsistent outputs, as they cannot see the evolving thoughts of their peers during generation. This latency and redundancy reduce the practicality of deploying multi-agent LLMs, particularly when time and computation are constrained, as on edge devices.

    Most current solutions have relied on sequential or independently parallel sampling techniques to improve reasoning. Methods like Chain-of-Thought prompting help models solve problems in a structured way but often come with increased inference time. Approaches such as Tree-of-Thoughts and Graph-of-Thoughts expand on this by branching reasoning paths. However, these approaches still do not allow for real-time mutual adaptation among agents. Multi-agent setups have explored collaborative methods, but mostly through alternating message exchanges, which again introduce delays. Some advanced systems propose complex dynamic scheduling or role-based configurations, but these are not optimized for efficient inference.
    Research from MediaTek Research introduced a new method called Group Think. This approach enables multiple reasoning agents within a single LLM to operate concurrently, observing each other’s partial outputs at the token level. Each reasoning thread adapts to the evolving thoughts of the others mid-generation. This mechanism reduces duplication and enables agents to shift direction if another thread is better positioned to continue a specific line of reasoning. Group Think is implemented through a token-level attention mechanism that lets each agent attend to previously generated tokens from all agents, supporting real-time collaboration.
    The method works by assigning each agent its own sequence of token indices, allowing their outputs to be interleaved in memory. These interleaved tokens are stored in a shared cache accessible to all agents during generation. This design allows efficient attention across reasoning threads without architectural changes to the transformer model. The implementation works both on personal devices and in data centers. On local devices, it effectively uses idle compute by batching multiple agent outputs, even with a batch size of one. In data centers, Group Think allows multiple requests to be processed together, interleaving tokens across agents while maintaining correct attention dynamics.
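A minimal sketch of this interleaving idea, under two stated assumptions: tokens from all agents are laid out in one shared cache (position p = step × num_agents + agent), and generation proceeds in lockstep, so a token at step s can attend to its own agent's tokens through step s but only to other agents' tokens through step s − 1. The function name and the lockstep simplification are ours, not the paper's exact formulation:

```python
import numpy as np

def group_think_mask(num_agents: int, steps: int) -> np.ndarray:
    """Boolean attention mask over an interleaved shared token cache.

    Position p = step * num_agents + agent holds the token that agent
    `agent` emitted at `step`.
    """
    total = num_agents * steps
    mask = np.zeros((total, total), dtype=bool)
    for q in range(total):
        q_step, q_agent = divmod(q, num_agents)
        for k in range(total):
            k_step, k_agent = divmod(k, num_agents)
            if k_agent == q_agent:
                mask[q, k] = k_step <= q_step  # own history, incl. current step
            else:
                mask[q, k] = k_step < q_step   # peers: earlier steps only
    return mask
```

Because ownership is recoverable from the position via `divmod`, the transformer needs no architectural change: the mask alone routes attention across the interleaved threads.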

    Performance tests demonstrate that Group Think significantly improves latency and output quality. In enumeration tasks, such as listing 100 distinct names, it achieved near-complete results more rapidly than conventional Chain-of-Thought approaches. The acceleration was proportional to the number of thinkers; for example, four thinkers reduced latency by a factor of about four. In divide-and-conquer problems, using the Floyd–Warshall algorithm on a graph of five nodes, four thinkers reduced the completion time to half that of a single agent. Group Think solved code generation challenges in programming tasks more effectively than baseline models. With four or more thinkers, the model produced correct code segments much faster than traditional reasoning models.
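The Floyd–Warshall benchmark mentioned above is the standard all-pairs shortest-path algorithm, which splits naturally across agents. For reference, a plain single-threaded Python version on an illustrative five-node graph (the edge weights are invented for this example, not taken from the paper):

```python
INF = float("inf")

def floyd_warshall(dist):
    """All-pairs shortest paths on an n x n weight matrix.

    dist[i][j] is the direct edge weight, INF if there is no edge,
    and 0 on the diagonal. Returns a new matrix of shortest distances.
    """
    n = len(dist)
    d = [row[:] for row in dist]  # copy so the input is untouched
    for k in range(n):            # allow k as an intermediate node
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# Illustrative five-node graph with symmetric, made-up weights.
graph = [
    [0,   3,   INF, 7,   INF],
    [3,   0,   1,   INF, INF],
    [INF, 1,   0,   2,   5],
    [7,   INF, 2,   0,   1],
    [INF, INF, 5,   1,   0],
]
```

On this graph, the shortest 0-to-4 route goes 0 → 1 → 2 → 3 → 4 rather than the direct-looking 0 → 3 → 4, which is the kind of cross-path interaction that makes the problem divisible among cooperating reasoners.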
    This research shows that existing LLMs, though not explicitly trained for collaboration, can already demonstrate emergent group reasoning behaviors under the Group Think setup. In experiments, agents naturally diversified their work to avoid redundancy, often dividing tasks by topic or focus area. These findings suggest that Group Think’s efficiency and sophistication could be enhanced further with dedicated training on collaborative data.

    Check out the Paper. All credit for this research goes to the researchers of this project.
    Nikhil is an intern consultant at Marktechpost. He is pursuing an integrated dual degree in Materials at the Indian Institute of Technology, Kharagpur. Nikhil is an AI/ML enthusiast who is always researching applications in fields like biomaterials and biomedical science.
    WWW.MARKTECHPOST.COM
    This AI Paper Introduces Group Think: A Token-Level Multi-Agent Reasoning Paradigm for Faster and Collaborative LLM Inference
    A prominent area of exploration involves enabling large language models (LLMs) to function collaboratively. Multi-agent systems powered by LLMs are now being examined for their potential to coordinate challenging problems by splitting tasks and working simultaneously. This direction has gained attention due to its potential to increase efficiency and reduce latency in real-time applications. A common issue in collaborative LLM systems is agents’ sequential, turn-based communication. In such systems, each agent must wait for others to complete their reasoning steps before proceeding. This slows down processing, especially in situations demanding rapid responses. Moreover, agents often duplicate efforts or generate inconsistent outputs, as they cannot see the evolving thoughts of their peers during generation. This latency and redundancy reduce the practicality of deploying multi-agent LLMs, particularly when time and computation are constrained, such as edge devices. Most current solutions have relied on sequential or independently parallel sampling techniques to improve reasoning. Methods like Chain-of-Thought prompting help models to solve problems in a structured way but often come with increased inference time. Approaches such as Tree-of-Thoughts and Graph-of-Thoughts expand on this by branching reasoning paths. However, these approaches still do not allow for real-time mutual adaptation among agents. Multi-agent setups have explored collaborative methods, but mostly through alternating message exchanges, which again introduces delays. Some advanced systems propose complex dynamic scheduling or role-based configurations, which are not optimized for efficient inference. Research from MediaTek Research introduced a new method called Group Think. This approach enables multiple reasoning agents within a single LLM to operate concurrently, observing each other’s partial outputs at the token level. 
Each reasoning thread adapts to the evolving thoughts of the others mid-generation. This mechanism reduces duplication and enables agents to shift direction if another thread is better positioned to continue a specific line of reasoning. Group Think is implemented through a token-level attention mechanism that lets each agent attend to previously generated tokens from all agents, supporting real-time collaboration. The method works by assigning each agent its own sequence of token indices, allowing their outputs to be interleaved in memory. These interleaved tokens are stored in a shared cache accessible to all agents during generation. This design allows efficient attention across reasoning threads without architectural changes to the transformer model. The implementation works both on personal devices and in data centers. On local devices, it effectively uses idle compute by batching multiple agent outputs, even with a batch size of one. In data centers, Group Think allows multiple requests to be processed together, interleaving tokens across agents while maintaining correct attention dynamics. Performance tests demonstrate that Group Think significantly improves latency and output quality. In enumeration tasks, such as listing 100 distinct names, it achieved near-complete results more rapidly than conventional Chain-of-Thought approaches. The acceleration was proportional to the number of thinkers; for example, four thinkers reduced latency by a factor of about four. In divide-and-conquer problems, using the Floyd–Warshall algorithm on a graph of five nodes, four thinkers reduced the completion time to half that of a single agent. Group Think solved code generation challenges in programming tasks more effectively than baseline models. With four or more thinkers, the model produced correct code segments much faster than traditional reasoning models. 
    This research shows that existing LLMs, though not explicitly trained for collaboration, can already demonstrate emergent group reasoning behaviors under the Group Think setup. In experiments, agents naturally diversified their work to avoid redundancy, often dividing tasks by topic or focus area. These findings suggest that Group Think’s efficiency and sophistication could be enhanced further with dedicated training on collaborative data.
  • Have we finally solved mystery of magnetic moon rocks?

    Simulations show how effects of asteroid impact could amplify the early Moon's weak magnetic field.

    Jennifer Ouellette



    May 23, 2025 2:36 pm


    NASA Lunar sample 60015 on display at Space Center Houston Lunar Samples Vault, at NASA's Johnson Space Center

    Credit:

    OptoMechEngineer/CC BY-SA 4.0



    NASA's Apollo missions brought back moon rock samples for scientists to study. We've learned a great deal over the ensuing decades, but one enduring mystery remains. Many of those lunar samples show signs of exposure to strong magnetic fields comparable to Earth's, yet the Moon doesn't have such a field today. So, how did the moon rocks get their magnetism?
    There have been many attempts to explain this anomaly. The latest comes from MIT scientists, who argue in a new paper published in the journal Science Advances that a large asteroid impact briefly boosted the Moon's early weak magnetic field—and that this spike is what is recorded in some lunar samples.
    Evidence gleaned from orbiting spacecraft observations, as well as results announced earlier this year from China's Chang'e 5 and Chang'e 6 missions, is largely consistent with the existence of at least a weak magnetic field on the early Moon. But where did this field come from? Such fields usually form in planetary bodies as a result of a dynamo, in which molten metals in the core start to convect thanks to slowly dissipating heat. The problem is that the early Moon's small core was surrounded by a mantle that wasn't much cooler than the core itself, so there would not have been significant convection to produce a sufficiently strong dynamo.
    Several hypotheses have been proposed for how the Moon could have developed a core dynamo. For instance, a 2022 analysis suggested that in the first billion years, when the Moon was covered in molten rock, giant rocks formed as the magma cooled and solidified: denser minerals sank toward the core while lighter ones formed a crust.
    Over time, the authors argued, a titanium layer crystallized just beneath the surface, and because it was denser than lighter minerals just beneath, that layer eventually broke into small blobs and sank through the mantle. The temperature difference between the cooler sinking rocks and the hotter core generated convection, creating intermittently strong magnetic fields—thus explaining why some rocks have that magnetic signature and others don't.
    Or perhaps there is no need for a dynamo-driven magnetic field at all. For instance, the authors of a 2021 study suspected that lunar samples may have been altered during earlier analyses. They re-examined samples from the 1972 Apollo 16 mission using CO2 lasers to heat them, thus avoiding any alteration of the magnetic carriers, and concluded that any magnetic signatures in those samples could be explained by the impact of meteorites or comets hitting the Moon.

    Bracing for impact
    In 2020, two of the current paper's authors, MIT's Benjamin Weiss and Rona Oran, ran simulations to test whether a giant impact could generate a plasma that, in turn, would amplify the Moon's existing weak solar-generated magnetic field sufficiently to account for the levels of magnetism measured in the moon rocks. Those results seemed to rule out the possibility. This time around, they have come up with a new hypothesis that essentially combines elements of the dynamo and the plasma-generating impact hypotheses—taking into account an impact's resulting shockwave for good measure.

    Amplification of the lunar dynamo field by an Imbrium-­sized impact at the magnetic equator.

    Credit:

    Isaac S. Narrett et al., 2025

    They tested their hypothesis by running impact simulations, focusing on the level of impact that created the Moon's Imbrium basin, as well as plasma cloud simulations. Their starting assumption was that the early Moon had a dynamo generating a magnetic field about 50 times weaker than Earth's. The results confirmed that a large asteroid impact could have kicked up a plasma cloud, part of which spread outward into space. The remaining plasma streamed around to the other side of the Moon, amplifying the existing weak magnetic field for around 40 minutes.
    A key factor is the shock wave created by the initial impact, similar to seismic waves, which would have rattled surrounding rocks enough to reorient their subatomic spins in line with the newly amplified magnetic field. Weiss has likened the effect to tossing a deck of 52 playing cards into the air within a magnetic field. If each card had its own compass needle, its magnetism would be in a new orientation once each card hit the ground.
    It's a complicated scenario that admittedly calls for a degree of serendipity. But we might not have to wait too long for confirmation one way or the other. The answer could lie in analyzing fresh lunar samples, looking for telltale signatures not just of high magnetism but also of shock. (Early lunar samples were often discarded if they showed signs of shock.) Scientists are looking to NASA's planned Artemis crewed missions for this, since sample returns are among the objectives. Much will depend on NASA's future funding, which is currently facing substantial cuts, although thus far, Artemis II and III remain on track.
    Science Advances, 2025. DOI: 10.1126/sciadv.adr7401.

    Jennifer Ouellette
    Senior Writer


    Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

  • New Ontario bills gut environmental protections, eliminate green building bylaws

    The Legislative Assembly of Ontario, from www.ola.org
     
    Two recent bills introduced in the Ontario Legislature are poised to gut environmental protections and severely curb the authority of municipal planners. Here’s a summary of the tabled Bills 5 and 17, focused on areas of relevance to architects.
    Bill 5: Repealing the Endangered Species Act, introducing regulation-free Special Economic Zones
    The omnibus Bill 5, Protect Ontario by Unleashing our Economy Act, 2025, is ostensibly aimed at stimulating the economy by removing barriers to development.
    One of its key components is replacing the province’s Endangered Species Act with a hollowed-out Species Conservation Act. The new act allows the government to pick and choose which species are protected, and narrowly defines their “habitat” as the nest or den of an animal—not the broader feeding grounds, forests, or wetlands they need to survive.
    Developers must currently apply for a permit when their projects threaten a species or habitat, and these applications are reviewed by environmental experts. This process would be replaced by an online registration form; once the form is submitted, a company is free to start building, including damaging or destroying the habitats of listed species, so long as the activity is registered. The new Species Conservation Act will completely exclude migratory birds and certain aquatic species.
    “It’s a developer’s dream and an environmental nightmare,” writes environmental law charity Ecojustice.
    Bill 5 also contains provisions for creating Special Economic Zones, where provincial and municipal laws do not apply—a status that the Province could claim for any project or proponent. This would allow work on these projects to be exempt from zoning regulations and approvals, as well as from labour laws, health and safety laws, traffic and speeding laws, and even laws preventing trespassing on private property, notes advocacy group Environmental Defence.
    The Bill specifically exempts the Ontario Place redevelopment from the Environmental Bill of Rights. As a result, explain lawyers from Dentons, “the public will not receive notice of, or have opportunity to, comment on proposals, decisions, or events that could affect the environment as it relates to the Ontario Place Redevelopment Project.”
    Advocacy group Ontario Place For All writes: “The introduction of this clause is a clear response to the overwhelming number of comments—over 2200—from the community to the Environmental Registry of Ontario regarding the Ford government’s application to cut an existing combined sewer overflow that will be in the way of Therme’s planned beach. The application has the CSO emptying into the west channel inside the breakwater and potentially allowing raw sewage into an area used recreationally by rowers, paddlers, swimmers, and for water shows by the CNE. The Auditor General’s Report estimated the cost of moving the CSO to be approximately million.”
    The Bill also amends the Ontario Heritage Act, allowing the Province to exempt properties from archaeological and heritage conservation requirements if they could potentially advance provincial priorities including, but not limited to, transit, housing, health, long-term care, or infrastructure.
    Another part of the bill would damage the clean energy transition, notes Environmental Defence. “Bill 5 would enable the government to ban all parts of energy projects that come from abroad, especially China. China makes the majority of the world’s solar panels, wind turbines, and control systems,” it writes. “If enacted, Bill 5 would likely end solar power installation in Ontario and deprive Ontarians of access to the cleanest source of new electricity available.”
    In the Legislature, Liberal member Ted Hsu noted, “They called this bill, Bill 5, the Protect Ontario by Unleashing our Economy Act. However, upon studying the bill, I think a more appropriate short title would be ‘don’t protect Ontario and use tariffs as cover to unleash lobbying act.’ That is a summary of what I think is wrong in principle with Bill 5.”
    Bill 5 has undergone its second reading and will be the subject of a Standing Committee hearing.

    Bill 17: Striking down green development standards, paring down planning applications
    Bill 17: Protecting Ontario by Building Faster and Smarter Act, 2025 aims to dismantle the City of Toronto’s Green Building Bylaw, along with limiting municipal authority in planning processes. These changes are proposed in the ostensible interest of speeding up construction in order to lower housing costs.
    The bill states that municipalities must follow the Building Code, and prohibits them from passing by-laws or imposing construction standards that exceed those set out in the Building Code. This appears to deliver a major win to development group RESCON, which has been lobbying to strike down the Toronto Green Standard.
    Fifteen municipalities in the Greater Toronto Area currently have green development standards. Non-profit group The Atmospheric Fund notes that green standards do not slow housing construction. “In 2023, Toronto exceeded its housing targets by 51%, with nearly 96% of housing starts being subject to the Toronto Green Standard. Overall, Toronto’s housing starts have grown or stayed consistent nearly every year since the TGS was implemented.” The group also notes that the Ontario Building Code’s energy efficiency requirements have not been updated since 2017, and that Ontario’s cities will not meet their climate targets without more progressive pathways to low-carbon construction.
    Also of direct impact to architects is the proposed standardization of requirements for “complete” planning applications. Under the tabled bill, the Minister of Municipal Affairs and Housing will have the power to govern what information or material is required in connection with official plan amendments, zoning by-law amendments, site plan approval, draft plans of subdivisions, and consent applications. This would prevail over existing Official Plan requirements. Currently, the Ontario government is proposing that sun/shadow, wind, urban design and lighting studies would not be required as part of a complete planning application.
    The bill would also deem an application to be complete not when it is accepted by a municipal planning authority, but solely on the basis of it being prepared by a prescribed professional. The prescribed professions are not yet defined, but the government has cited engineers as an example.
    Bill 17 proposes to grant minor variances “as of right” so long as they fall within a certain percentage of current setback requirements. This would apply to urban residential lands outside of the Greenbelt.
    The Bill proposes amendments to the Development Charges Act that will change what municipalities can charge, including eliminating development charges for long-term care homes. The bill limits Inclusionary Zoning to apply to a maximum 5% set-aside rate, and a maximum 25-year period of affordability.
    Dentons notes that: “While not specifically provided for in Bill 17, the Technical Briefing suggests that the Minister of Infrastructure will have authority to approve MZOs, an authority currently held only by the Minister of Municipal Affairs and Housing.”
    Environmental Defence’s Phil Pothen writes: “Some of the measures proposed in Bill 17—like deferring development charges—could help build smarter and faster if they were applied selectively to infill, mid-rise and multiplex housing. But the bill’s current language would apply these changes to sprawl and McMansion development as well.”
    He adds: “Bill 17 also includes provisions that seem aimed at erasing municipal urban rules and green building standards, imposing generic road-design standards on urban and suburban streets and preventing urban design. Those changes could actually make it harder to speed up housing—reversing progress toward more efficient construction and land use and the modes of transportation that support them.”
    The Bill would also amend the Building Code to eliminate the need for a secondary provincial approval of innovative construction products if they have already been examined by the Canadian Construction Materials Centre of the National Research Council of Canada.
    The Ontario government is currently seeking comment on its proposed regulation to standardize complete application requirements. It is also seeking comment on the proposed regulation that provides for as-of-right minor variances within 10% of currently required setbacks. Both comment periods are open until June 26, 2025.

    The post New Ontario bills gut environmental protections, eliminate green building bylaws appeared first on Canadian Architect.
    #new #ontario #bills #gut #environmental
    New Ontario bills gut environmental protections, eliminate green building bylaws
    The Legislative Assembly of Ontario, from www.ola.org   Two recent bills introduced in the Ontario Legislature are poised to gut environmental protections, and severely curb the authority of municipal planners. Here’s a summary of the tabled bills 5 and 17, focused on areas of relevance to architects. Bill 5: Repealing the Endangered Species Act, introducing regulation-free Special Economic Zones The omnibus Bill 5, Protect Ontario by Unleashing our Economy Act, 2025, is ostensibly aimed at stimulating the economy by removing barriers to development. One of its key components is replacing the province’s Endangered Species Act with a hollowed-out Species Conservation Act. The new act allows the government to pick and choose which species are protected, and narrowly defines their “habitat” as the nest or den of an animal—not the broader feeding grounds, forests, or wetlands they need to survive. Developers must currently apply for a permit when their projects threaten a species or habitat, and these applications are reviewed by environmental experts. This process would be replaced by an online registration form; when the form is submitted, a company is free to start building, including damaging or destroying habitats of listed specied, so long as the activity is registered.  The new Species Conservation Act will completely exclude migratory birds and certain aquatic species. “It’s a developer’s dream and an environmental nightmare,” writes environmental lawyers Ecojustice. Bill 5 also contains provisions for creating Special Economic Zones, where provincial and municipal laws do not apply—a status that the Province could claim for any project or proponent. This would allow work on these projects to be exempt from zoning regulations and approvals, as well as from labour laws, health and safety laws, traffic and speeding laws, and even laws preventing trespassing on private property, notes advocacy group Environmental Defence. 
The Bill specifically exempts the Ontario Place redevelopment from the Environmental Bill of Rights. As a result, explains lawyers from Dentons, “the public will not receive notice of, or have opportunity to, comment on proposals, decisions, or events that could affect the environment as it relates to the Ontario Place Redevelopment Project.” Advocacy group Ontario Place For All writes: “The introduction of this clause is a clear response to the overwhelming number of comments—over 2200—from the community to the Environmental Registry of Ontario regarding the Ford government’s application to cut an existing combined sewer overflowthat will be in the way of Therme’s planned beach. The application has the CSO emptying into the west channel inside the breakwater and potentially allowing raw sewage into an area used recreationally by rowers, paddlers, swimmers, and for water shows by the CNE. The Auditor General’s Report estimated the cost of moving the CSO to be approximately million.” The Bill also amends the Ontario Heritage Act, allowing the Province to exempt properties from archaeological and heritage conservation requirements if they could potentially advance provincial priorities including, but not limited to, transit, housing, health, long-term care, or infrastructure. Another part of the bill would damage the clean energy transition, notes Environmental Defense. “Bill 5 would enable the government to ban all parts of energy projects that come from abroad, especially China. China makes the majority of solar panels, wind turbinesand control systems in the world,” it writes. “If enacted, Bill 5 would likely end solar power installation in Ontario and deprive Ontarians access to the cleanest source of new electricity available.” In the Legislature, Liberal member Ted Tsu noted, “They called this bill, Bill 5, the Protect Ontario by Unleashing our Economy Act. 
However, upon studying the bill, I think a more appropriate short title would be ‘don’t protect Ontario and use tariffs as cover to unleash lobbying act.’ That is a summary of what I think is wrong in principle with Bill 5.” Bill 5 has undergone its second reading and will be the subject of a Standing Committee hearing.

Bill 17: Striking down green development standards, paring down planning applications

Bill 17: Protecting Ontario by Building Faster and Smarter Act, 2025 aims to dismantle the City of Toronto’s Green Building Bylaw, along with limiting municipal authority in planning processes. These changes are proposed in the ostensible interest of speeding up construction in order to lower housing costs. The bill states that municipalities must follow the Building Code, and prohibits them from passing by-laws or imposing construction standards that exceed those set out in the Building Code. This seems to deliver a major win to development group RESCON, which has been lobbying to strike down the Toronto Green Standard. Fifteen municipalities in the Greater Toronto Area currently have green development standards.

Non-profit group The Atmospheric Fund (TAF) notes that green standards do not slow housing construction. “In 2023, Toronto exceeded its housing targets by 51%, with nearly 96% of housing starts being subject to the Toronto Green Standard. Overall, Toronto’s housing starts have grown or stayed consistent nearly every year since the TGS was implemented.” The group also notes that the Ontario Building Code’s energy efficiency requirements have not been updated since 2017, and that Ontario’s cities will not meet their climate targets without more progressive pathways to low-carbon construction.

Also of direct impact to architects is the proposed standardization of requirements for “complete” planning applications.
Under the tabled bill, the Minister of Municipal Affairs and Housing will have the power to govern what information or material is required (or prohibited) in connection with official plan amendments, zoning by-law amendments, site plan approval, draft plans of subdivisions, and consent applications. This would prevail over existing Official Plan requirements. Currently, the Ontario government is proposing that sun/shadow, wind, urban design and lighting studies would not be required as part of a complete planning application. The bills would also deem an application to be complete not when it’s accepted by a municipal planning authority, but solely on the basis of it being prepared by a prescribed professional. The prescribed professions are not yet defined, but the government has cited engineers as an example.

Bill 17 proposes to grant minor variances “as of right” so long as they fall within a certain percentage of current setback regulations. (The government is currently proposing 10%.) This would apply to urban residential lands outside of the Greenbelt. The Bill proposes amendments to the Development Charges Act that will change what municipalities can charge, including eliminating development charges for long-term care homes. The bill limits Inclusionary Zoning to apply to a maximum 5% set-aside rate, and a maximum 25-year period of affordability.

Dentons notes that: “While not specifically provided for in Bill 17, the Technical Briefing suggests that the Minister of Infrastructure will have authority to approve MZOs, an authority currently held only by the Minister of Municipal Affairs and Housing.”

Environmental Defence’s Phil Pothen writes: “Some of the measures proposed in Bill 17—like deferring development charges—could help build smarter and faster if they were applied selectively to infill, mid-rise and multiplex housing.
But the bill’s current language would apply these changes to sprawl and McMansion development as well.” He adds: “Bill 17 also includes provisions that seem aimed at erasing municipal urban rules and green building standards, imposing generic road-design standards on urban and suburban streets and preventing urban design. Those changes could actually make it harder to speed up housing—reversing progress toward more efficient construction and land use and the modes of transportation that support them.”

The Bill would also amend the Building Code to eliminate the need for a secondary provincial approval of innovative construction products if they have already been examined by the Canadian Construction Materials Centre of the National Research Council of Canada.

The Ontario government is currently seeking comment on its proposed regulation to standardize complete application requirements. It is also seeking comment on the proposed regulation that provides for as-of-right variances within 10% of current required setbacks. Both comment periods are open until June 26, 2025.

The post New Ontario bills gut environmental protections, eliminate green building bylaws appeared first on Canadian Architect.
  • Signal to Windows Recall: Drop dead

    Windows, as all but the most besotted Microsoft fans know, has historically been a security disaster. Seriously, what other program has a dedicated day each month to reveal its latest security holes?

    But now, Windows Recall, the AI-powered “feature” that continuously takes snapshots of your screen to create a searchable timeline of everything you do, has arrived for Copilot+ PCs running Windows 11 version 24H2 and newer.

    After a year of controversy and multiple delays prompted by widespread privacy and security concerns, Microsoft has significantly changed Recall’s architecture. The feature is now opt-in, requires Windows Hello biometric authentication, encrypts all snapshots locally, filters out sensitive data such as credit card numbers, and allows users to filter out specific apps or websites from being captured.

    I am so unimpressed. A few days ago, in the latest Patch Tuesday release, Microsoft revealed five — count ’em, five! — zero-day security holes in Windows alone. Do you expect me to trust Recall with a track record like this?

    Besides, even if I don’t enable the feature, what if our beloved federal government decides that for our protection, it would be better if Microsoft turned on Recall for some users? After all, it’s almost impossible to run Windows these days without having a Microsoft ID, making it easy to pick and choose who gets what “update.”

    Other people feel the same way. Recall remains a lightning rod for criticism. Privacy advocates and security experts continue to warn that the very nature of Recall capturing and storing everything displayed on a user’s screen every few seconds is inherently too risky. Even if you don’t use the feature yourself, what about all the people you communicate with who might have Recall turned on? How could you even know?

    A friend at the University of Pennsylvania told me that the school has examined Microsoft Recall and found that it “introduces substantial and unacceptable security, legality, and privacy challenges.” Sounds about right to me.

    Amusingly enough, Kaspersky, the Russian security company that has its own security issues, also states that you should avoid Recall. Why? Well, yes, when you first activate Recall, you are required to use biometric authentication. After that, your PIN will do nicely. Oh, and its automatic filtering of sensitive data is unreliable. Sure, it will stop taking snapshots when you’re in private mode on Chrome or Edge. Vivaldi? Not so much.

    And as Kaspersky points out, if you use videoconferencing with automatic transcription enabled, Recall will save a complete call transcript detailing who said what. Oh boy!

    Signal, the popular secure messaging program (well, secure when you use it correctly — unlike, say, the US Secretary of Defense), wants nothing to do with this. It has introduced a new “Screen security” setting in its Windows desktop app, specifically designed to protect its users from Recall.

    Enabled by default on Windows 11, this feature uses a Digital Rights Management (DRM) flag to stop any application, including Windows Recall, from capturing screenshots of Signal chats. When Recall or other screenshot tools try to capture Signal’s window, it will produce a blank image.
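
    For the curious: Electron apps like Signal Desktop can request exactly this kind of capture exclusion through Electron’s real `BrowserWindow.setContentProtection()` API, which on Windows applies the `WDA_EXCLUDEFROMCAPTURE` display affinity. The sketch below is hypothetical; Signal’s actual code may differ, and the helper names are invented for illustration.

    ```javascript
    // Hypothetical sketch of enabling capture protection by default on
    // Windows only, as Signal describes. setContentProtection(true) is
    // Electron's real BrowserWindow API; on Windows it sets the
    // WDA_EXCLUDEFROMCAPTURE display affinity, so Recall and other
    // screenshot tools get a blank image for this window.

    // Invented helper: protect by default only on Windows.
    function shouldProtectByDefault(platform) {
      return platform === 'win32';
    }

    // Apply the setting, honouring an explicit user preference if given.
    function applyScreenSecurity(win, platform, userPref) {
      const enabled =
        userPref === undefined ? shouldProtectByDefault(platform) : userPref;
      win.setContentProtection(enabled);
      return enabled;
    }

    module.exports = { shouldProtectByDefault, applyScreenSecurity };
    ```

    In a real Electron main process, `win` would be a `BrowserWindow`; any object exposing a `setContentProtection` method is enough to exercise the default-on-Windows logic.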

    Why? In a blog post, Signal explained:

    “Although Microsoft made several adjustments over the past twelve months in response to critical feedback, the revamped version of Recall still places any content that’s displayed within privacy-preserving apps like Signal at risk. As a result, we are enabling an extra layer of protection by default on Windows 11 in order to help maintain the security of Signal Desktop on that platform, even though it introduces some usability trade-offs. Microsoft has simply given us no other option.”

    Actually, you do have another option: Desktop Linux. I said it ages ago, and I’ll say it again now. If you really care about security on your desktop, you want Linux.
    WWW.COMPUTERWORLD.COM
  • Earth’s Core Is Leaking Gold Into Volcanoes, Scientists Say

    By Isaac Schultz | Published May 23, 2025

    A volcanic eruption. Photo: United States Geological Survey (M. Patrick)

    Earth’s core is apparently a bit leakier than scientists expected. In a new study published in Nature, researchers describe evidence that traces of precious metals from Earth’s metallic core, including ruthenium and gold, are seeping up into volcanic rocks on the surface. The University of Göttingen-led team examined lava from Hawaii’s volcanic islands and discovered an unusually high concentration of a rare isotope: ruthenium-100, an isotope that’s more common in Earth’s core than in the rocky mantle. The isotope’s presence suggested that the lava had somehow picked up material from the planet’s deepest layer—more than 1,800 miles (2,900 kilometers) beneath your feet. “When the first results came in, we realized that we had literally struck gold,” said Nils Messling, a geochemist at the University of Göttingen, in a university release. “Our data confirmed that material from the core, including gold and other precious metals, is leaking into the Earth’s mantle above.”

    Earth’s core formed over 4 billion years ago and contains more than 99.999% of the planet’s gold supply. But as Nature reported, previous studies indicated that some volcanic rocks consisted of material from Earth’s core, raising questions about how the heck that material got to the surface. Now, thanks to ultra-high precision isotopic analysis developed by the Göttingen team, researchers were able to resolve previously undetectable differences in ruthenium isotopes—an achievement that dialed the team into the relationship between Earth’s center and its most explosive sites on the surface. “Our findings not only show that the Earth’s core is not as isolated as previously assumed,” said Professor Matthias Willbold, also of the University of Göttingen, “We can now also prove that huge volumes of super-heated mantle material–several hundreds of quadrillion metric tonnes of rock–originate at the core-mantle boundary and rise to the Earth’s surface to form ocean islands like Hawaii.”

    The team’s findings indicate that Earth’s supply of precious metals near the surface may owe some of its origins to this deep-seated reserve of molten rock. Studying other hotspots—think of Iceland, Japan, and other regions crammed with active volcanoes—could clarify how much of the material brought to the surface originates from the boundary between Earth’s core and its mantle.

    GIZMODO.COM