“Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall "altar of AI aspirations" “Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall Even after its..."> “Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall "altar of AI aspirations" “Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall Even after its..." /> “Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall "altar of AI aspirations" “Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall Even after its..." />


“Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall

"altar of AI aspirations"

“Microsoft has simply given us no other option,” Signal says as it blocks Windows Recall

Even after its refurbishing, Recall provides few ways to exclude specific apps.

Dan Goodin



May 21, 2025 4:21 pm


The Signal messaging app on a mobile phone. Credit: Getty Images

Signal Messenger is warning users of its Windows Desktop version that the privacy of their messages is under threat from Recall, the AI tool rolling out in Windows 11 that will screenshot, index, and store almost everything a user does every three seconds.
Effective immediately, Signal for Windows will by default block the ability of Windows to screenshot the app. Signal users who want to disable the block—for instance to preserve a conversation for their records or make use of accessibility features for sight-impaired users—will have to change settings inside their desktop version to enable screenshots.
My kingdom for an API
“Although Microsoft made several adjustments over the past twelve months in response to critical feedback, the revamped version of Recall still places any content that’s displayed within privacy-preserving apps like Signal at risk,” Signal officials wrote Wednesday. “As a result, we are enabling an extra layer of protection by default on Windows 11 in order to help maintain the security of Signal Desktop on that platform even though it introduces some usability trade-offs. Microsoft has simply given us no other option.”
When Recall was first introduced in May 2024, security and privacy practitioners quickly warned it created undue risks for both Windows users and those on other platforms who interact with Windows users. Many of the criticisms were based on specific design choices. Recall was turned on by default. Screenshots and OCR data were stored in plaintext, where they could be accessed by any app with user system rights. It provided few granular tools to limit the type of content that was sucked into its massive vacuum bag of data.
After facing one of its worst PR disasters in recent memory, Microsoft pulled Recall out of Windows 11 previews a few months after adding it. Then, last month, Microsoft reintroduced a significantly overhauled version of the tool.
As Ars Senior Technology Reporter Andrew Cunningham painstakingly documented a few weeks later, the refurbished Recall went to great lengths to correct some of the poorly thought-through designs in the first iteration. Recall was now opt-in, rather than on by default. The database storing Recall data was now encrypted, with the keys secured in a secure enclave separate from Windows. And the tool now provided some level of user control to limit the type of content it indexed.

But the changes go only so far in limiting the risks Recall poses. As I pointed out, when Recall is turned on, it indexes Zoom meetings, emails, photos, medical conditions, and—yes—Signal conversations, belonging not just to the user but to anyone interacting with that user, without their knowledge or consent.
Researcher Kevin Beaumont performed his own deep-dive analysis and also found that some of the new controls were lacking. For instance, Recall continued to screenshot his payment card details. It also decrypted the database with a simple fingerprint scan or PIN. And it's unclear whether the kind of sophisticated malware that routinely infects both consumer and enterprise Windows users will be able to decrypt the encrypted database contents.
And as my Ars colleague Cunningham also noted, Beaumont found that Microsoft still provided no means for developers to prevent content displayed in their apps from being indexed. That left Signal developers at a disadvantage, so they had to get creative.
With no API for blocking Recall in the Windows Desktop version, Signal is instead invoking an API Microsoft provides for protecting copyrighted material. App developers can turn on the DRM setting to prevent Windows from taking screenshots of copyrighted content displayed in the app. Signal is now repurposing the API to add an extra layer of privacy.
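Signal hasn't spelled out the exact code change here, but Signal Desktop is an Electron app, and Electron already exposes the Windows capture-protection capability through BrowserWindow.setContentProtection(), which on Windows calls the Win32 SetWindowDisplayAffinity API with WDA_EXCLUDEFROMCAPTURE, the same mechanism apps use to keep DRM-protected video out of screenshots. A minimal sketch of how an app could wire a default-on "screen security" toggle to that call might look like the following; the setting name and defaults are illustrative, not Signal's actual implementation.

```typescript
// Illustrative Electron (TypeScript) sketch -- not Signal's actual code.
// On Windows, setContentProtection(true) maps to SetWindowDisplayAffinity
// with WDA_EXCLUDEFROMCAPTURE, which tells the compositor to omit the
// window from screenshots and screen recordings, including the periodic
// captures Recall relies on.
import { app, BrowserWindow } from 'electron';

// Hypothetical preference: block capture by default on Windows, with an
// opt-out for users who need screenshots or capture-based accessibility tools.
let screenSecurityEnabled = process.platform === 'win32';

function createMainWindow(): BrowserWindow {
  const win = new BrowserWindow({ width: 800, height: 600 });

  // Exclude the window's contents from capture while the setting is on.
  win.setContentProtection(screenSecurityEnabled);

  return win;
}

// Called when the user toggles the (hypothetical) "screen security" setting.
function setScreenSecurity(win: BrowserWindow, enabled: boolean): void {
  screenSecurityEnabled = enabled;
  win.setContentProtection(enabled);
}

app.whenReady().then(() => {
  createMainWindow();
});
```

A side effect of this flag is that the protected window can appear blank in any capture, which is the usability and accessibility trade-off Signal acknowledges and the reason it offers an opt-out in its settings.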
“We hope that the AI teams building systems like Recall will think through these implications more carefully in the future,” Signal wrote Wednesday. “Apps like Signal shouldn’t have to implement ‘one weird trick’ in order to maintain the privacy and integrity of their services without proper developer tools. People who care about privacy shouldn’t be forced to sacrifice accessibility upon the altar of AI aspirations either.”
Signal's move will lessen the chances of Recall permanently indexing private messages, but it also has its limits. The measure only provides protection when all parties to a chat—at least those using the Windows Desktop version—haven't changed the default settings.
Microsoft officials didn’t immediately respond to an email asking why Windows provides developers with no granular control over Recall and whether the company has plans to add any.

Dan Goodin
Senior Security Editor


Dan Goodin is Senior Security Editor at Ars Technica, where he oversees coverage of malware, computer espionage, botnets, hardware hacking, encryption, and passwords. In his spare time, he enjoys gardening, cooking, and following the independent music scene. Dan is based in San Francisco. Follow him here on Mastodon and here on Bluesky. Contact him on Signal at DanArs.82.
