Windows 11’s new “Recall” feature promised a convenient “photographic memory” for your PC — silently snapping screenshots every few seconds and filing them away for AI-powered searches. But privacy advocates (and now Signal) aren’t waiting around for Microsoft to get it right. In a bold move, Signal’s Windows 11 client now ships with screen security enabled by default, blocking Recall from ever capturing your private chats.
First unveiled in May 2024 as part of Microsoft’s Copilot+ PC initiative, Recall is designed to help you track down that one slide or message you vaguely remember. Behind the scenes, Recall continuously captures snapshots of your screen and stores them locally, encrypting them and letting you search by keyword or image description later.
Critics were swift to point out the risks: uninvited archiving of sensitive data, scant opt-out controls for app developers, and a track record of Microsoft stretching user trust thin. Despite assurances that Recall would remain local and opt-in, the initial revelations spurred a public outcry, prompting Microsoft to delay and retool the feature twice before its April 2025 launch.
Enter Signal, the encrypted-by-default messaging app whose ethos clashes head-on with any persistent screen-grab tool. In its latest Windows 11 update, Signal enables “Screen security” by default — the same kind of DRM flag that blacks out screenshots of Netflix shows. Attempt to snap your Signal window, and you’ll be greeted with… nothing but black.
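The “DRM” in question is a standard Windows capability rather than anything Signal invented. The article doesn’t show Signal’s implementation, but this kind of content protection is exposed on Windows through the Win32 `SetWindowDisplayAffinity` API (Signal Desktop is built on Electron, whose `setContentProtection(true)` uses that call on Windows). A minimal sketch — the wrapper function name is a hypothetical illustration, not Signal’s code:

```c
/*
 * Illustrative sketch (not Signal's actual source): how a Windows app can
 * opt its window out of screen capture using the documented Win32 call.
 */
#include <assert.h>
#include <stdbool.h>

#ifdef _WIN32
#include <windows.h>

/* WDA_EXCLUDEFROMCAPTURE makes the window render as black in screenshots,
 * recordings, and capture-based tools like Recall; passing WDA_NONE
 * instead restores normal capture behavior. */
bool enable_screen_security(HWND hwnd) {
    return SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE) != 0;
}
#else
/* Stub so the sketch compiles on non-Windows platforms. */
typedef void *HWND;
bool enable_screen_security(HWND hwnd) {
    (void)hwnd;
    return false; /* the call fails outside Windows (and with a NULL handle) */
}
#endif
```

Calling the wrapper on a top-level window handle is roughly what flipping Signal’s “Screen security” toggle does; disabling the setting would amount to calling `SetWindowDisplayAffinity` again with `WDA_NONE`.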
Under Signal Settings > Privacy > Screen Security, technically savvy users can disable this protection. But Signal makes the call for you by default, arguing that without such measures, Recall could catalog participants’ faces, address details, or any other private media exchanged in chats.
Developer Joshua Lund didn’t mince words. In a blog post, Lund argued that Microsoft has “simply given us no other option” by failing to provide app makers with an API to exclude sensitive content from Recall’s capture. He urged OS vendors to ensure that privacy-focused developers can decisively block system-level AI snooping.
Microsoft does offer some mitigations — filtering out private/incognito browser windows by default, and letting Copilot+ users manually exclude apps from Recall. But these settings are neither well publicized nor easy for the average user to discover. Signal argues that security shouldn’t depend on user initiative for each app — it should be baked in.
Blocking screenshots universally isn’t without downsides. Screen-reader tools and other assistive technologies often rely on taking snapshots of visual elements to interpret content for users with visual impairments. Signal acknowledges that its DRM approach “could cause problems for people who use accessibility features like screen readers,” and offers explicit warnings when disabling screen security.
The tension between robust privacy and seamless accessibility is a recurring theme in digital rights debates. As more protections appear, so too do concerns about inadvertently locking out legitimate use cases. Signal’s team has left the setting just one click away from deactivation, but warns that doing so “could compromise privacy.”
Signal’s defensive tactic may set a precedent. Other privacy-first applications — secure file syncers, VPN clients, even financial services — could adopt similar DRM flags to keep Recall at bay. But without a standardized API from Microsoft, each developer must roll their own workaround, leading to inconsistent user experiences.
Imagine toggling on screen security for your banking app but forgetting to do so for a sensitive document viewer. Or trying to take a legitimate screenshot for a work audit, only to hit a black screen. The onus falls on individual developers until Microsoft provides clearer, more granular controls.
Signal’s move underscores a larger industry shift: as operating systems bake AI deeper into their cores, the delineation between user data and system services blurs. Privacy-minded developers are sounding the alarm that app boundaries must be respected, especially when sensitive conversations or documents are at stake.
Joshua Lund’s parting shot: “OS vendors like Microsoft need to ensure that developers of apps like Signal always have the necessary tools and options at their disposal to reject granting OS-level AI systems access to any sensitive information within their apps.” If Microsoft heeds the call, future iterations of Recall might include a robust, developer-facing API — obviating the need for heavy-handed DRM tricks.
Signal’s default screen-security rollout is more than a software update; it’s a statement. In a world where your operating system quietly catalogs every pixel on your screen, privacy-first apps are forced into digital trenches. Until OS-level AI platforms offer clear, consistent opt-out capabilities for sensitive content, expect Signal and its ilk to deploy whatever defenses they can. On Windows 11 today, that defense is a black screen — and for privacy enthusiasts, that’s exactly what they want to capture.