
Microsoft Adds Security and Privacy Layers to Recall

Opt-in option added to controversial screenshot software ahead of launch next week.

Microsoft has made its controversial Recall feature opt-in following privacy concerns.

Recall was initially added as a feature of its AI tool Copilot, and described as “a new way to instantly find something you’ve previously seen on your PC.”

Recall would periodically take a snapshot of what appears on the user’s screen; these images are encrypted, stored and analysed locally, using on-device AI capabilities to understand their context.

Users in Control?

Microsoft assured users that they would always be in control of what is saved: they can disable saving snapshots, pause capture temporarily, filter out applications, and delete their snapshots at any time.

With the preview of Recall due to launch next week, on June 18th, users will be able to opt in to saving snapshots using Recall. “If you don’t proactively choose to turn it on, it will be off by default,” the company said in a statement.

Also, Microsoft has added additional layers of data protection, including “just in time” decryption protected by Windows Hello Enhanced Sign-in Security (ESS), so that the Recall snapshots will only be decrypted and accessible when the user authenticates.
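The pattern Microsoft describes can be sketched in a few lines: the data stays encrypted at rest and is only decrypted at the moment of an authenticated read. The sketch below is purely illustrative; the XOR “cipher” and the in-memory key are toy stand-ins for real encryption and for Windows Hello ESS, neither of which this code models.

```python
import secrets

class JustInTimeStore:
    """Toy sketch of "just in time" decryption: data is held encrypted
    and only decrypted when the caller has authenticated. The XOR
    keystream below is a placeholder, not a real cipher."""

    def __init__(self, plaintext: bytes):
        # Key held separately from the ciphertext (stands in for the
        # key material that Windows Hello ESS would protect).
        self._key = secrets.token_bytes(len(plaintext))
        self._ciphertext = bytes(p ^ k for p, k in zip(plaintext, self._key))

    def read(self, authenticated: bool) -> bytes:
        # Decryption happens only at the moment of an authenticated read.
        if not authenticated:
            raise PermissionError("user must authenticate before snapshots are decrypted")
        return bytes(c ^ k for c, k in zip(self._ciphertext, self._key))

store = JustInTimeStore(b"snapshot pixels")
try:
    store.read(authenticated=False)
except PermissionError as e:
    print("blocked:", e)
print(store.read(authenticated=True))
```

The point of the pattern is that a process which merely reads the files on disk sees only ciphertext; plaintext exists only transiently, after an authentication check.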

“This gives an additional layer of protection to Recall data in addition to other default enabled Windows Security features like SmartScreen and Defender, which use advanced AI techniques to help prevent malware from accessing data like Recall.”

The search index database will also be encrypted. “As we always do, we will continue to listen to and learn from our customers, including consumers, developers and enterprises, to evolve our experiences in ways that are meaningful to them,” the statement said.

Controversy and Privacy Concerns

Before these changes were made, critics described Recall as a privacy nightmare, calling it “an infostealer into base Windows OS and enable[d] by default” and warning that “Microsoft will need a lawful basis to record and re-display the user’s personal information.”

Researcher Kevin Beaumont analysed the Copilot+ software in May, finding that “it spits constant screenshots into the current user’s AppData as part of image storage”, with the captures extracted into a SQLite database file. “It 100% does not need physical access and can be stolen,” he said.
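The risk Beaumont describes follows from SQLite offering no access control of its own: any process running as the same user can open and query the file with standard tooling. The sketch below creates a mock database with a hypothetical path and schema (not Recall’s real ones) simply to show that reading such a file requires no elevation and nothing beyond the Python standard library.

```python
import os
import sqlite3
import tempfile

# Hypothetical stand-in for a per-user database file; the real path
# and schema Beaumont examined are not reproduced here.
db_path = os.path.join(tempfile.mkdtemp(), "mock_recall.db")

# Build the mock database (stands in for the file the software wrote).
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE captures (ts TEXT, window_title TEXT, ocr_text TEXT)")
conn.execute(
    "INSERT INTO captures VALUES ('2024-06-01T10:00', 'Bank - Browser', 'account balance')"
)
conn.commit()
conn.close()

# Any process running as the same user can now lift the contents:
# no password, no elevation, no physical access required.
conn = sqlite3.connect(db_path)
rows = conn.execute("SELECT ts, window_title, ocr_text FROM captures").fetchall()
conn.close()
print(rows)
```

This is why moving the index behind encryption tied to user authentication, as Microsoft announced, materially changes the threat model: the file on disk stops being directly queryable.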

The controversy led to the Information Commissioner’s Office making enquiries with Microsoft to understand the safeguards in place to protect user privacy.

“We expect organisations to be transparent with users about how their data is being used and only process personal data to the extent that it is necessary to achieve a specific purpose,” a spokesperson said.


“Industry must consider data protection from the outset and rigorously assess and mitigate risks to peoples' rights and freedoms before bringing products to market. We are making enquiries with Microsoft to understand the safeguards in place to protect user privacy.”

Dan Raywood Senior Editor SC Media UK

Dan Raywood is a seasoned B2B journalist with over 20 years of experience, specializing in cybersecurity for the past 15 years. He has extensively covered topics from Advanced Persistent Threats and nation-state hackers to major data breaches and regulatory changes. Outside work, Dan enjoys supporting Tottenham Hotspur, managing mischievous cats, and sampling craft beers.


