The knee-jerk overreaction to Microsoft Recall exposed the worst in all of us, but at least Microsoft is doing the right thing with Recall. After thoroughly reviewing the information it provided at the Copilot+ PC launch and comparing it to changes it promised this past week, I feel better than ever about this feature.
I’m sure you all know the story: Microsoft announced Recall as the key new “breakthrough AI experience” that would be exclusive to Snapdragon X-based Copilot+ PCs at the firm’s May 20 Copilot+ PC launch event. I had two immediate reactions as I watched this part of the presentation unfold. One, here was Microsoft, 20 years later, still trying to solve the problem of finding documents and other information instantaneously. And two, privacy advocates were going to have a field day with this.
Microsoft Recall solves a very real problem: It’s difficult to find documents and other information that we previously accessed on our PCs. Part of the issue is tied to so-called “data silos,” where information is spread out between files, email messages, chat threads, web browser tabs, and other locations. And part of it is that PCs, to date, have forced us to
Recall took up less than 6 minutes of the 1-hour-and-8-minute Copilot+ PC launch event, so Microsoft’s explanation of why customers should trust this feature was decidedly light. Yusuf Mehdi made only three explicit claims related to privacy and security:
“We’ve built Recall with responsible AI principles and aligned it with our standards.”
You can read about Microsoft’s principles and approach to responsible AI on its website. At a high level, Microsoft says it develops AI systems responsibly and in ways that warrant people’s trust. More to the point, responsible AI must take into account how the system might work when used in ways its makers didn’t intend. And it must take privacy and security into account. This is all delightfully vague.
“We’re going to keep your Recall index private and local and secure on just the device.”
Recall requires a powerful NPU because it uses multiple on-device models at the same time, all running in the background, to do its work without harming system performance or battery life. But the on-device nature of Recall also requires data storage. And Microsoft never explained how that data will be “private, local, and secure” and stay “just on the device.”
“We won’t use any of that information to train any AI models.”
This is a key claim for any private AI solution these days, and it should be reassuring to users: Not only does the AI that processes your data and information run entirely on-device, but what happens on your PC stays on your PC. Nothing is fed into Microsoft’s broader AI efforts, anonymized or not.
“We put you completely in control with the ability to edit and delete anything that has been captured.”
Here, again, vagueness won the day: Microsoft often claims that its customers are in control, but certain behaviors in…